Dec 02 09:20:36 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 02 09:20:36 crc restorecon[4688]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 09:20:36 crc restorecon[4688]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 09:20:36 crc restorecon[4688]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 09:20:36 crc restorecon[4688]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 02 09:20:36 crc restorecon[4688]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 09:20:36 crc restorecon[4688]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 09:20:36 crc restorecon[4688]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 09:20:36 crc restorecon[4688]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 09:20:36 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:36 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 
09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc 
restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 09:20:37 crc restorecon[4688]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 09:20:37 crc restorecon[4688]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 09:20:37 crc restorecon[4688]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 02 09:20:37 crc kubenswrapper[4781]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 02 09:20:37 crc kubenswrapper[4781]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 02 09:20:37 crc kubenswrapper[4781]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 02 09:20:37 crc kubenswrapper[4781]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 02 09:20:37 crc kubenswrapper[4781]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 02 09:20:37 crc kubenswrapper[4781]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.285974 4781 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.289893 4781 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.289935 4781 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.289942 4781 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.289947 4781 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.289953 4781 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.289958 4781 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.289963 4781 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.289969 4781 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.289979 4781 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.289984 4781 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.289989 4781 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.289994 4781 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.289999 4781 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290004 4781 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290008 4781 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290013 4781 feature_gate.go:330] unrecognized feature gate: Example Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290017 4781 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290022 4781 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290026 4781 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290030 4781 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290035 4781 feature_gate.go:330] unrecognized feature 
gate: MetricsCollectionProfiles Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290039 4781 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290043 4781 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290048 4781 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290052 4781 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290057 4781 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290063 4781 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290070 4781 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290075 4781 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290309 4781 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290318 4781 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290632 4781 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290637 4781 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290642 4781 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290646 4781 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290651 4781 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290655 4781 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290659 4781 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290663 4781 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290670 4781 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290675 4781 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290686 4781 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290690 4781 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290696 4781 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290700 4781 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290704 4781 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290710 4781 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290715 4781 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290720 4781 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290725 4781 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290730 4781 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290734 4781 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290742 4781 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290747 4781 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290750 4781 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290756 4781 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290759 4781 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290763 4781 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290767 4781 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290810 4781 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290815 4781 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290820 4781 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290824 4781 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290832 4781 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290836 4781 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290840 4781 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290844 4781 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290848 4781 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290851 4781 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290855 4781 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.290858 4781 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291209 4781 flags.go:64] FLAG: --address="0.0.0.0" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291229 4781 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291240 4781 flags.go:64] FLAG: --anonymous-auth="true" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291246 4781 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291252 4781 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291257 4781 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291263 4781 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291272 4781 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291277 4781 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291286 4781 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291329 4781 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291333 4781 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291338 4781 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291342 4781 flags.go:64] FLAG: --cgroup-root="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291347 4781 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291351 4781 flags.go:64] FLAG: --client-ca-file="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291355 4781 flags.go:64] FLAG: 
--cloud-config="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291363 4781 flags.go:64] FLAG: --cloud-provider="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291367 4781 flags.go:64] FLAG: --cluster-dns="[]" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291372 4781 flags.go:64] FLAG: --cluster-domain="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291376 4781 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291381 4781 flags.go:64] FLAG: --config-dir="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291386 4781 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291391 4781 flags.go:64] FLAG: --container-log-max-files="5" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291397 4781 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291402 4781 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291411 4781 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291415 4781 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291419 4781 flags.go:64] FLAG: --contention-profiling="false" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291424 4781 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291428 4781 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291433 4781 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291437 4781 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291442 4781 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291451 4781 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291456 4781 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291460 4781 flags.go:64] FLAG: --enable-load-reader="false" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291465 4781 flags.go:64] FLAG: --enable-server="true" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291470 4781 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291476 4781 flags.go:64] FLAG: --event-burst="100" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291480 4781 flags.go:64] FLAG: --event-qps="50" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291485 4781 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291490 4781 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291497 4781 flags.go:64] FLAG: --eviction-hard="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291503 4781 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291507 4781 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291511 4781 flags.go:64] FLAG: 
--eviction-pressure-transition-period="5m0s" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291515 4781 flags.go:64] FLAG: --eviction-soft="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291520 4781 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291524 4781 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291528 4781 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291535 4781 flags.go:64] FLAG: --experimental-mounter-path="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291539 4781 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291544 4781 flags.go:64] FLAG: --fail-swap-on="true" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291549 4781 flags.go:64] FLAG: --feature-gates="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291554 4781 flags.go:64] FLAG: --file-check-frequency="20s" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291560 4781 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291564 4781 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291569 4781 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291577 4781 flags.go:64] FLAG: --healthz-port="10248" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291582 4781 flags.go:64] FLAG: --help="false" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291586 4781 flags.go:64] FLAG: --hostname-override="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291591 4781 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291595 4781 flags.go:64] FLAG: --http-check-frequency="20s" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291599 4781 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291604 4781 flags.go:64] FLAG: --image-credential-provider-config="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291608 4781 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291612 4781 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291621 4781 flags.go:64] FLAG: --image-service-endpoint="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291625 4781 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291631 4781 flags.go:64] FLAG: --kube-api-burst="100" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291636 4781 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291640 4781 flags.go:64] FLAG: --kube-api-qps="50" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291645 4781 flags.go:64] FLAG: --kube-reserved="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291653 4781 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291658 4781 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291662 4781 flags.go:64] FLAG: 
--kubelet-cgroups="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291667 4781 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291672 4781 flags.go:64] FLAG: --lock-file="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291676 4781 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291680 4781 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291687 4781 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291694 4781 flags.go:64] FLAG: --log-json-split-stream="false" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291701 4781 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291705 4781 flags.go:64] FLAG: --log-text-split-stream="false" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291709 4781 flags.go:64] FLAG: --logging-format="text" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291714 4781 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291719 4781 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291723 4781 flags.go:64] FLAG: --manifest-url="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291728 4781 flags.go:64] FLAG: --manifest-url-header="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291735 4781 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291742 4781 flags.go:64] FLAG: --max-open-files="1000000" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291748 4781 flags.go:64] FLAG: --max-pods="110" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291753 4781 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291758 4781 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291762 4781 flags.go:64] FLAG: --memory-manager-policy="None" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291768 4781 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291773 4781 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291779 4781 flags.go:64] FLAG: --node-ip="192.168.126.11" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291789 4781 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291919 4781 flags.go:64] FLAG: --node-status-max-images="50" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291945 4781 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.291972 4781 flags.go:64] FLAG: --oom-score-adj="-999" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.292183 4781 flags.go:64] FLAG: --pod-cidr="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.292227 4781 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Dec 02 09:20:37 crc 
kubenswrapper[4781]: I1202 09:20:37.292246 4781 flags.go:64] FLAG: --pod-manifest-path="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.292256 4781 flags.go:64] FLAG: --pod-max-pids="-1" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.292269 4781 flags.go:64] FLAG: --pods-per-core="0" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.292279 4781 flags.go:64] FLAG: --port="10250" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.292289 4781 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.292299 4781 flags.go:64] FLAG: --provider-id="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.292309 4781 flags.go:64] FLAG: --qos-reserved="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.292320 4781 flags.go:64] FLAG: --read-only-port="10255" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.292330 4781 flags.go:64] FLAG: --register-node="true" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.292341 4781 flags.go:64] FLAG: --register-schedulable="true" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.292351 4781 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.292403 4781 flags.go:64] FLAG: --registry-burst="10" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.292413 4781 flags.go:64] FLAG: --registry-qps="5" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.292422 4781 flags.go:64] FLAG: --reserved-cpus="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.292432 4781 flags.go:64] FLAG: --reserved-memory="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.292444 4781 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.292454 4781 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.292463 4781 flags.go:64] FLAG: --rotate-certificates="false" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.292473 4781 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.292481 4781 flags.go:64] FLAG: --runonce="false" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.292490 4781 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.292500 4781 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.292511 4781 flags.go:64] FLAG: --seccomp-default="false" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.292520 4781 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.292530 4781 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.292540 4781 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.292550 4781 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.292559 4781 flags.go:64] FLAG: --storage-driver-password="root" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.292568 4781 flags.go:64] FLAG: --storage-driver-secure="false" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.292577 4781 flags.go:64] FLAG: --storage-driver-table="stats" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.292588 4781 flags.go:64] FLAG: 
--storage-driver-user="root" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.292597 4781 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.292606 4781 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.292616 4781 flags.go:64] FLAG: --system-cgroups="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.292624 4781 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.292644 4781 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.292653 4781 flags.go:64] FLAG: --tls-cert-file="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.292662 4781 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.292676 4781 flags.go:64] FLAG: --tls-min-version="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.292685 4781 flags.go:64] FLAG: --tls-private-key-file="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.292724 4781 flags.go:64] FLAG: --topology-manager-policy="none" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.292733 4781 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.292742 4781 flags.go:64] FLAG: --topology-manager-scope="container" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.292751 4781 flags.go:64] FLAG: --v="2" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.292783 4781 flags.go:64] FLAG: --version="false" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.292799 4781 flags.go:64] FLAG: --vmodule="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.292813 4781 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.292826 4781 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293187 4781 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293207 4781 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293216 4781 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293224 4781 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293232 4781 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293243 4781 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293255 4781 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293265 4781 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293277 4781 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293285 4781 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293294 4781 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293303 4781 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293311 4781 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293318 4781 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293328 4781 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293336 4781 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293347 4781 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293355 4781 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293363 4781 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293370 4781 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293378 4781 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293385 4781 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293393 4781 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293401 4781 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293409 4781 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293416 4781 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293424 4781 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293432 4781 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293439 4781 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293446 4781 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293454 4781 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293462 4781 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293477 4781 feature_gate.go:330] unrecognized feature gate: 
AWSClusterHostedDNS Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293485 4781 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293492 4781 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293499 4781 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293507 4781 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293517 4781 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293527 4781 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293536 4781 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293544 4781 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293553 4781 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293561 4781 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293569 4781 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293577 4781 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293584 4781 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293592 4781 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293599 4781 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293607 4781 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293615 4781 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293623 4781 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293631 4781 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293639 4781 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293647 4781 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293655 4781 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293663 4781 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293673 4781 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293681 4781 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293690 4781 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293697 4781 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293705 4781 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293712 4781 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293720 4781 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293728 4781 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293738 4781 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293746 4781 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293754 4781 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293761 4781 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293768 4781 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293776 4781 feature_gate.go:330] unrecognized feature gate: Example Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.293784 4781 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.293796 4781 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.305409 4781 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.305453 4781 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.305580 4781 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.305602 4781 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.305612 4781 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.305622 4781 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.305631 4781 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.305639 4781 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 02 09:20:37 
crc kubenswrapper[4781]: W1202 09:20:37.305647 4781 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.305656 4781 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.305664 4781 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.305672 4781 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.305680 4781 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.305688 4781 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.305696 4781 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.305704 4781 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.305712 4781 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.305720 4781 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.305728 4781 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.305736 4781 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.305744 4781 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.305754 4781 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.305765 4781 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.305777 4781 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.305787 4781 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.305795 4781 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.305806 4781 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.305816 4781 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.305827 4781 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.305836 4781 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.305845 4781 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.305853 4781 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.305863 4781 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.305874 4781 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.305884 4781 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.305894 4781 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.305904 4781 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.305914 4781 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.305949 4781 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.305958 4781 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.305966 4781 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.305974 4781 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.305982 4781 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.305990 4781 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.305997 4781 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306005 4781 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306016 4781 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
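The alternating W/I records in this stretch are the same gate map being resolved several times over. The pattern is consistent: names this kubelet binary never registered — OpenShift cluster-level gates such as GatewayAPI or NewOLM — draw "unrecognized feature gate" and are dropped, while registered gates that are already GA or deprecated (CloudDualStackNodeIPs, ValidatingAdmissionPolicy, DisableKubeletCloudCredentialProviders, KMSv1) are applied but warned about, and what survives is the map printed by the "feature gates: {map[...]}" lines. A rough Python illustration of that classification — not the actual feature_gate.go code — using a trimmed gate set from this log:

```python
# Illustration only (the real logic lives in Kubernetes' feature_gate.go):
# how a requested gate map turns into the warnings and the final
# "feature gates:" map seen above. KNOWN is a trimmed, assumed registry.
KNOWN = {
    "CloudDualStackNodeIPs": "GA",
    "ValidatingAdmissionPolicy": "GA",
    "DisableKubeletCloudCredentialProviders": "GA",
    "KMSv1": "Deprecated",
    "NodeSwap": "Beta",
}

def resolve(requested):
    effective = {}
    for name, value in sorted(requested.items()):
        stage = KNOWN.get(name)
        if stage is None:  # cluster-level gate unknown to this binary: warn, skip
            print(f"unrecognized feature gate: {name}")
            continue
        if stage in ("GA", "Deprecated") and value:
            print(f"Setting {stage} feature gate {name}={value}. "
                  f"It will be removed in a future release.")
        effective[name] = value
    return effective

print("feature gates:", resolve({
    "GatewayAPI": True,            # unrecognized by the kubelet
    "KMSv1": True,                 # deprecated, applied with a warning
    "CloudDualStackNodeIPs": True, # GA, applied with a warning
    "NodeSwap": False,             # ordinary registered gate
}))
```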
Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306027 4781 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306036 4781 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306045 4781 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306055 4781 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306064 4781 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306073 4781 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306081 4781 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306090 4781 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306098 4781 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306107 4781 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306118 4781 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306126 4781 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306133 4781 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306141 4781 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306150 4781 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306157 4781 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306165 4781 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306173 4781 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306180 4781 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306189 4781 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306196 4781 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306204 4781 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306212 4781 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306219 4781 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306227 4781 feature_gate.go:330] unrecognized feature gate: Example Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306235 4781 feature_gate.go:330] unrecognized 
feature gate: InsightsConfig Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.306248 4781 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306530 4781 feature_gate.go:330] unrecognized feature gate: Example Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306545 4781 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306554 4781 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306562 4781 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306571 4781 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306579 4781 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306588 4781 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306597 4781 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306605 4781 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306613 4781 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306621 4781 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306629 4781 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306638 4781 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306645 4781 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306653 4781 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306661 4781 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306669 4781 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306676 4781 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306684 4781 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306693 4781 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306701 4781 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306708 4781 feature_gate.go:330] unrecognized 
feature gate: OVNObservability Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306717 4781 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306725 4781 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306732 4781 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306740 4781 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306747 4781 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306755 4781 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306763 4781 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306771 4781 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306779 4781 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306786 4781 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306794 4781 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306802 4781 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306810 4781 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306818 4781 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306825 4781 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306833 4781 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306841 4781 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306851 4781 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306863 4781 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306874 4781 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306888 4781 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306899 4781 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306911 4781 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306944 4781 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306953 4781 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306961 4781 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306971 4781 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306981 4781 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306991 4781 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.306999 4781 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.307007 4781 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.307015 4781 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.307023 4781 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.307032 4781 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.307039 4781 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.307047 4781 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.307055 4781 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.307063 4781 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.307070 4781 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.307078 4781 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.307085 4781 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.307093 4781 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.307103 4781 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.307111 4781 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.307119 4781 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.307127 4781 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.307134 4781 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.307142 4781 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.307150 4781 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.307162 4781 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.307413 4781 server.go:940] "Client rotation is on, will bootstrap in background" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.313560 4781 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.313700 4781 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
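The certificate_manager records that follow show why rotation begins long before expiry: the client certificate expires 2026-02-24, yet the rotation deadline is 2025-11-15 — already behind us at this boot — so the kubelet immediately attempts rotation and the CSR POST fails while the API server behind api-int.crc.testing:6443 is still refusing connections. Upstream client-go computes that deadline as a uniformly random point roughly 70-90% of the way through the certificate's validity window; a small sketch of that scheme (the dates are made-up placeholders, since notBefore is not present in the log, and the jitter band is the assumed upstream behavior):

```python
# Sketch of the jittered rotation deadline reported by certificate_manager:
# rotation is scheduled at a random point in roughly the 70-90% band of the
# cert's validity window (assumed client-go behavior), which is why the
# deadline lands well before the 2026-02-24 expiry seen below.
import random
from datetime import datetime, timedelta

def rotation_deadline(not_before: datetime, not_after: datetime) -> datetime:
    lifetime = (not_after - not_before).total_seconds()
    jittered = lifetime * (0.7 + 0.2 * random.random())
    return not_before + timedelta(seconds=jittered)

# Placeholder dates for illustration only.
print(rotation_deadline(datetime(2025, 2, 24, 5, 52, 8),
                        datetime(2026, 2, 24, 5, 52, 8)))
```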
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.314550 4781 server.go:997] "Starting client certificate rotation"
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.314587 4781 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.314861 4781 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-15 03:10:35.791547056 +0000 UTC
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.314979 4781 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.322365 4781 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 02 09:20:37 crc kubenswrapper[4781]: E1202 09:20:37.323956 4781 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError"
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.324982 4781 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.332860 4781 log.go:25] "Validated CRI v1 runtime API"
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.357554 4781 log.go:25] "Validated CRI v1 image API"
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.359350 4781 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.361278 4781 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-02-09-16-35-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.361300 4781 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.375990 4781 manager.go:217] Machine: {Timestamp:2025-12-02 09:20:37.374736744 +0000 UTC m=+0.198610653 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654132736 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:c8f9d245-0a2e-447a-a09e-ff80f79ba02f BootID:a8aab02d-934e-4a61-8d03-a223ac62150b Filesystems:[{Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730829824 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827068416 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:97:90:90 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:97:90:90 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:4d:04:9d Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:ea:88:a8 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:f6:68:3b Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:a9:56:51 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:72:41:2a:dc:7f:44 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:7a:51:d0:de:1e:26 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654132736 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.376223 4781 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.376430 4781 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.376885 4781 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.377141 4781 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.377181 4781 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.377396 4781 topology_manager.go:138] "Creating topology manager with none policy"
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.377406 4781 container_manager_linux.go:303] "Creating device plugin manager"
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.377554 4781 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.377577 4781 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.377750 4781 state_mem.go:36] "Initialized new in-memory state store"
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.377826 4781 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.378537 4781 kubelet.go:418] "Attempting to sync node with API server"
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.378556 4781 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.378578 4781 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.378590 4781 kubelet.go:324] "Adding apiserver pod source"
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.378601 4781 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.380324 4781 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused
Dec 02 09:20:37 crc kubenswrapper[4781]: E1202 09:20:37.380404 4781 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError"
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.380480 4781 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.380661 4781 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused
Dec 02 09:20:37 crc kubenswrapper[4781]: E1202 09:20:37.380768 4781 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError"
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.381131 4781 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.382898 4781 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.383985 4781 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.384048 4781 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.384077 4781 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.384098 4781 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.384130 4781 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.384165 4781 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.384185 4781 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.384217 4781 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.384241 4781 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.384261 4781 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.384337 4781 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.384365 4781 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.384690 4781 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.385501 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.385833 4781 server.go:1280] "Started kubelet"
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.386127 4781 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.387950 4781 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.387987 4781 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.388079 4781 server.go:460] "Adding debug handlers to kubelet server"
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.388060 4781 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 14:15:56.001015349 +0000 UTC
Dec 02 09:20:37 crc systemd[1]: Started Kubernetes Kubelet.
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.387889 4781 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Dec 02 09:20:37 crc kubenswrapper[4781]: E1202 09:20:37.389632 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.390051 4781 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Dec 02 09:20:37 crc kubenswrapper[4781]: E1202 09:20:37.390238 4781 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="200ms"
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.390522 4781 factory.go:55] Registering systemd factory
Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.390500 4781 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.390548 4781 factory.go:221] Registration of the systemd container factory successfully
Dec 02 09:20:37 crc kubenswrapper[4781]: E1202 09:20:37.390565 4781 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError"
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.391148 4781 factory.go:153] Registering CRI-O factory
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.391214 4781 factory.go:221] Registration of the crio container factory successfully
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.391333 4781 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.390151 4781 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.391527 4781 volume_manager.go:287] "The desired_state_of_world populator starts"
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.391701 4781 volume_manager.go:289] "Starting Kubelet Volume Manager"
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.391465 4781 factory.go:103] Registering Raw factory
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.392093 4781 manager.go:1196] Started watching for new ooms in manager
Dec 02 09:20:37 crc kubenswrapper[4781]: E1202 09:20:37.391064 4781 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.194:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187d5b80b7256a60 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-02 09:20:37.38572656 +0000 UTC m=+0.209600479,LastTimestamp:2025-12-02 09:20:37.38572656 +0000 UTC m=+0.209600479,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.394532 4781 manager.go:319] Starting recovery of all containers
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.401585 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.401667 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.401684 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.401696 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.401716 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.401728 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.401738 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.401751 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.401765 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.401776 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.401786 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.401798 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.401808 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.401821 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.401833 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.401848 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.401860 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.401871 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.401883 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.401896 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.401907 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.401944 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.401957 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.401969 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.401982 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.401996 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402014 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402028 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402042 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402054 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402066 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402081 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402095 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402127 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402142 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402154 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402167 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402179 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402192 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402204 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402216 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402229 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402242 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402256 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402270 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402282 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402295 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402309 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402323 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402335 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402350 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402362 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402381 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402393 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402407 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402421 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402435 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402447 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402459 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402471 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402482 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402493 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402504 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402518 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402529 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402541 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402549 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402559 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402570 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402581 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402592 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402603 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402613 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402624 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402634 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402645 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402656 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402668 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402679 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402694 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402705 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402718 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402729 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402764 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402778 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402791 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402803 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402815 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402829 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402840 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402852 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402865 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402876 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402887 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402899 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402912 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402942 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402954 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402965 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402977 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.402988 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403000 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403011 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403022 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403037 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403049 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403061 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403072 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403084 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403096 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403107 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403120 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403132 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403144 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403158 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403168 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403179 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403190 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403201 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403216 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403226 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403236 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403246 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403256 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403266 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403277 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403286 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403298 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403308 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403319 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403331 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403344 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403356 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403368 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403379 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403389 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403399 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403408 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403419 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403430 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403440 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403459 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403470 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b"
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403483 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403496 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403507 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403519 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403531 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403543 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403559 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403573 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403588 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403600 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403614 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403627 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403640 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403653 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403667 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403678 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403692 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403704 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403716 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403727 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403739 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403751 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403762 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403772 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403782 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403793 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.403805 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.407181 4781 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.407207 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.407220 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.407232 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.407245 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.407256 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.407265 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.407276 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.407286 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.407295 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.407308 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.407319 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.407331 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.407342 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.407351 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.407361 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.407370 4781 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.407380 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.407389 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.407398 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.407408 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.407419 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.407428 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.407438 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.407447 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.407458 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.407467 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.407476 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.407485 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.407494 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.407503 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.407513 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.407523 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.407533 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.407542 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.407552 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.407561 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.407572 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.407580 4781 reconstruct.go:97] "Volume reconstruction finished" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.407587 4781 
reconciler.go:26] "Reconciler: start to sync state" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.417517 4781 manager.go:324] Recovery completed Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.425392 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.426895 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.426976 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.426992 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.427864 4781 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.427941 4781 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.427967 4781 state_mem.go:36] "Initialized new in-memory state store" Dec 02 09:20:37 crc kubenswrapper[4781]: E1202 09:20:37.490820 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.496654 4781 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.498303 4781 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.498347 4781 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.498374 4781 kubelet.go:2335] "Starting kubelet main sync loop" Dec 02 09:20:37 crc kubenswrapper[4781]: E1202 09:20:37.498539 4781 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.500783 4781 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Dec 02 09:20:37 crc kubenswrapper[4781]: E1202 09:20:37.500878 4781 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.510503 4781 policy_none.go:49] "None policy: Start" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.511691 4781 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.511728 4781 state_mem.go:35] "Initializing new in-memory state store" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.569121 4781 manager.go:334] "Starting Device Plugin manager" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.569486 4781 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not 
found" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.569499 4781 server.go:79] "Starting device plugin registration server" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.569908 4781 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.569944 4781 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.570228 4781 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.570321 4781 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.570333 4781 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 02 09:20:37 crc kubenswrapper[4781]: E1202 09:20:37.578300 4781 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 02 09:20:37 crc kubenswrapper[4781]: E1202 09:20:37.591248 4781 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="400ms" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.599367 4781 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.599476 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.600702 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.600748 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.600760 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.600967 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.601234 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.601290 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.601723 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.601748 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.601757 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.601887 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.601989 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.602023 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.602034 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.602254 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.602289 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.602728 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.602769 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.602778 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.602864 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.603041 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.603069 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.603334 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.603353 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.603360 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.603716 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.603747 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.603760 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.603717 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.603786 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.603795 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.603894 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.604000 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.604028 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.604999 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.605041 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.605051 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.605142 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.605160 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.605169 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.605221 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.605263 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.605984 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.606026 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.606036 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.670316 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.672092 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.672126 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.672135 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.672157 4781 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 02 09:20:37 crc kubenswrapper[4781]: E1202 09:20:37.672772 4781 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.194:6443: connect: connection refused" node="crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.713465 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.713510 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.713532 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.713553 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.713576 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.713597 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.713649 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.713694 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.713766 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.713851 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.713897 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.714006 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.714076 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.714107 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.714128 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.815180 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.815550 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.815579 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.815628 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.815630 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.815650 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.815734 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.815749 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.815748 4781 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.815821 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.815758 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.816137 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.816206 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.816472 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.816595 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.816664 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.816723 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.816778 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.816820 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.816875 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.816960 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.817014 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.817032 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.817036 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.817070 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.817119 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.817116 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.817165 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.817164 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.817206 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.873828 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.875475 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.875526 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.875535 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.875559 4781 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 02 09:20:37 crc kubenswrapper[4781]: E1202 09:20:37.876009 4781 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.194:6443: connect: connection refused" node="crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.942471 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.964167 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.973502 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.986162 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-52860de04b93c856618447865a1bbe89ca726b6383163c46d6dc4d90cc796930 WatchSource:0}: Error finding container 52860de04b93c856618447865a1bbe89ca726b6383163c46d6dc4d90cc796930: Status 404 returned error can't find the container with id 52860de04b93c856618447865a1bbe89ca726b6383163c46d6dc4d90cc796930 Dec 02 09:20:37 crc kubenswrapper[4781]: E1202 09:20:37.992356 4781 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="800ms" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.993320 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: I1202 09:20:37.997474 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 09:20:37 crc kubenswrapper[4781]: W1202 09:20:37.998799 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-a575e548f3f71f3e7fb2be708385f96e37d90778ce0a798f66ef0fc7950e6e1c WatchSource:0}: Error finding container a575e548f3f71f3e7fb2be708385f96e37d90778ce0a798f66ef0fc7950e6e1c: Status 404 returned error can't find the container with id a575e548f3f71f3e7fb2be708385f96e37d90778ce0a798f66ef0fc7950e6e1c Dec 02 09:20:38 crc kubenswrapper[4781]: W1202 09:20:38.001258 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-5da36a0bcbf210217f93520a78ee579295c30597584189eed973966fb1a6dbfe WatchSource:0}: Error finding container 5da36a0bcbf210217f93520a78ee579295c30597584189eed973966fb1a6dbfe: Status 404 returned error can't find the container with id 5da36a0bcbf210217f93520a78ee579295c30597584189eed973966fb1a6dbfe Dec 02 09:20:38 crc kubenswrapper[4781]: W1202 09:20:38.009985 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-281c7a9da2c78ff0ae094919ab326cb2da3e5ae49a8bf1ac42690e576636e0d6 WatchSource:0}: Error finding container 281c7a9da2c78ff0ae094919ab326cb2da3e5ae49a8bf1ac42690e576636e0d6: Status 404 returned error can't find the container with id 281c7a9da2c78ff0ae094919ab326cb2da3e5ae49a8bf1ac42690e576636e0d6 Dec 02 09:20:38 crc kubenswrapper[4781]: W1202 09:20:38.024043 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-0251d6e080e5a314ad23ba4db190b684dbe1a05d83090b8519d16f4df0a070c1 WatchSource:0}: Error finding container 0251d6e080e5a314ad23ba4db190b684dbe1a05d83090b8519d16f4df0a070c1: Status 404 returned error can't find the container with id 0251d6e080e5a314ad23ba4db190b684dbe1a05d83090b8519d16f4df0a070c1 Dec 02 09:20:38 crc kubenswrapper[4781]: I1202 09:20:38.277008 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 09:20:38 crc kubenswrapper[4781]: I1202 09:20:38.279459 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:38 crc kubenswrapper[4781]: I1202 09:20:38.279510 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:38 crc kubenswrapper[4781]: I1202 09:20:38.279525 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:38 crc kubenswrapper[4781]: I1202 09:20:38.279554 4781 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 02 09:20:38 crc kubenswrapper[4781]: E1202 09:20:38.280151 4781 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.194:6443: connect: connection refused" node="crc" Dec 02 09:20:38 crc kubenswrapper[4781]: 
I1202 09:20:38.386654 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Dec 02 09:20:38 crc kubenswrapper[4781]: I1202 09:20:38.388901 4781 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 11:32:27.91532256 +0000 UTC Dec 02 09:20:38 crc kubenswrapper[4781]: I1202 09:20:38.389014 4781 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 98h11m49.526313535s for next certificate rotation Dec 02 09:20:38 crc kubenswrapper[4781]: W1202 09:20:38.498066 4781 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Dec 02 09:20:38 crc kubenswrapper[4781]: E1202 09:20:38.498168 4781 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError" Dec 02 09:20:38 crc kubenswrapper[4781]: I1202 09:20:38.506099 4781 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34" exitCode=0 Dec 02 09:20:38 crc kubenswrapper[4781]: I1202 09:20:38.506184 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34"} Dec 02 09:20:38 crc kubenswrapper[4781]: I1202 09:20:38.506296 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5da36a0bcbf210217f93520a78ee579295c30597584189eed973966fb1a6dbfe"} Dec 02 09:20:38 crc kubenswrapper[4781]: I1202 09:20:38.506402 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 09:20:38 crc kubenswrapper[4781]: I1202 09:20:38.507432 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:38 crc kubenswrapper[4781]: I1202 09:20:38.507462 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:38 crc kubenswrapper[4781]: I1202 09:20:38.507472 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:38 crc kubenswrapper[4781]: I1202 09:20:38.509393 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"99c09074e0829703890129b7052956f56e1a93cad8d1bcf9e480b3a7277ae930"} Dec 02 09:20:38 crc kubenswrapper[4781]: I1202 09:20:38.509426 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a575e548f3f71f3e7fb2be708385f96e37d90778ce0a798f66ef0fc7950e6e1c"} Dec 02 09:20:38 crc kubenswrapper[4781]: I1202 09:20:38.510032 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 09:20:38 crc kubenswrapper[4781]: I1202 09:20:38.510886 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:38 crc kubenswrapper[4781]: I1202 09:20:38.510943 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:38 crc kubenswrapper[4781]: I1202 09:20:38.510958 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:38 crc kubenswrapper[4781]: I1202 09:20:38.511074 4781 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0" exitCode=0 Dec 02 09:20:38 crc kubenswrapper[4781]: I1202 09:20:38.511161 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0"} Dec 02 09:20:38 crc kubenswrapper[4781]: I1202 09:20:38.511218 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"52860de04b93c856618447865a1bbe89ca726b6383163c46d6dc4d90cc796930"} Dec 02 09:20:38 crc kubenswrapper[4781]: I1202 09:20:38.511333 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 09:20:38 crc kubenswrapper[4781]: I1202 09:20:38.512061 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:38 crc kubenswrapper[4781]: I1202 09:20:38.512078 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:38 crc kubenswrapper[4781]: I1202 09:20:38.512086 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:38 crc kubenswrapper[4781]: I1202 09:20:38.513256 4781 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="e9333a179840139fdbef9b921af14e8030141b9412cda7431466f7e74770b17f" exitCode=0 Dec 02 09:20:38 crc kubenswrapper[4781]: I1202 09:20:38.513310 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"e9333a179840139fdbef9b921af14e8030141b9412cda7431466f7e74770b17f"} Dec 02 09:20:38 crc kubenswrapper[4781]: I1202 09:20:38.513326 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"0251d6e080e5a314ad23ba4db190b684dbe1a05d83090b8519d16f4df0a070c1"} Dec 02 09:20:38 crc kubenswrapper[4781]: I1202 09:20:38.513368 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 09:20:38 crc kubenswrapper[4781]: I1202 09:20:38.513968 4781 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:38 crc kubenswrapper[4781]: I1202 09:20:38.513992 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:38 crc kubenswrapper[4781]: I1202 09:20:38.514001 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:38 crc kubenswrapper[4781]: I1202 09:20:38.517813 4781 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="6ae274f3b640c3661df1109d91d4ea85e85568a686ec292397695e2a7b8703dc" exitCode=0 Dec 02 09:20:38 crc kubenswrapper[4781]: I1202 09:20:38.517880 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"6ae274f3b640c3661df1109d91d4ea85e85568a686ec292397695e2a7b8703dc"} Dec 02 09:20:38 crc kubenswrapper[4781]: I1202 09:20:38.517910 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"281c7a9da2c78ff0ae094919ab326cb2da3e5ae49a8bf1ac42690e576636e0d6"} Dec 02 09:20:38 crc kubenswrapper[4781]: I1202 09:20:38.518044 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 09:20:38 crc kubenswrapper[4781]: I1202 09:20:38.519611 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:38 crc kubenswrapper[4781]: I1202 09:20:38.519719 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:38 crc kubenswrapper[4781]: I1202 09:20:38.519746 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:38 crc kubenswrapper[4781]: W1202 09:20:38.568123 4781 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Dec 02 09:20:38 crc kubenswrapper[4781]: E1202 09:20:38.568208 4781 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError" Dec 02 09:20:38 crc kubenswrapper[4781]: W1202 09:20:38.670212 4781 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Dec 02 09:20:38 crc kubenswrapper[4781]: E1202 09:20:38.670322 4781 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError" Dec 02 09:20:38 crc kubenswrapper[4781]: W1202 09:20:38.687154 4781 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Dec 02 09:20:38 crc kubenswrapper[4781]: E1202 09:20:38.687212 4781 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError" Dec 02 09:20:38 crc kubenswrapper[4781]: E1202 09:20:38.792842 4781 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="1.6s" Dec 02 09:20:39 crc kubenswrapper[4781]: I1202 09:20:39.081024 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 09:20:39 crc kubenswrapper[4781]: I1202 09:20:39.082627 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:39 crc kubenswrapper[4781]: I1202 09:20:39.082656 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:39 crc kubenswrapper[4781]: I1202 09:20:39.082665 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:39 crc kubenswrapper[4781]: I1202 09:20:39.082684 4781 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 02 09:20:39 crc kubenswrapper[4781]: E1202 09:20:39.083122 4781 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.194:6443: connect: connection refused" node="crc" Dec 02 09:20:39 crc kubenswrapper[4781]: E1202 09:20:39.110441 4781 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.194:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187d5b80b7256a60 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-02 09:20:37.38572656 +0000 UTC m=+0.209600479,LastTimestamp:2025-12-02 09:20:37.38572656 +0000 UTC m=+0.209600479,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 02 09:20:39 crc kubenswrapper[4781]: I1202 09:20:39.332613 4781 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 02 09:20:39 crc kubenswrapper[4781]: E1202 09:20:39.333598 4781 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError" Dec 02 09:20:39 
crc kubenswrapper[4781]: I1202 09:20:39.387008 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Dec 02 09:20:39 crc kubenswrapper[4781]: I1202 09:20:39.521589 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5bc9e488e47baf2860cd0133d661714b47495618ce410ee8ad0fa9292da3cfdc"} Dec 02 09:20:39 crc kubenswrapper[4781]: I1202 09:20:39.521953 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 09:20:39 crc kubenswrapper[4781]: I1202 09:20:39.521940 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2f99f962bf9ffb2475be15e4fe8d41c3fe3c870d092a50b96a19fe9c759462ee"} Dec 02 09:20:39 crc kubenswrapper[4781]: I1202 09:20:39.522025 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2a36c467812da990b1678a8465bf25d97c4d8226e88ed27eea452b35a75c5dac"} Dec 02 09:20:39 crc kubenswrapper[4781]: I1202 09:20:39.523534 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:39 crc kubenswrapper[4781]: I1202 09:20:39.523564 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:39 crc kubenswrapper[4781]: I1202 09:20:39.523580 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:39 crc kubenswrapper[4781]: I1202 09:20:39.525446 4781 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67" exitCode=0 Dec 02 09:20:39 crc kubenswrapper[4781]: I1202 09:20:39.525543 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67"} Dec 02 09:20:39 crc kubenswrapper[4781]: I1202 09:20:39.525724 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 09:20:39 crc kubenswrapper[4781]: I1202 09:20:39.526892 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:39 crc kubenswrapper[4781]: I1202 09:20:39.526914 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:39 crc kubenswrapper[4781]: I1202 09:20:39.526935 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:39 crc kubenswrapper[4781]: I1202 09:20:39.526917 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"0832cd774d62295ea9f634d660162043eeec7c36ff962f38fe35b8657c5a62f8"} Dec 02 09:20:39 crc 
kubenswrapper[4781]: I1202 09:20:39.527189 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 09:20:39 crc kubenswrapper[4781]: I1202 09:20:39.527910 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:39 crc kubenswrapper[4781]: I1202 09:20:39.527963 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:39 crc kubenswrapper[4781]: I1202 09:20:39.527973 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:39 crc kubenswrapper[4781]: I1202 09:20:39.529131 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5e25c4d1777f7fbfc56d52dac0ec8b209807c486a294c0e7062a49e58edb520b"} Dec 02 09:20:39 crc kubenswrapper[4781]: I1202 09:20:39.529171 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c426b8000b17c761c86ae3152bb08084f63533d41a01fa039f84f0a74b89499e"} Dec 02 09:20:39 crc kubenswrapper[4781]: I1202 09:20:39.529188 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5775ca067ce9fccaf09f228bfcf365fd53919bf4693f5dd7943183d1f14177b6"} Dec 02 09:20:39 crc kubenswrapper[4781]: I1202 09:20:39.529273 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 09:20:39 crc kubenswrapper[4781]: I1202 09:20:39.530282 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:39 crc kubenswrapper[4781]: I1202 09:20:39.530320 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:39 crc kubenswrapper[4781]: I1202 09:20:39.530331 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:39 crc kubenswrapper[4781]: I1202 09:20:39.536411 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"70a99d096eb7cf9386a857100ae0faff8a3b17140eaada581e5511a88aadc0f8"} Dec 02 09:20:39 crc kubenswrapper[4781]: I1202 09:20:39.536441 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0090e6856a6d66e284f4f4eecb2d193720318f33a0260d470a50af72c84f6719"} Dec 02 09:20:39 crc kubenswrapper[4781]: I1202 09:20:39.536451 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0179b61af209d72b1e1a79835f2819b3af773ab99b0c431733beba8a4dcf2eeb"} Dec 02 09:20:39 crc kubenswrapper[4781]: I1202 09:20:39.536460 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"08c7d87bc390c31adc6d7412985f5bbfe535ea6dccb586f9be2823a17bce3e2d"} Dec 02 09:20:39 crc kubenswrapper[4781]: I1202 09:20:39.683074 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 09:20:40 crc kubenswrapper[4781]: I1202 09:20:40.541444 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1e4ac81bbe6c15c4abc298de57f6a660b436587517adc7398afad02d8a189766"} Dec 02 09:20:40 crc kubenswrapper[4781]: I1202 09:20:40.541555 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 09:20:40 crc kubenswrapper[4781]: I1202 09:20:40.542443 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:40 crc kubenswrapper[4781]: I1202 09:20:40.542473 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:40 crc kubenswrapper[4781]: I1202 09:20:40.542482 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:40 crc kubenswrapper[4781]: I1202 09:20:40.544210 4781 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e" exitCode=0 Dec 02 09:20:40 crc kubenswrapper[4781]: I1202 09:20:40.544276 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e"} Dec 02 09:20:40 crc kubenswrapper[4781]: I1202 09:20:40.544282 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 09:20:40 crc kubenswrapper[4781]: I1202 09:20:40.544355 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 09:20:40 crc kubenswrapper[4781]: I1202 09:20:40.544360 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 09:20:40 crc kubenswrapper[4781]: I1202 09:20:40.545244 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:40 crc kubenswrapper[4781]: I1202 09:20:40.545278 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:40 crc kubenswrapper[4781]: I1202 09:20:40.545290 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:40 crc kubenswrapper[4781]: I1202 09:20:40.545291 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:40 crc kubenswrapper[4781]: I1202 09:20:40.545315 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:40 crc kubenswrapper[4781]: I1202 09:20:40.545332 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:40 crc kubenswrapper[4781]: I1202 09:20:40.545346 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 09:20:40 crc kubenswrapper[4781]: I1202 09:20:40.545332 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:40 crc kubenswrapper[4781]: I1202 09:20:40.545356 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:40 crc kubenswrapper[4781]: I1202 09:20:40.683827 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 09:20:40 crc kubenswrapper[4781]: I1202 09:20:40.685529 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:40 crc kubenswrapper[4781]: I1202 09:20:40.685558 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:40 crc kubenswrapper[4781]: I1202 09:20:40.685566 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:40 crc kubenswrapper[4781]: I1202 09:20:40.685597 4781 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 02 09:20:41 crc kubenswrapper[4781]: I1202 09:20:41.550619 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b8beb17df1dddccb32b1d93f03fd20551048bc1d150615e18bf73c1d364d7bc4"} Dec 02 09:20:41 crc kubenswrapper[4781]: I1202 09:20:41.550668 4781 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 09:20:41 crc kubenswrapper[4781]: I1202 09:20:41.550689 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"274dd3f5c42e9217f2a35c99709567f617ef1def1cbd5110cfdc8b5e88bce7ca"} Dec 02 09:20:41 crc kubenswrapper[4781]: I1202 09:20:41.550704 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7035e3038f33a299c00380bc79d4315ec612bd6f4009d55258f02b5c3ec70e2d"} Dec 02 09:20:41 crc kubenswrapper[4781]: I1202 09:20:41.550715 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c7af0caf9bd11c42c48bbf4dcd4e719840d2e2dbce55ec205dd2cc103b71772e"} Dec 02 09:20:41 crc kubenswrapper[4781]: I1202 09:20:41.550720 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 09:20:41 crc kubenswrapper[4781]: I1202 09:20:41.550726 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"40d1936331547e3941bdaae5c90f12874b6fda39f3b6b67f678156915473cc77"} Dec 02 09:20:41 crc kubenswrapper[4781]: I1202 09:20:41.550805 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 09:20:41 crc kubenswrapper[4781]: I1202 09:20:41.551621 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:41 crc kubenswrapper[4781]: I1202 09:20:41.551643 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:41 crc kubenswrapper[4781]: I1202 09:20:41.551651 4781 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:41 crc kubenswrapper[4781]: I1202 09:20:41.551683 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:41 crc kubenswrapper[4781]: I1202 09:20:41.551719 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:41 crc kubenswrapper[4781]: I1202 09:20:41.551732 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:42 crc kubenswrapper[4781]: I1202 09:20:42.115410 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 09:20:42 crc kubenswrapper[4781]: I1202 09:20:42.283324 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 09:20:42 crc kubenswrapper[4781]: I1202 09:20:42.552423 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 09:20:42 crc kubenswrapper[4781]: I1202 09:20:42.552423 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 09:20:42 crc kubenswrapper[4781]: I1202 09:20:42.553493 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:42 crc kubenswrapper[4781]: I1202 09:20:42.553526 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:42 crc kubenswrapper[4781]: I1202 09:20:42.553546 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:42 crc kubenswrapper[4781]: I1202 09:20:42.553576 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:42 crc kubenswrapper[4781]: I1202 09:20:42.553594 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:42 crc kubenswrapper[4781]: I1202 09:20:42.553605 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:43 crc kubenswrapper[4781]: I1202 09:20:43.435951 4781 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 02 09:20:43 crc kubenswrapper[4781]: I1202 09:20:43.554181 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 09:20:43 crc kubenswrapper[4781]: I1202 09:20:43.555087 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:43 crc kubenswrapper[4781]: I1202 09:20:43.555126 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:43 crc kubenswrapper[4781]: I1202 09:20:43.555137 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:43 crc kubenswrapper[4781]: I1202 09:20:43.779780 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 02 09:20:43 crc kubenswrapper[4781]: I1202 09:20:43.779960 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 09:20:43 crc 
kubenswrapper[4781]: I1202 09:20:43.781052 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:43 crc kubenswrapper[4781]: I1202 09:20:43.781091 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:43 crc kubenswrapper[4781]: I1202 09:20:43.781116 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:44 crc kubenswrapper[4781]: I1202 09:20:44.241521 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 09:20:44 crc kubenswrapper[4781]: I1202 09:20:44.243201 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 09:20:44 crc kubenswrapper[4781]: I1202 09:20:44.243319 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 09:20:44 crc kubenswrapper[4781]: I1202 09:20:44.244302 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:44 crc kubenswrapper[4781]: I1202 09:20:44.244341 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:44 crc kubenswrapper[4781]: I1202 09:20:44.244352 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:44 crc kubenswrapper[4781]: I1202 09:20:44.556131 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 09:20:44 crc kubenswrapper[4781]: I1202 09:20:44.556856 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:44 crc kubenswrapper[4781]: I1202 09:20:44.556880 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:44 crc kubenswrapper[4781]: I1202 09:20:44.556888 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:45 crc kubenswrapper[4781]: I1202 09:20:45.297970 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 09:20:45 crc kubenswrapper[4781]: I1202 09:20:45.298125 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 09:20:45 crc kubenswrapper[4781]: I1202 09:20:45.299723 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:45 crc kubenswrapper[4781]: I1202 09:20:45.299850 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:45 crc kubenswrapper[4781]: I1202 09:20:45.299951 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:46 crc kubenswrapper[4781]: I1202 09:20:46.142450 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 09:20:46 crc kubenswrapper[4781]: I1202 09:20:46.143162 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 09:20:46 crc kubenswrapper[4781]: I1202 
09:20:46.144182 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:46 crc kubenswrapper[4781]: I1202 09:20:46.144223 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:46 crc kubenswrapper[4781]: I1202 09:20:46.144235 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:46 crc kubenswrapper[4781]: I1202 09:20:46.149283 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 09:20:46 crc kubenswrapper[4781]: I1202 09:20:46.561025 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 09:20:46 crc kubenswrapper[4781]: I1202 09:20:46.561117 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 09:20:46 crc kubenswrapper[4781]: I1202 09:20:46.561749 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:46 crc kubenswrapper[4781]: I1202 09:20:46.561778 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:46 crc kubenswrapper[4781]: I1202 09:20:46.561787 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:47 crc kubenswrapper[4781]: I1202 09:20:47.244296 4781 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 02 09:20:47 crc kubenswrapper[4781]: I1202 09:20:47.244437 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 02 09:20:47 crc kubenswrapper[4781]: I1202 09:20:47.563550 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 09:20:47 crc kubenswrapper[4781]: I1202 09:20:47.564371 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:47 crc kubenswrapper[4781]: I1202 09:20:47.564418 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:47 crc kubenswrapper[4781]: I1202 09:20:47.564432 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:47 crc kubenswrapper[4781]: E1202 09:20:47.578709 4781 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 02 09:20:48 crc kubenswrapper[4781]: I1202 09:20:48.231009 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 02 09:20:48 crc kubenswrapper[4781]: I1202 09:20:48.231164 4781 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 09:20:48 crc kubenswrapper[4781]: I1202 09:20:48.232025 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:48 crc kubenswrapper[4781]: I1202 09:20:48.232064 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:48 crc kubenswrapper[4781]: I1202 09:20:48.232076 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:49 crc kubenswrapper[4781]: I1202 09:20:49.165351 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 09:20:49 crc kubenswrapper[4781]: I1202 09:20:49.165531 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 09:20:49 crc kubenswrapper[4781]: I1202 09:20:49.166944 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:49 crc kubenswrapper[4781]: I1202 09:20:49.166985 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:49 crc kubenswrapper[4781]: I1202 09:20:49.166998 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:50 crc kubenswrapper[4781]: I1202 09:20:50.091484 4781 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:37400->192.168.126.11:17697: read: connection reset by peer" start-of-body= Dec 02 09:20:50 crc kubenswrapper[4781]: I1202 09:20:50.091567 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:37400->192.168.126.11:17697: read: connection reset by peer" Dec 02 09:20:50 crc kubenswrapper[4781]: I1202 09:20:50.274232 4781 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 02 09:20:50 crc kubenswrapper[4781]: I1202 09:20:50.274293 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 02 09:20:50 crc kubenswrapper[4781]: I1202 09:20:50.278371 4781 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 02 
09:20:50 crc kubenswrapper[4781]: I1202 09:20:50.278424 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 02 09:20:50 crc kubenswrapper[4781]: I1202 09:20:50.572239 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 02 09:20:50 crc kubenswrapper[4781]: I1202 09:20:50.574018 4781 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1e4ac81bbe6c15c4abc298de57f6a660b436587517adc7398afad02d8a189766" exitCode=255 Dec 02 09:20:50 crc kubenswrapper[4781]: I1202 09:20:50.574053 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"1e4ac81bbe6c15c4abc298de57f6a660b436587517adc7398afad02d8a189766"} Dec 02 09:20:50 crc kubenswrapper[4781]: I1202 09:20:50.574187 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 09:20:50 crc kubenswrapper[4781]: I1202 09:20:50.575117 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:50 crc kubenswrapper[4781]: I1202 09:20:50.575166 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:50 crc kubenswrapper[4781]: I1202 09:20:50.575181 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:50 crc kubenswrapper[4781]: I1202 09:20:50.575840 4781 scope.go:117] "RemoveContainer" containerID="1e4ac81bbe6c15c4abc298de57f6a660b436587517adc7398afad02d8a189766" Dec 02 09:20:51 crc kubenswrapper[4781]: I1202 09:20:51.578294 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 02 09:20:51 crc kubenswrapper[4781]: I1202 09:20:51.579876 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0"} Dec 02 09:20:51 crc kubenswrapper[4781]: I1202 09:20:51.580071 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 09:20:51 crc kubenswrapper[4781]: I1202 09:20:51.580759 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:51 crc kubenswrapper[4781]: I1202 09:20:51.580791 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:51 crc kubenswrapper[4781]: I1202 09:20:51.580802 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:52 crc kubenswrapper[4781]: I1202 09:20:52.116115 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 09:20:52 crc kubenswrapper[4781]: I1202 09:20:52.582298 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Dec 02 09:20:52 crc kubenswrapper[4781]: I1202 09:20:52.583006 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:52 crc kubenswrapper[4781]: I1202 09:20:52.583077 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:52 crc kubenswrapper[4781]: I1202 09:20:52.583101 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:54 crc kubenswrapper[4781]: I1202 09:20:54.248258 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 09:20:54 crc kubenswrapper[4781]: I1202 09:20:54.248429 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 09:20:54 crc kubenswrapper[4781]: I1202 09:20:54.249625 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:54 crc kubenswrapper[4781]: I1202 09:20:54.249657 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:54 crc kubenswrapper[4781]: I1202 09:20:54.249666 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:54 crc kubenswrapper[4781]: I1202 09:20:54.254833 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 09:20:54 crc kubenswrapper[4781]: I1202 09:20:54.587032 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 09:20:54 crc kubenswrapper[4781]: I1202 09:20:54.587880 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:54 crc kubenswrapper[4781]: I1202 09:20:54.587904 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:54 crc kubenswrapper[4781]: I1202 09:20:54.587913 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:55 crc kubenswrapper[4781]: E1202 09:20:55.266754 4781 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="3.2s" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.268508 4781 trace.go:236] Trace[763619989]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Dec-2025 09:20:40.372) (total time: 14896ms): Dec 02 09:20:55 crc kubenswrapper[4781]: Trace[763619989]: ---"Objects listed" error: 14896ms (09:20:55.268) Dec 02 09:20:55 crc kubenswrapper[4781]: Trace[763619989]: [14.896088266s] [14.896088266s] END Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.268529 4781 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.271121 4781 trace.go:236] Trace[170577743]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Dec-2025 09:20:41.645) (total time: 13625ms): Dec 02 09:20:55 crc kubenswrapper[4781]: Trace[170577743]: ---"Objects listed" error: 13625ms (09:20:55.271) Dec 02 09:20:55 crc kubenswrapper[4781]: Trace[170577743]: 
[13.625691144s] [13.625691144s] END Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.271317 4781 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.272586 4781 trace.go:236] Trace[803509434]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Dec-2025 09:20:40.985) (total time: 14287ms): Dec 02 09:20:55 crc kubenswrapper[4781]: Trace[803509434]: ---"Objects listed" error: 14287ms (09:20:55.272) Dec 02 09:20:55 crc kubenswrapper[4781]: Trace[803509434]: [14.287505327s] [14.287505327s] END Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.272606 4781 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.272718 4781 trace.go:236] Trace[231572267]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Dec-2025 09:20:40.582) (total time: 14690ms): Dec 02 09:20:55 crc kubenswrapper[4781]: Trace[231572267]: ---"Objects listed" error: 14689ms (09:20:55.271) Dec 02 09:20:55 crc kubenswrapper[4781]: Trace[231572267]: [14.690213942s] [14.690213942s] END Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.272772 4781 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 02 09:20:55 crc kubenswrapper[4781]: E1202 09:20:55.273829 4781 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.278881 4781 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.289554 4781 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.309725 4781 csr.go:261] certificate signing request csr-jfhx4 is approved, waiting to be issued Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.318517 4781 csr.go:257] certificate signing request csr-jfhx4 is issued Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.320477 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.325052 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.392257 4781 apiserver.go:52] "Watching apiserver" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.395139 4781 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.395394 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.395658 4781 util.go:30] "No sandbox for pod can be found. 
Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.395658 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.395706 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.395813 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.396065 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Dec 02 09:20:55 crc kubenswrapper[4781]: E1202 09:20:55.396074 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 02 09:20:55 crc kubenswrapper[4781]: E1202 09:20:55.396268 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.396628 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.396688 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
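Each "Error syncing pod, skipping" record above is the kubelet refusing to build a pod sandbox because no CNI network definition exists yet in /etc/kubernetes/cni/net.d/; these clear once the network provider writes its config. A hedged sketch of that on-disk condition (the path is quoted from the log; the kubelet actually learns NetworkReady through the CRI status call rather than by globbing the directory):

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// Check whether any CNI network definition (.conf/.conflist) is present in
// the directory named in the NetworkPluginNotReady error above.
func main() {
	confDir := "/etc/kubernetes/cni/net.d" // path quoted in the log records
	matches, err := filepath.Glob(filepath.Join(confDir, "*.conf*"))
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	if len(matches) == 0 {
		fmt.Println("NetworkReady=false: no CNI configuration file in", confDir)
		os.Exit(1)
	}
	for _, m := range matches {
		fmt.Println("CNI config present:", m)
	}
}
```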
Dec 02 09:20:55 crc kubenswrapper[4781]: E1202 09:20:55.396850 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.397498 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.397690 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.397974 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.398691 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.398810 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.398854 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.398997 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.399500 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.400722 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.418245 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"]
Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.439766 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.459051 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8157f9-9d69-41d0-8158-f5d166743724\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a36c467812da990b1678a8465bf25d97c4d8226e88ed27eea452b35a75c5dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c09074e0829703890129b7052956f56e1a93cad8d1bcf9e480b3a7277ae930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f99f962bf9ffb2475be15e4fe8d41c3fe3c870d092a50b96a19fe9c759462ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc9e488e47baf2860cd0133d661714b47495618ce410ee8ad0fa9292da3cfdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.474159 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.483701 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.492396 4781 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.494171 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.504506 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.515984 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.526450 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.579697 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.579743 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.579758 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.579776 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.579794 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.579810 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.579827 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 
09:20:55.579843 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.579859 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.579873 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.579910 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.579941 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.579954 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.579969 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.579983 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.579998 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580018 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: 
\"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580035 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580052 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580069 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580085 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580101 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580116 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580130 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580149 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580169 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580190 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod 
\"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580206 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580223 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580238 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580321 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580286 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580369 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580392 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580412 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580432 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580478 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580494 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580510 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580525 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580539 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580579 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580596 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580612 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580627 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580642 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580657 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580674 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580689 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580704 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580724 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580739 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580756 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580775 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580791 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580807 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580848 4781 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580864 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580881 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580897 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580915 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580946 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580962 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580979 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580996 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581010 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 02 09:20:55 
crc kubenswrapper[4781]: I1202 09:20:55.581025 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581039 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581057 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581073 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581091 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581107 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581122 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581138 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581166 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581181 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 09:20:55 crc 
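The long run of reconciler_common.go records in this stretch is the kubelet's volume manager tearing down volumes that belong to pods deleted while it was offline; each UnmountVolume starts asynchronously, and the interleaved operation_generator.go line reports one TearDown completing. A read-only diagnostic sketch that lists what is still materialized under /var/lib/kubelet/pods, the tree these operations are draining (assumes it runs on the node with privileges to read kubelet state):

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// Walk /var/lib/kubelet/pods and print every volume directory that is still
// present, grouped by pod UID and volume plugin (e.g. kubernetes.io~secret).
func main() {
	podsDir := "/var/lib/kubelet/pods"
	pods, err := os.ReadDir(podsDir)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	for _, pod := range pods {
		volRoot := filepath.Join(podsDir, pod.Name(), "volumes")
		plugins, err := os.ReadDir(volRoot)
		if err != nil {
			continue // no volumes dir left: this pod is already cleaned up
		}
		for _, plugin := range plugins {
			vols, _ := os.ReadDir(filepath.Join(volRoot, plugin.Name()))
			for _, v := range vols {
				fmt.Printf("pod=%s plugin=%s volume=%s\n", pod.Name(), plugin.Name(), v.Name())
			}
		}
	}
}
```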
kubenswrapper[4781]: I1202 09:20:55.581200 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581215 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581235 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581251 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581265 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581281 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581296 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581311 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581327 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581343 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 02 09:20:55 crc 
kubenswrapper[4781]: I1202 09:20:55.581358 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581373 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581388 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581406 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581423 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581439 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581458 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581474 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581490 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581506 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 02 09:20:55 
crc kubenswrapper[4781]: I1202 09:20:55.581522 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581539 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581555 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581570 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581587 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581602 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581618 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581634 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581649 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581666 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" 
(UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581683 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581700 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581718 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581734 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581749 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581765 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581781 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581797 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581811 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581827 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod 
\"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581844 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581861 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581877 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581895 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581915 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581953 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581973 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581989 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582011 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582031 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582048 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582065 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582081 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582098 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582113 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582135 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582165 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582186 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582202 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582219 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582237 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582253 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582277 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582295 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582314 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582331 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582346 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582364 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582382 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 09:20:55 crc 
kubenswrapper[4781]: I1202 09:20:55.582399 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582415 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582431 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582447 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582463 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582479 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582497 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582564 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582586 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582605 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " 
Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582624 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582643 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582661 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582677 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582694 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582710 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582726 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582741 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582758 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582775 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582794 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582811 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582829 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582846 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582861 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582878 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582894 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582914 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.583030 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.583055 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.583075 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.583092 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.583108 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.583125 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.583145 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.583167 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.583190 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.583212 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.583235 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.583251 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.583270 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.583286 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.583302 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.583321 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.583338 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.583354 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.583371 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.583389 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.583406 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.583423 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.583440 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.583474 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.583505 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.583523 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.583542 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.583559 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.583581 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.583598 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.583617 4781 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.583635 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.583655 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.583691 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.583709 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.583729 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.583746 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.583785 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580332 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). 
InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580355 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580373 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580378 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580557 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580585 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580595 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580803 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.580979 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581017 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581073 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581143 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581167 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581278 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581283 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581356 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581394 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581415 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581424 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581473 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581633 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581642 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581682 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581704 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581707 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581831 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581867 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.581898 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582016 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582034 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582181 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582188 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582214 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582211 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582252 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582272 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582352 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582375 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582430 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582498 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582555 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582559 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582563 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582746 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582795 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582797 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582810 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582827 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582880 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.582913 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.583022 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.583042 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.583318 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.583339 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.583393 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.584964 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.585241 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.585350 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.585440 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.585455 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.585507 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.585611 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.585620 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.585783 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.585781 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.585786 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.586010 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.586066 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.586106 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.586202 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.586211 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.586241 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.586274 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.586468 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.586498 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.586584 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.586723 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.586912 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.587114 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.584520 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.587553 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.587585 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.587602 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.587618 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). 
InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.587648 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.587673 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.588073 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.588237 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.588266 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.588445 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.588474 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.588543 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). 
InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.588582 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.588603 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.588657 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.589712 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.589781 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.589736 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.589630 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.589968 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.590052 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.590010 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.590326 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.590708 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.590824 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.590977 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.591266 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.591358 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.591538 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.591711 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.591869 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.591955 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.591978 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.592145 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). 
InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.592284 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.592322 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.592509 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.592555 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.592585 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.592693 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.593266 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.593385 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.592701 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.592715 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.592864 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.592985 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.593439 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.593479 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.593489 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). 
InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.593640 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.593891 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.593982 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.594125 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: E1202 09:20:55.594281 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:20:56.094256629 +0000 UTC m=+18.918130528 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.595527 4781 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.596725 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.596722 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.597308 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.597682 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.597734 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.598093 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.598788 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.598821 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.599133 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: E1202 09:20:55.599432 4781 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 09:20:55 crc kubenswrapper[4781]: E1202 09:20:55.599598 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 09:20:56.099582059 +0000 UTC m=+18.923455948 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 09:20:55 crc kubenswrapper[4781]: E1202 09:20:55.599760 4781 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 09:20:55 crc kubenswrapper[4781]: E1202 09:20:55.599866 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 09:20:56.099833768 +0000 UTC m=+18.923707657 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.600145 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.602836 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.604812 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.604869 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.604960 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.605951 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 09:20:55 crc kubenswrapper[4781]: E1202 09:20:55.608138 4781 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.609052 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.610623 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.611356 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.611487 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.613156 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.615614 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.615822 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.615902 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.616206 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.616323 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.616713 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.616983 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: E1202 09:20:55.617115 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 09:20:55 crc kubenswrapper[4781]: E1202 09:20:55.617150 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 09:20:55 crc kubenswrapper[4781]: E1202 09:20:55.617170 4781 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 09:20:55 crc kubenswrapper[4781]: E1202 09:20:55.617361 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 09:20:56.117314358 +0000 UTC m=+18.941188327 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.617663 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: E1202 09:20:55.617676 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 09:20:55 crc kubenswrapper[4781]: E1202 09:20:55.618679 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 09:20:55 crc kubenswrapper[4781]: E1202 09:20:55.618694 4781 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 09:20:55 crc kubenswrapper[4781]: E1202 09:20:55.618754 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 09:20:56.118738917 +0000 UTC m=+18.942612786 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.619351 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.619533 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.621149 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.623063 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.623106 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.623398 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.623636 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.624046 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.628690 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.631231 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.631544 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.631544 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.631719 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.631962 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.632620 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.632630 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.632798 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.632792 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.633187 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). 
InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.633335 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.637537 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.637644 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.637797 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.638050 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.638115 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.637875 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.637916 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.638150 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.638376 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.637962 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.638021 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.638068 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.638481 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.638700 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.639535 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.640091 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.650548 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.650874 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.660960 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.671129 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.684760 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.684826 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.684889 4781 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.684901 4781 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.684910 4781 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.684945 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.684955 4781 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.684963 4781 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.684975 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.684995 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.684991 4781 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" 
Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685062 4781 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685078 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685090 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685102 4781 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685113 4781 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685125 4781 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685136 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685147 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685158 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685169 4781 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685180 4781 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685190 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685201 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc 
kubenswrapper[4781]: I1202 09:20:55.685213 4781 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685223 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685233 4781 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685242 4781 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685252 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685263 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685273 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685283 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685293 4781 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685303 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685314 4781 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685325 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685335 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 
09:20:55.685345 4781 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685356 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685366 4781 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685376 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685386 4781 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685397 4781 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685407 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685418 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685427 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685438 4781 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685448 4781 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685461 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685471 4781 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685481 4781 reconciler_common.go:293] "Volume 
detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685492 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685502 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685513 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685524 4781 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685535 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685547 4781 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685558 4781 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685571 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685581 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685592 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685604 4781 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685616 4781 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685627 4781 reconciler_common.go:293] "Volume detached for 
volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685638 4781 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685648 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685658 4781 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685668 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685678 4781 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685688 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685698 4781 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685711 4781 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685723 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685735 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685746 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685755 4781 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685766 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685778 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685786 4781 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685796 4781 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685805 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685815 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685823 4781 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685833 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685841 4781 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685850 4781 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685886 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685895 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685904 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.685912 4781 reconciler_common.go:293] "Volume detached 
for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.686023 4781 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.686036 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.686045 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.686054 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.686133 4781 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.686145 4781 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.686156 4781 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.686168 4781 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.686177 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.686186 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.686194 4781 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.686203 4781 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.686211 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.686220 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.686228 4781 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.686236 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.686246 4781 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.686255 4781 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.686263 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.686271 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.686279 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.686288 4781 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.686296 4781 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.686304 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.686313 4781 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.686840 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: 
\"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.686853 4781 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.686885 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.686895 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.686905 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.686913 4781 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.686942 4781 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.686951 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.686961 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.686971 4781 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.686979 4781 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.686988 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687020 4781 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687030 4781 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" 
(UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687057 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687066 4781 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687075 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687083 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687092 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687143 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687156 4781 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687167 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687177 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687189 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687201 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687211 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687219 4781 
reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687227 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687236 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687243 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687252 4781 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687260 4781 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687267 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687275 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687283 4781 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687292 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687299 4781 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687307 4781 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687316 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687325 4781 reconciler_common.go:293] "Volume detached for 
volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687343 4781 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687377 4781 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687388 4781 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687399 4781 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687422 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687431 4781 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687439 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687448 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687456 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687465 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687474 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687483 4781 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc 
kubenswrapper[4781]: I1202 09:20:55.687492 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687501 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687510 4781 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687539 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687548 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687557 4781 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687565 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687573 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687640 4781 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687649 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687658 4781 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687666 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687675 4781 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 02 
09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687688 4781 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687705 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687718 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687730 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687742 4781 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687753 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687763 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687774 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687785 4781 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687795 4781 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687806 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687817 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687827 4781 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc 
kubenswrapper[4781]: I1202 09:20:55.687838 4781 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687848 4781 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687858 4781 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.687869 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.708535 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.715475 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 09:20:55 crc kubenswrapper[4781]: W1202 09:20:55.722535 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-2da9483675d59303020fed38795a02f5599be2d67df7def414f1e24f4de6e2b3 WatchSource:0}: Error finding container 2da9483675d59303020fed38795a02f5599be2d67df7def414f1e24f4de6e2b3: Status 404 returned error can't find the container with id 2da9483675d59303020fed38795a02f5599be2d67df7def414f1e24f4de6e2b3 Dec 02 09:20:55 crc kubenswrapper[4781]: W1202 09:20:55.726670 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-466d12a29aa6f45c5d3927d5fa08b5141f2b640d1c5093c5705a1f8b11b8436c WatchSource:0}: Error finding container 466d12a29aa6f45c5d3927d5fa08b5141f2b640d1c5093c5705a1f8b11b8436c: Status 404 returned error can't find the container with id 466d12a29aa6f45c5d3927d5fa08b5141f2b640d1c5093c5705a1f8b11b8436c Dec 02 09:20:55 crc kubenswrapper[4781]: I1202 09:20:55.728196 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 09:20:55 crc kubenswrapper[4781]: W1202 09:20:55.742156 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-40b9be6aadb1798ecebecaab4a428c0aad778408f5d9851def09df12d0768604 WatchSource:0}: Error finding container 40b9be6aadb1798ecebecaab4a428c0aad778408f5d9851def09df12d0768604: Status 404 returned error can't find the container with id 40b9be6aadb1798ecebecaab4a428c0aad778408f5d9851def09df12d0768604 Dec 02 09:20:56 crc kubenswrapper[4781]: I1202 09:20:56.191232 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:20:56 crc kubenswrapper[4781]: I1202 09:20:56.191309 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 09:20:56 crc kubenswrapper[4781]: I1202 09:20:56.191334 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:20:56 crc kubenswrapper[4781]: I1202 09:20:56.191375 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:20:56 crc kubenswrapper[4781]: I1202 09:20:56.191393 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:20:56 crc kubenswrapper[4781]: E1202 09:20:56.191458 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:20:57.191428157 +0000 UTC m=+20.015302046 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:20:56 crc kubenswrapper[4781]: E1202 09:20:56.191483 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 09:20:56 crc kubenswrapper[4781]: E1202 09:20:56.191509 4781 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 09:20:56 crc kubenswrapper[4781]: E1202 09:20:56.191526 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 09:20:56 crc kubenswrapper[4781]: E1202 09:20:56.191540 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 09:20:56 crc kubenswrapper[4781]: E1202 09:20:56.191564 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 09:20:56 crc kubenswrapper[4781]: E1202 09:20:56.191577 4781 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 09:20:56 crc kubenswrapper[4781]: E1202 09:20:56.191545 4781 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 09:20:56 crc kubenswrapper[4781]: E1202 09:20:56.191483 4781 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 09:20:56 crc kubenswrapper[4781]: E1202 09:20:56.191552 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 09:20:57.19153921 +0000 UTC m=+20.015413089 (durationBeforeRetry 1s). 
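A note on the "No retries permitted until ... (durationBeforeRetry 1s)" entries above: the gate timestamp is simply the failure time plus the current backoff. A minimal sketch (not kubelet source; the failure time is inferred by subtracting the logged 1s backoff from the logged "until" timestamp) reproduces the figure:

```go
// Sketch of the retry-gate arithmetic in the nestedpendingoperations entries
// above. Values are copied from the log; this is illustrative only.
package main

import (
	"fmt"
	"time"
)

func main() {
	// Failure time implied by the log ("until" timestamp minus the 1s backoff).
	failed, _ := time.Parse(time.RFC3339Nano, "2025-12-02T09:20:56.191428157Z")
	durationBeforeRetry := 1 * time.Second
	// The operation is gated until failure time plus the backoff.
	fmt.Println(failed.Add(durationBeforeRetry).UTC()) // 2025-12-02 09:20:57.191428157 +0000 UTC
}
```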
Dec 02 09:20:56 crc kubenswrapper[4781]: E1202 09:20:56.191711 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 09:20:57.191698056 +0000 UTC m=+20.015571975 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 02 09:20:56 crc kubenswrapper[4781]: E1202 09:20:56.191728 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 09:20:57.191719526 +0000 UTC m=+20.015593485 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 02 09:20:56 crc kubenswrapper[4781]: E1202 09:20:56.191742 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 09:20:57.191735487 +0000 UTC m=+20.015609456 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 02 09:20:56 crc kubenswrapper[4781]: I1202 09:20:56.319331 4781 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-12-02 09:15:55 +0000 UTC, rotation deadline is 2026-10-25 17:10:44.490954495 +0000 UTC
Dec 02 09:20:56 crc kubenswrapper[4781]: I1202 09:20:56.319394 4781 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7855h49m48.171563117s for next certificate rotation
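The two certificate_manager entries above fix both inputs of the rotation wait: the wait is the rotation deadline minus the current time. A minimal sketch (values copied from the log, fractional seconds dropped; not kubelet source) reproduces the logged figure:

```go
// Sketch of the rotation-wait arithmetic: deadline minus "now" yields the
// "Waiting 7855h49m48..." duration logged by certificate_manager.go.
package main

import (
	"fmt"
	"time"
)

func main() {
	now, _ := time.Parse(time.RFC3339, "2025-12-02T09:20:56Z")      // log timestamp
	deadline, _ := time.Parse(time.RFC3339, "2026-10-25T17:10:44Z") // rotation deadline from the log
	fmt.Println(deadline.Sub(now)) // 7855h49m48s
}
```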
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:20:56 crc kubenswrapper[4781]: E1202 09:20:56.498732 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 09:20:56 crc kubenswrapper[4781]: I1202 09:20:56.498805 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 09:20:56 crc kubenswrapper[4781]: E1202 09:20:56.498873 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 09:20:56 crc kubenswrapper[4781]: I1202 09:20:56.597074 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"40b9be6aadb1798ecebecaab4a428c0aad778408f5d9851def09df12d0768604"} Dec 02 09:20:56 crc kubenswrapper[4781]: I1202 09:20:56.599158 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"64a3a44e4812364947f0dd8eed3d3f7252d6e37e5057b9f180eca5b8afc472d9"} Dec 02 09:20:56 crc kubenswrapper[4781]: I1202 09:20:56.599207 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"9866ea1416094cbfe3f1e850261118f82b1d977589862e7f59e13b33fc24295a"} Dec 02 09:20:56 crc kubenswrapper[4781]: I1202 09:20:56.599219 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"466d12a29aa6f45c5d3927d5fa08b5141f2b640d1c5093c5705a1f8b11b8436c"} Dec 02 09:20:56 crc kubenswrapper[4781]: I1202 09:20:56.601870 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"c464aa1e840c3a97b63b6c760fca6c21e7a182269a602a688c70525b4bbb39a8"} Dec 02 09:20:56 crc kubenswrapper[4781]: I1202 09:20:56.601913 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"2da9483675d59303020fed38795a02f5599be2d67df7def414f1e24f4de6e2b3"} Dec 02 09:20:56 crc kubenswrapper[4781]: I1202 09:20:56.615659 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-kgbn2"] Dec 02 09:20:56 crc kubenswrapper[4781]: I1202 09:20:56.615937 4781 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-dns/node-resolver-kgbn2" Dec 02 09:20:56 crc kubenswrapper[4781]: I1202 09:20:56.618495 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 02 09:20:56 crc kubenswrapper[4781]: I1202 09:20:56.618663 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 02 09:20:56 crc kubenswrapper[4781]: I1202 09:20:56.618770 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 02 09:20:56 crc kubenswrapper[4781]: I1202 09:20:56.619176 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8157f9-9d69-41d0-8158-f5d166743724\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a36c467812da990b1678a8465bf25d97c4d8226e88ed27eea452b35a75c5dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c09074e0829703890129b7052956f56e1a93cad8d1bcf9e480b3a7277ae930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f99f962bf9ffb2475be15e4fe8d41c3fe3c870d092a50b96a19fe9c759462ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ff
ac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc9e488e47baf2860cd0133d661714b47495618ce410ee8ad0fa9292da3cfdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:56Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:56 crc kubenswrapper[4781]: I1202 09:20:56.634251 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
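Every status patch in this stretch fails on the same TLS error: the network-node-identity webhook's serving certificate has a NotAfter of 2025-08-24T17:21:41Z, while the kubelet's clock reads 2025-12-02T09:20:56Z. A minimal sketch of the validity-window comparison (times copied from the log; crypto/x509 performs the equivalent check against a certificate's NotBefore/NotAfter during verification, this is not the library's source):

```go
// Sketch of the check behind "x509: certificate has expired or is not yet
// valid": the verification time must fall inside [NotBefore, NotAfter].
package main

import (
	"fmt"
	"time"
)

func main() {
	notAfter, _ := time.Parse(time.RFC3339, "2025-08-24T17:21:41Z") // webhook cert NotAfter, from the log
	current, _ := time.Parse(time.RFC3339, "2025-12-02T09:20:56Z") // kubelet's clock at the time of the call
	if current.After(notAfter) {
		fmt.Printf("certificate has expired: current time %s is after %s\n",
			current.Format(time.RFC3339), notAfter.Format(time.RFC3339))
	}
}
```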
Dec 02 09:20:56 crc kubenswrapper[4781]: I1202 09:20:56.634251 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:56Z is after 2025-08-24T17:21:41Z"
Dec 02 09:20:56 crc kubenswrapper[4781]: I1202 09:20:56.645175 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:56Z is after 2025-08-24T17:21:41Z"
Dec 02 09:20:56 crc kubenswrapper[4781]: I1202 09:20:56.657663 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a3a44e4812364947f0dd8eed3d3f7252d6e37e5057b9f180eca5b8afc472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9866ea1416094cbfe3f1e850261118f82b1d977589862e7f59e13b33fc24295a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:56Z is after 2025-08-24T17:21:41Z"
Dec 02 09:20:56 crc kubenswrapper[4781]: I1202 09:20:56.671751 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:56Z is after 2025-08-24T17:21:41Z"
Dec 02 09:20:56 crc kubenswrapper[4781]: I1202 09:20:56.683887 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:56Z is after 2025-08-24T17:21:41Z"
Dec 02 09:20:56 crc kubenswrapper[4781]: I1202 09:20:56.704397 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:56Z is after 2025-08-24T17:21:41Z"
Dec 02 09:20:56 crc kubenswrapper[4781]: I1202 09:20:56.722557 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c464aa1e840c3a97b63b6c760fca6c21e7a182269a602a688c70525b4bbb39a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:56Z is after 2025-08-24T17:21:41Z"
Dec 02 09:20:56 crc kubenswrapper[4781]: I1202 09:20:56.735364 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:56Z is after 2025-08-24T17:21:41Z"
Dec 02 09:20:56 crc kubenswrapper[4781]: I1202 09:20:56.748143 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a3a44e4812364947f0dd8eed3d3f7252d6e37e5057b9f180eca5b8afc472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9866ea1416094cbfe3f1e850261118f82b1d977589862e7f59e13b33fc24295a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:56Z is after 2025-08-24T17:21:41Z"
Dec 02 09:20:56 crc kubenswrapper[4781]: I1202 09:20:56.759503 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:56Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:56 crc kubenswrapper[4781]: I1202 09:20:56.767855 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:56Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:56 crc kubenswrapper[4781]: I1202 09:20:56.776448 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kgbn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9440f972-ed59-4852-a180-3d5a2111f966\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hxs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kgbn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:56Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:56 crc kubenswrapper[4781]: I1202 09:20:56.786550 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8157f9-9d69-41d0-8158-f5d166743724\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a36c467812da990b1678a8465bf25d97c4d8226e88ed27eea452b35a75c5dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c09074e0829703890129b7052956f56e1a93cad8d1bcf9e480b3a7277ae930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd7
91fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f99f962bf9ffb2475be15e4fe8d41c3fe3c870d092a50b96a19fe9c759462ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc9e488e47baf2860cd0133d661714b47495618ce410ee8ad0fa9292da3cfdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:56Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:56 crc kubenswrapper[4781]: I1202 09:20:56.797739 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:56Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:56 crc kubenswrapper[4781]: I1202 09:20:56.797873 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hxs7\" (UniqueName: \"kubernetes.io/projected/9440f972-ed59-4852-a180-3d5a2111f966-kube-api-access-9hxs7\") pod \"node-resolver-kgbn2\" (UID: \"9440f972-ed59-4852-a180-3d5a2111f966\") " pod="openshift-dns/node-resolver-kgbn2" Dec 02 09:20:56 crc kubenswrapper[4781]: I1202 09:20:56.798108 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9440f972-ed59-4852-a180-3d5a2111f966-hosts-file\") pod \"node-resolver-kgbn2\" (UID: \"9440f972-ed59-4852-a180-3d5a2111f966\") " pod="openshift-dns/node-resolver-kgbn2" Dec 02 09:20:56 crc kubenswrapper[4781]: I1202 09:20:56.898800 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9440f972-ed59-4852-a180-3d5a2111f966-hosts-file\") pod \"node-resolver-kgbn2\" (UID: \"9440f972-ed59-4852-a180-3d5a2111f966\") " pod="openshift-dns/node-resolver-kgbn2" Dec 02 09:20:56 crc kubenswrapper[4781]: I1202 09:20:56.899061 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hxs7\" (UniqueName: 
\"kubernetes.io/projected/9440f972-ed59-4852-a180-3d5a2111f966-kube-api-access-9hxs7\") pod \"node-resolver-kgbn2\" (UID: \"9440f972-ed59-4852-a180-3d5a2111f966\") " pod="openshift-dns/node-resolver-kgbn2" Dec 02 09:20:56 crc kubenswrapper[4781]: I1202 09:20:56.898989 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9440f972-ed59-4852-a180-3d5a2111f966-hosts-file\") pod \"node-resolver-kgbn2\" (UID: \"9440f972-ed59-4852-a180-3d5a2111f966\") " pod="openshift-dns/node-resolver-kgbn2" Dec 02 09:20:56 crc kubenswrapper[4781]: I1202 09:20:56.920484 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hxs7\" (UniqueName: \"kubernetes.io/projected/9440f972-ed59-4852-a180-3d5a2111f966-kube-api-access-9hxs7\") pod \"node-resolver-kgbn2\" (UID: \"9440f972-ed59-4852-a180-3d5a2111f966\") " pod="openshift-dns/node-resolver-kgbn2" Dec 02 09:20:56 crc kubenswrapper[4781]: I1202 09:20:56.931219 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-kgbn2" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.009737 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-dnkgc"] Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.010236 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-dnkgc" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.010410 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-pzntm"] Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.010803 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.011892 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.012075 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.012160 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.012238 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.013449 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.013873 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.014090 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.014215 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.014360 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.015014 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.029617 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.043731 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a3a44e4812364947f0dd8eed3d3f7252d6e37e5057b9f180eca5b8afc472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9866ea1416094cbfe3f1e850261118f82b1d977589862e7f59e13b33fc24295a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.056548 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.071619 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c464aa1e840c3a97b63b6c760fca6c21e7a182269a602a688c70525b4bbb39a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.087941 4781 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-kgbn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9440f972-ed59-4852-a180-3d5a2111f966\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hxs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kgbn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.113260 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dnkgc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aa44429-873b-48f6-bc13-55745827d8fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plu
gin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dnkgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.128085 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8157f9-9d69-41d0-8158-f5d166743724\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a36c467812da990b1678a8465bf25d97c4d8226e88ed27eea452b35a75c5dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c09074e0829703890129b7052956f56e1a93cad8d1bcf9e480b3a7277ae930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f99f962bf9ffb2475be15e4fe8d41c3fe3c870d092a50b96a19fe9c759462ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc9e488e47baf2860cd0133d661714b47495618ce410ee8ad0fa9292da3cfdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.141452 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.156465 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.165791 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kgbn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9440f972-ed59-4852-a180-3d5a2111f966\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hxs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kgbn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.177131 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.188320 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.199188 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8157f9-9d69-41d0-8158-f5d166743724\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a36c467812da990b1678a8465bf25d97c4d8226e88ed27eea452b35a75c5dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c09074e0829703890129b7052956f56e1a93cad8d1bcf9e480b3a7277ae930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f99f962bf9ffb2475be15e4fe8d41c3fe3c870d092a50b96a19fe9c759462ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc9e488e47baf2860cd0133d661714b47495618ce410ee8ad0fa9292da3cfdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.200858 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.200941 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3aa44429-873b-48f6-bc13-55745827d8fa-cnibin\") pod \"multus-additional-cni-plugins-dnkgc\" (UID: \"3aa44429-873b-48f6-bc13-55745827d8fa\") " pod="openshift-multus/multus-additional-cni-plugins-dnkgc"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.200963 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9svf\" (UniqueName: \"kubernetes.io/projected/3aa44429-873b-48f6-bc13-55745827d8fa-kube-api-access-q9svf\") pod \"multus-additional-cni-plugins-dnkgc\" (UID: \"3aa44429-873b-48f6-bc13-55745827d8fa\") " pod="openshift-multus/multus-additional-cni-plugins-dnkgc"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.200989 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 02 09:20:57 crc kubenswrapper[4781]: E1202 09:20:57.201025 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:20:59.201000871 +0000 UTC m=+22.024874740 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.201064 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3aa44429-873b-48f6-bc13-55745827d8fa-os-release\") pod \"multus-additional-cni-plugins-dnkgc\" (UID: \"3aa44429-873b-48f6-bc13-55745827d8fa\") " pod="openshift-multus/multus-additional-cni-plugins-dnkgc"
Dec 02 09:20:57 crc kubenswrapper[4781]: E1202 09:20:57.201114 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 02 09:20:57 crc kubenswrapper[4781]: E1202 09:20:57.201135 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.201140 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bph4w\" (UniqueName: \"kubernetes.io/projected/e10258da-dad3-4df8-82c2-9d9438493a3d-kube-api-access-bph4w\") pod \"machine-config-daemon-pzntm\" (UID: \"e10258da-dad3-4df8-82c2-9d9438493a3d\") " pod="openshift-machine-config-operator/machine-config-daemon-pzntm"
Dec 02 09:20:57 crc kubenswrapper[4781]: E1202 09:20:57.201149 4781 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.201193 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 02 09:20:57 crc kubenswrapper[4781]: E1202 09:20:57.201200 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 09:20:59.201185587 +0000 UTC m=+22.025059456 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.201215 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3aa44429-873b-48f6-bc13-55745827d8fa-system-cni-dir\") pod \"multus-additional-cni-plugins-dnkgc\" (UID: \"3aa44429-873b-48f6-bc13-55745827d8fa\") " pod="openshift-multus/multus-additional-cni-plugins-dnkgc"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.201230 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3aa44429-873b-48f6-bc13-55745827d8fa-cni-binary-copy\") pod \"multus-additional-cni-plugins-dnkgc\" (UID: \"3aa44429-873b-48f6-bc13-55745827d8fa\") " pod="openshift-multus/multus-additional-cni-plugins-dnkgc"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.201254 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.201272 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3aa44429-873b-48f6-bc13-55745827d8fa-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dnkgc\" (UID: \"3aa44429-873b-48f6-bc13-55745827d8fa\") " pod="openshift-multus/multus-additional-cni-plugins-dnkgc"
Dec 02 09:20:57 crc kubenswrapper[4781]: E1202 09:20:57.201282 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.201289 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3aa44429-873b-48f6-bc13-55745827d8fa-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dnkgc\" (UID: \"3aa44429-873b-48f6-bc13-55745827d8fa\") " pod="openshift-multus/multus-additional-cni-plugins-dnkgc"
Dec 02 09:20:57 crc kubenswrapper[4781]: E1202 09:20:57.201295 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 02 09:20:57 crc kubenswrapper[4781]: E1202 09:20:57.201306 4781 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 02 09:20:57 crc kubenswrapper[4781]: E1202 09:20:57.201331 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 09:20:59.201322881 +0000 UTC m=+22.025196760 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.201305 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e10258da-dad3-4df8-82c2-9d9438493a3d-mcd-auth-proxy-config\") pod \"machine-config-daemon-pzntm\" (UID: \"e10258da-dad3-4df8-82c2-9d9438493a3d\") " pod="openshift-machine-config-operator/machine-config-daemon-pzntm"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.201364 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.201382 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e10258da-dad3-4df8-82c2-9d9438493a3d-rootfs\") pod \"machine-config-daemon-pzntm\" (UID: \"e10258da-dad3-4df8-82c2-9d9438493a3d\") " pod="openshift-machine-config-operator/machine-config-daemon-pzntm"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.201401 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e10258da-dad3-4df8-82c2-9d9438493a3d-proxy-tls\") pod \"machine-config-daemon-pzntm\" (UID: \"e10258da-dad3-4df8-82c2-9d9438493a3d\") " pod="openshift-machine-config-operator/machine-config-daemon-pzntm"
Dec 02 09:20:57 crc kubenswrapper[4781]: E1202 09:20:57.201428 4781 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Dec 02 09:20:57 crc kubenswrapper[4781]: E1202 09:20:57.201427 4781 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 02 09:20:57 crc kubenswrapper[4781]: E1202 09:20:57.201467 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 09:20:59.201460366 +0000 UTC m=+22.025334245 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 02 09:20:57 crc kubenswrapper[4781]: E1202 09:20:57.201480 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 09:20:59.201472986 +0000 UTC m=+22.025346865 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.214659 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.224909 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.239268 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dnkgc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aa44429-873b-48f6-bc13-55745827d8fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dnkgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.251202 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e10258da-dad3-4df8-82c2-9d9438493a3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pzntm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.264974 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c464aa1e840c3a97b63b6c760fca6c21e7a182269a602a688c70525b4bbb39a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.275793 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a3a44e4812364947f0dd8eed3d3f7252d6e37e5057b9f180eca5b8afc472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9866ea1416094cbfe3f1e850261118f82b1d977589862e7f59e13b33fc24295a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.301989 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3aa44429-873b-48f6-bc13-55745827d8fa-system-cni-dir\") pod \"multus-additional-cni-plugins-dnkgc\" (UID: \"3aa44429-873b-48f6-bc13-55745827d8fa\") " pod="openshift-multus/multus-additional-cni-plugins-dnkgc"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.302032 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3aa44429-873b-48f6-bc13-55745827d8fa-cni-binary-copy\") pod \"multus-additional-cni-plugins-dnkgc\" (UID: \"3aa44429-873b-48f6-bc13-55745827d8fa\") " pod="openshift-multus/multus-additional-cni-plugins-dnkgc"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.302062 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3aa44429-873b-48f6-bc13-55745827d8fa-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dnkgc\" (UID: \"3aa44429-873b-48f6-bc13-55745827d8fa\") " pod="openshift-multus/multus-additional-cni-plugins-dnkgc"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.302089 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3aa44429-873b-48f6-bc13-55745827d8fa-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dnkgc\" (UID: \"3aa44429-873b-48f6-bc13-55745827d8fa\") " pod="openshift-multus/multus-additional-cni-plugins-dnkgc"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.302121 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e10258da-dad3-4df8-82c2-9d9438493a3d-mcd-auth-proxy-config\") pod \"machine-config-daemon-pzntm\" (UID: \"e10258da-dad3-4df8-82c2-9d9438493a3d\") " pod="openshift-machine-config-operator/machine-config-daemon-pzntm"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.302130 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3aa44429-873b-48f6-bc13-55745827d8fa-system-cni-dir\") pod \"multus-additional-cni-plugins-dnkgc\" (UID: \"3aa44429-873b-48f6-bc13-55745827d8fa\") " pod="openshift-multus/multus-additional-cni-plugins-dnkgc"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.302143 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e10258da-dad3-4df8-82c2-9d9438493a3d-rootfs\") pod \"machine-config-daemon-pzntm\" (UID: \"e10258da-dad3-4df8-82c2-9d9438493a3d\") " pod="openshift-machine-config-operator/machine-config-daemon-pzntm"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.302171 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e10258da-dad3-4df8-82c2-9d9438493a3d-proxy-tls\") pod \"machine-config-daemon-pzntm\" (UID: \"e10258da-dad3-4df8-82c2-9d9438493a3d\") " pod="openshift-machine-config-operator/machine-config-daemon-pzntm"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.302192 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3aa44429-873b-48f6-bc13-55745827d8fa-cnibin\") pod \"multus-additional-cni-plugins-dnkgc\" (UID: \"3aa44429-873b-48f6-bc13-55745827d8fa\") " pod="openshift-multus/multus-additional-cni-plugins-dnkgc"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.302221 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3aa44429-873b-48f6-bc13-55745827d8fa-os-release\") pod \"multus-additional-cni-plugins-dnkgc\" (UID: \"3aa44429-873b-48f6-bc13-55745827d8fa\") " pod="openshift-multus/multus-additional-cni-plugins-dnkgc"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.302239 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9svf\" (UniqueName: \"kubernetes.io/projected/3aa44429-873b-48f6-bc13-55745827d8fa-kube-api-access-q9svf\") pod \"multus-additional-cni-plugins-dnkgc\" (UID: \"3aa44429-873b-48f6-bc13-55745827d8fa\") " pod="openshift-multus/multus-additional-cni-plugins-dnkgc"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.302268 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bph4w\" (UniqueName: \"kubernetes.io/projected/e10258da-dad3-4df8-82c2-9d9438493a3d-kube-api-access-bph4w\") pod \"machine-config-daemon-pzntm\" (UID: \"e10258da-dad3-4df8-82c2-9d9438493a3d\") " pod="openshift-machine-config-operator/machine-config-daemon-pzntm"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.302326 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3aa44429-873b-48f6-bc13-55745827d8fa-cnibin\") pod \"multus-additional-cni-plugins-dnkgc\" (UID: \"3aa44429-873b-48f6-bc13-55745827d8fa\") " pod="openshift-multus/multus-additional-cni-plugins-dnkgc"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.302362 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e10258da-dad3-4df8-82c2-9d9438493a3d-rootfs\") pod \"machine-config-daemon-pzntm\" (UID: \"e10258da-dad3-4df8-82c2-9d9438493a3d\") " pod="openshift-machine-config-operator/machine-config-daemon-pzntm"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.302552 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3aa44429-873b-48f6-bc13-55745827d8fa-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dnkgc\" (UID: \"3aa44429-873b-48f6-bc13-55745827d8fa\") " pod="openshift-multus/multus-additional-cni-plugins-dnkgc"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.302599 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3aa44429-873b-48f6-bc13-55745827d8fa-os-release\") pod \"multus-additional-cni-plugins-dnkgc\" (UID: \"3aa44429-873b-48f6-bc13-55745827d8fa\") " pod="openshift-multus/multus-additional-cni-plugins-dnkgc"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.302837 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3aa44429-873b-48f6-bc13-55745827d8fa-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dnkgc\" (UID: \"3aa44429-873b-48f6-bc13-55745827d8fa\") " pod="openshift-multus/multus-additional-cni-plugins-dnkgc"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.302845 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3aa44429-873b-48f6-bc13-55745827d8fa-cni-binary-copy\") pod \"multus-additional-cni-plugins-dnkgc\" (UID: \"3aa44429-873b-48f6-bc13-55745827d8fa\") " pod="openshift-multus/multus-additional-cni-plugins-dnkgc"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.302912 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e10258da-dad3-4df8-82c2-9d9438493a3d-mcd-auth-proxy-config\") pod \"machine-config-daemon-pzntm\" (UID: \"e10258da-dad3-4df8-82c2-9d9438493a3d\") " pod="openshift-machine-config-operator/machine-config-daemon-pzntm"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.306038 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e10258da-dad3-4df8-82c2-9d9438493a3d-proxy-tls\") pod \"machine-config-daemon-pzntm\" (UID: \"e10258da-dad3-4df8-82c2-9d9438493a3d\") " pod="openshift-machine-config-operator/machine-config-daemon-pzntm"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.315561 4781 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Dec 02 09:20:57 crc kubenswrapper[4781]: W1202 09:20:57.315727 4781 reflector.go:484] object-"openshift-dns"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-dns"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received
Dec 02 09:20:57 crc kubenswrapper[4781]: W1202 09:20:57.315770 4781 reflector.go:484] object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq": watch of *v1.Secret ended with: very short watch: object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq": Unexpected watch close - watch lasted less than a second and no items received
Dec 02 09:20:57 crc kubenswrapper[4781]: W1202 09:20:57.315839 4781 reflector.go:484] object-"openshift-multus"/"default-cni-sysctl-allowlist": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-multus"/"default-cni-sysctl-allowlist": Unexpected watch close - watch lasted less than a second and no items received
Dec 02 09:20:57 crc kubenswrapper[4781]: W1202 09:20:57.315837 4781 reflector.go:484] object-"openshift-dns"/"node-resolver-dockercfg-kz9s7": watch of *v1.Secret ended with: very short watch: object-"openshift-dns"/"node-resolver-dockercfg-kz9s7": Unexpected watch close - watch lasted less than a second and no items received
Dec 02 09:20:57 crc kubenswrapper[4781]: W1202 09:20:57.315866 4781 reflector.go:484] object-"openshift-machine-config-operator"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-machine-config-operator"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received
Dec 02 09:20:57 crc kubenswrapper[4781]: W1202 09:20:57.315866 4781 reflector.go:484] object-"openshift-multus"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-multus"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received
Dec 02 09:20:57 crc kubenswrapper[4781]: W1202 09:20:57.315881 4781 reflector.go:484] object-"openshift-machine-config-operator"/"kube-rbac-proxy": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-machine-config-operator"/"kube-rbac-proxy": Unexpected watch close - watch lasted less than a second and no items received
Dec 02 09:20:57 crc kubenswrapper[4781]: W1202 09:20:57.315883 4781 reflector.go:484] object-"openshift-machine-config-operator"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-machine-config-operator"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received
Dec 02 09:20:57 crc kubenswrapper[4781]: W1202 09:20:57.315727 4781 reflector.go:484] object-"openshift-multus"/"cni-copy-resources": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-multus"/"cni-copy-resources": Unexpected watch close - watch lasted less than a second and no items received
Dec 02 09:20:57 crc kubenswrapper[4781]: W1202 09:20:57.315966 4781 reflector.go:484] object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz": watch of *v1.Secret ended with: very short watch: object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz": Unexpected watch close - watch lasted less than a second and no items received
Dec 02 09:20:57 crc kubenswrapper[4781]: W1202 09:20:57.315980 4781 reflector.go:484] object-"openshift-multus"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-multus"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received
Dec 02 09:20:57 crc kubenswrapper[4781]: W1202 09:20:57.316012 4781 reflector.go:484] object-"openshift-dns"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-dns"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received
Dec 02 09:20:57 crc kubenswrapper[4781]: W1202 09:20:57.316043 4781 reflector.go:484] object-"openshift-machine-config-operator"/"proxy-tls": watch of *v1.Secret ended with: very short watch: object-"openshift-machine-config-operator"/"proxy-tls": Unexpected watch close - watch lasted less than a second and no items received
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.316187 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9svf\" (UniqueName: \"kubernetes.io/projected/3aa44429-873b-48f6-bc13-55745827d8fa-kube-api-access-q9svf\") pod \"multus-additional-cni-plugins-dnkgc\" (UID: \"3aa44429-873b-48f6-bc13-55745827d8fa\") " pod="openshift-multus/multus-additional-cni-plugins-dnkgc"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.319876 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bph4w\" (UniqueName: \"kubernetes.io/projected/e10258da-dad3-4df8-82c2-9d9438493a3d-kube-api-access-bph4w\") pod \"machine-config-daemon-pzntm\" (UID: \"e10258da-dad3-4df8-82c2-9d9438493a3d\") " pod="openshift-machine-config-operator/machine-config-daemon-pzntm"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.328463 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-dnkgc"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.335664 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-pzntm"
Dec 02 09:20:57 crc kubenswrapper[4781]: W1202 09:20:57.337913 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3aa44429_873b_48f6_bc13_55745827d8fa.slice/crio-4ab1add8dae69032f5231cf93729c1ea4f60759f17f6b5adbec4248f8637ba20 WatchSource:0}: Error finding container 4ab1add8dae69032f5231cf93729c1ea4f60759f17f6b5adbec4248f8637ba20: Status 404 returned error can't find the container with id 4ab1add8dae69032f5231cf93729c1ea4f60759f17f6b5adbec4248f8637ba20
Dec 02 09:20:57 crc kubenswrapper[4781]: W1202 09:20:57.345221 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode10258da_dad3_4df8_82c2_9d9438493a3d.slice/crio-4481851fe1614067281daa531c6e06d1fb8c1fee202b64a27fe727ca749dbfdc WatchSource:0}: Error finding container 4481851fe1614067281daa531c6e06d1fb8c1fee202b64a27fe727ca749dbfdc: Status 404 returned error can't find the container with id 4481851fe1614067281daa531c6e06d1fb8c1fee202b64a27fe727ca749dbfdc
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.400658 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-8b6p8"]
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.401058 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-8b6p8"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.401883 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-x5x7g"]
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.402450 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.403239 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.403437 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.404483 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.406133 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.406192 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.406286 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.406349 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.406403 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.406940 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.415873 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c464aa1e840c3a97b63b6c760fca6c21e7a182269a602a688c70525b4bbb39a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z"
Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.429063 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a3a44e4812364947f0dd8eed3d3f7252d6e37e5057b9f180eca5b8afc472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9866ea1416094cbfe3f1e850261118f82b1d977589862e7f59e13b33fc24295a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.439480 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e10258da-dad3-4df8-82c2-9d9438493a3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pzntm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.449399 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kgbn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9440f972-ed59-4852-a180-3d5a2111f966\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hxs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kgbn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.459663 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.470145 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.482147 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8157f9-9d69-41d0-8158-f5d166743724\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a36c467812da990b1678a8465bf25d97c4d8226e88ed27eea452b35a75c5dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c09074e0829703890129b7052956f56e1a93cad8d1bcf9e480b3a7277ae930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f99f962bf9ffb2475be15e4fe8d41c3fe3c870d092a50b96a19fe9c759462ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc9e488e47baf2860cd0133d661714b47495618ce410ee8ad0fa9292da3cfdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.495999 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.498933 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:20:57 crc kubenswrapper[4781]: E1202 09:20:57.499016 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.503108 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-host-slash\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.503136 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-systemd-units\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.503153 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-log-socket\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.503171 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-os-release\") pod \"multus-8b6p8\" (UID: \"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.503188 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-multus-cni-dir\") pod \"multus-8b6p8\" (UID: \"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.503391 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-cnibin\") pod \"multus-8b6p8\" (UID: \"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.503496 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-multus-conf-dir\") pod \"multus-8b6p8\" (UID: \"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.503535 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-etc-kubernetes\") pod \"multus-8b6p8\" (UID: \"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.503559 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-ovnkube-script-lib\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.503609 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-var-lib-openvswitch\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.503775 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-host-var-lib-cni-bin\") pod \"multus-8b6p8\" (UID: \"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.503956 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-host-run-netns\") pod \"multus-8b6p8\" (UID: \"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.504034 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-hostroot\") pod \"multus-8b6p8\" (UID: \"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.504062 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-host-run-multus-certs\") pod \"multus-8b6p8\" (UID: \"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.504090 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-host-run-netns\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.504120 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.504156 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-cni-binary-copy\") pod \"multus-8b6p8\" (UID: \"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.504279 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-host-run-k8s-cni-cncf-io\") pod 
\"multus-8b6p8\" (UID: \"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.504306 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxbk6\" (UniqueName: \"kubernetes.io/projected/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-kube-api-access-zxbk6\") pod \"multus-8b6p8\" (UID: \"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.504333 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-ovnkube-config\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.504354 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-env-overrides\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.504376 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-ovn-node-metrics-cert\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.504423 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-host-cni-netd\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.504466 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-multus-socket-dir-parent\") pod \"multus-8b6p8\" (UID: \"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.504490 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-node-log\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.504510 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-host-run-ovn-kubernetes\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.504548 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-run-systemd\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.504568 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-etc-openvswitch\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.504599 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-run-ovn\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.504623 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxmlf\" (UniqueName: \"kubernetes.io/projected/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-kube-api-access-mxmlf\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.504708 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-run-openvswitch\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.504747 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-multus-daemon-config\") pod \"multus-8b6p8\" (UID: \"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.504776 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-system-cni-dir\") pod \"multus-8b6p8\" (UID: \"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.504802 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-host-var-lib-cni-multus\") pod \"multus-8b6p8\" (UID: \"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.504838 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-host-kubelet\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.504861 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-host-cni-bin\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.505034 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-host-var-lib-kubelet\") pod \"multus-8b6p8\" (UID: \"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.514126 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.514330 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.515578 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.518827 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.519755 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.520819 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.521402 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.522822 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.523459 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.524540 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.525075 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.526018 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.526644 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.527621 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.528255 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.528992 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.529527 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.530902 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.531412 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.532208 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dnkgc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aa44429-873b-48f6-bc13-55745827d8fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceac
count\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dnkgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.532633 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.533419 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.533977 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.535083 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.535552 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.536593 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.537314 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.538613 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.539464 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.539992 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.541294 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.541768 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.542270 4781 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.542720 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.544003 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8b6p8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxbk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8b6p8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.544385 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.544913 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.545363 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.547114 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.548447 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.549086 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.550253 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.551237 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.552336 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.553391 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.554455 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.555269 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.558306 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.558880 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.559933 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.560481 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.561904 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.563050 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.564081 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.564663 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.565778 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.567068 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.567067 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.567705 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.580488 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8157f9-9d69-41d0-8158-f5d166743724\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a36c467812da990b1678a8465bf25d97c4d8226e88ed27eea452b35a75c5dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c09074e0829703890129b7052956f56e1a93cad8d1bcf9e480b3a7277ae930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06
bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f99f962bf9ffb2475be15e4fe8d41c3fe3c870d092a50b96a19fe9c759462ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc9e488e47baf2860cd0133d661714b47495618ce410ee8ad0fa9292da3cfdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.593934 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.603944 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-kgbn2" event={"ID":"9440f972-ed59-4852-a180-3d5a2111f966","Type":"ContainerStarted","Data":"7eda6361552bc105ca2a6cc3444d96fbf79b10eff612e87aa22c1559fa8c90fa"} Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.603986 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-kgbn2" event={"ID":"9440f972-ed59-4852-a180-3d5a2111f966","Type":"ContainerStarted","Data":"b5f88a8e48f68555ab8d82e1626ecd0ccd276fb296e657d89552fd43692fc433"} Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.605721 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" event={"ID":"e10258da-dad3-4df8-82c2-9d9438493a3d","Type":"ContainerStarted","Data":"1bb3264562e09d949d9dcd64e52ee1fc0825681adc0d91f25505fdefd0a807ab"} Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.605766 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" event={"ID":"e10258da-dad3-4df8-82c2-9d9438493a3d","Type":"ContainerStarted","Data":"5c19e02f947d0d75971d14d988b7c3119365b6ba5348290ee0e23e115e14780f"} Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.605780 4781 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" event={"ID":"e10258da-dad3-4df8-82c2-9d9438493a3d","Type":"ContainerStarted","Data":"4481851fe1614067281daa531c6e06d1fb8c1fee202b64a27fe727ca749dbfdc"} Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.606376 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-ovn-node-metrics-cert\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.606405 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-cni-binary-copy\") pod \"multus-8b6p8\" (UID: \"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.606424 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-host-run-k8s-cni-cncf-io\") pod \"multus-8b6p8\" (UID: \"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.606439 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxbk6\" (UniqueName: \"kubernetes.io/projected/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-kube-api-access-zxbk6\") pod \"multus-8b6p8\" (UID: \"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.606453 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-ovnkube-config\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.606466 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-env-overrides\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.606482 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-host-cni-netd\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.606496 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-host-run-ovn-kubernetes\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.606519 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-multus-socket-dir-parent\") pod \"multus-8b6p8\" (UID: \"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.606534 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-node-log\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.606549 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-run-systemd\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.606563 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-etc-openvswitch\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.606579 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-run-ovn\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.606593 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxmlf\" (UniqueName: \"kubernetes.io/projected/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-kube-api-access-mxmlf\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.606607 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-run-openvswitch\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.606628 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-system-cni-dir\") pod \"multus-8b6p8\" (UID: \"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.606642 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-host-var-lib-cni-multus\") pod \"multus-8b6p8\" (UID: \"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.606658 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-multus-daemon-config\") pod \"multus-8b6p8\" (UID: 
\"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.606675 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-host-var-lib-kubelet\") pod \"multus-8b6p8\" (UID: \"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.606690 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-host-kubelet\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.606704 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-host-cni-bin\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.606718 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-host-slash\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.606733 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-os-release\") pod \"multus-8b6p8\" (UID: \"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.606746 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-systemd-units\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.606761 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-log-socket\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.606776 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-multus-cni-dir\") pod \"multus-8b6p8\" (UID: \"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.606789 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-cnibin\") pod \"multus-8b6p8\" (UID: \"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.606802 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-multus-conf-dir\") pod \"multus-8b6p8\" (UID: \"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.606815 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-etc-kubernetes\") pod \"multus-8b6p8\" (UID: \"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.606829 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-ovnkube-script-lib\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.606846 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-var-lib-openvswitch\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.606860 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-host-var-lib-cni-bin\") pod \"multus-8b6p8\" (UID: \"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.606881 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-host-run-netns\") pod \"multus-8b6p8\" (UID: \"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.606897 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-hostroot\") pod \"multus-8b6p8\" (UID: \"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.606915 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-host-run-multus-certs\") pod \"multus-8b6p8\" (UID: \"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.606963 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-host-run-netns\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.606978 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-host-var-lib-cni-networks-ovn-kubernetes\") pod 
\"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.607032 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.607064 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-host-cni-netd\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.607091 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-host-run-ovn-kubernetes\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.607125 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-multus-socket-dir-parent\") pod \"multus-8b6p8\" (UID: \"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.607147 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-node-log\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.607146 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-env-overrides\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.607168 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-run-systemd\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.607190 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-etc-openvswitch\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.607219 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-run-ovn\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 
09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.607484 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-run-openvswitch\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.607520 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-system-cni-dir\") pod \"multus-8b6p8\" (UID: \"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.607543 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-host-var-lib-cni-multus\") pod \"multus-8b6p8\" (UID: \"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.608133 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-multus-daemon-config\") pod \"multus-8b6p8\" (UID: \"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.608180 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-host-var-lib-kubelet\") pod \"multus-8b6p8\" (UID: \"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.608212 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-host-kubelet\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.608243 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-host-cni-bin\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.608276 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-host-slash\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.608327 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-os-release\") pod \"multus-8b6p8\" (UID: \"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.608359 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-systemd-units\") pod 
\"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.608423 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-log-socket\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.608583 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-multus-cni-dir\") pod \"multus-8b6p8\" (UID: \"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.608631 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-cnibin\") pod \"multus-8b6p8\" (UID: \"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.608662 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-multus-conf-dir\") pod \"multus-8b6p8\" (UID: \"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.608690 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-etc-kubernetes\") pod \"multus-8b6p8\" (UID: \"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.609283 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-ovnkube-script-lib\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.609329 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-var-lib-openvswitch\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.609352 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-host-var-lib-cni-bin\") pod \"multus-8b6p8\" (UID: \"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.609372 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-host-run-netns\") pod \"multus-8b6p8\" (UID: \"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.609394 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" 
(UniqueName: \"kubernetes.io/host-path/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-hostroot\") pod \"multus-8b6p8\" (UID: \"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.609414 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-host-run-multus-certs\") pod \"multus-8b6p8\" (UID: \"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.609434 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-host-run-netns\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.609613 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.609669 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-host-run-k8s-cni-cncf-io\") pod \"multus-8b6p8\" (UID: \"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.609880 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-ovn-node-metrics-cert\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.610316 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-ovnkube-config\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.610363 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-cni-binary-copy\") pod \"multus-8b6p8\" (UID: \"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.611020 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dnkgc" event={"ID":"3aa44429-873b-48f6-bc13-55745827d8fa","Type":"ContainerStarted","Data":"66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b"} Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.611053 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dnkgc" event={"ID":"3aa44429-873b-48f6-bc13-55745827d8fa","Type":"ContainerStarted","Data":"4ab1add8dae69032f5231cf93729c1ea4f60759f17f6b5adbec4248f8637ba20"} Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.616992 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 
09:20:57.617510 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.619321 4781 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0" exitCode=255 Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.619351 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0"} Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.619393 4781 scope.go:117] "RemoveContainer" containerID="1e4ac81bbe6c15c4abc298de57f6a660b436587517adc7398afad02d8a189766" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.627241 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dnkgc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aa44429-873b-48f6-bc13-55745827d8fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dnkgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.629907 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.630559 4781 scope.go:117] "RemoveContainer" containerID="e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0" Dec 02 09:20:57 crc kubenswrapper[4781]: E1202 09:20:57.630871 4781 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.631268 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxbk6\" (UniqueName: \"kubernetes.io/projected/d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650-kube-api-access-zxbk6\") pod \"multus-8b6p8\" (UID: \"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\") " pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.631635 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxmlf\" (UniqueName: \"kubernetes.io/projected/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-kube-api-access-mxmlf\") pod \"ovnkube-node-x5x7g\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.639529 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8b6p8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxbk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8b6p8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.651024 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e10258da-dad3-4df8-82c2-9d9438493a3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pzntm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.668678 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c464aa1e840c3a97b63b6c760fca6c21e7a182269a602a688c70525b4bbb39a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.679691 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a3a44e4812364947f0dd8eed3d3f7252d6e37e5057b9f180eca5b8afc472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9866ea1416094cbfe3f1e850261118f82b1d977589862e7f59e13b33fc24295a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.689521 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kgbn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9440f972-ed59-4852-a180-3d5a2111f966\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hxs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kgbn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.708260 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts
\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host
-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5x7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.717240 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e10258da-dad3-4df8-82c2-9d9438493a3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb3264562e09d949d9dcd64e52ee1fc0825681adc0d91f25505fdefd0a807ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c19e02f947d0d75971d14d988b7c3119365b6ba5348290ee0e23e115e14780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pzntm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.728742 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c464aa1e840c3a97b63b6c760fca6c21e7a182269a602a688c70525b4bbb39a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.730795 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-8b6p8" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.739360 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.742648 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a3a44e4812364947f0dd8eed3d3f7252d6e37e5057b9f180eca5b8afc472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9866ea1416094cbfe3f1e850261118f82b1d977589862e7f59e13b33fc24295a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:57 crc kubenswrapper[4781]: W1202 09:20:57.749530 4781 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20ba2af9_1f67_4b6d_884a_666ef4f55bf3.slice/crio-b8eacffb0f7f6a79c2f334742927c660a6811b6721ff20d186ed2a5024c1209c WatchSource:0}: Error finding container b8eacffb0f7f6a79c2f334742927c660a6811b6721ff20d186ed2a5024c1209c: Status 404 returned error can't find the container with id b8eacffb0f7f6a79c2f334742927c660a6811b6721ff20d186ed2a5024c1209c Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.755627 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba607959-82d4-440b-b830-95eb9584db6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08c7d87bc390c31adc6d7412985f5bbfe535ea6dccb586f9be2823a17bce3e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0090e6856a6d66e284f4f4eecb2d193720318f33a0260d470a50af72c84f6719\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0179b61af209d72b1e1a79835f2819b3af773ab99b0c431733beba8a4dcf2eeb\\\",\\\"image\\\":\\\"quay.io/crcont/opens
hift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4ac81bbe6c15c4abc298de57f6a660b436587517adc7398afad02d8a189766\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T09:20:50Z\\\",\\\"message\\\":\\\"W1202 09:20:39.613282 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1202 09:20:39.613911 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764667239 cert, and key in /tmp/serving-cert-3143056395/serving-signer.crt, /tmp/serving-cert-3143056395/serving-signer.key\\\\nI1202 09:20:39.835710 1 observer_polling.go:159] Starting file observer\\\\nW1202 09:20:39.838856 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1202 09:20:39.839029 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 09:20:39.840998 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3143056395/tls.crt::/tmp/serving-cert-3143056395/tls.key\\\\\\\"\\\\nF1202 09:20:50.088161 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 09:20:56.232387 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 09:20:56.232507 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 09:20:56.233123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3642890392/tls.crt::/tmp/serving-cert-3642890392/tls.key\\\\\\\"\\\\nI1202 09:20:56.583994 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 09:20:56.586003 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 09:20:56.586021 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 09:20:56.586041 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 09:20:56.586046 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 09:20:56.590021 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 09:20:56.590050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590054 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 09:20:56.590065 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 09:20:56.590068 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 09:20:56.590073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 09:20:56.590248 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 09:20:56.592236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a99d096eb7cf9386a857100ae0faff8a3b17140eaada581e5511a88aadc0f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.766041 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kgbn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9440f972-ed59-4852-a180-3d5a2111f966\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eda6361552bc105ca2a6cc3444d96fbf79b10eff612e87aa22c1559fa8c90fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hxs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kgbn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.785012 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5x7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.801398 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.814118 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.827471 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8157f9-9d69-41d0-8158-f5d166743724\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a36c467812da990b1678a8465bf25d97c4d8226e88ed27eea452b35a75c5dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c09074e0829703890129b7052956f56e1a93cad8d1bcf9e480b3a7277ae930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f99f962bf9ffb2475be15e4fe8d41c3fe3c870d092a50b96a19fe9c759462ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc9e488e47baf2860cd0133d661714b47495618ce410ee8ad0fa9292da3cfdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.848517 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.888601 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.932711 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dnkgc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aa44429-873b-48f6-bc13-55745827d8fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"}
,{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dnkgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:57 crc kubenswrapper[4781]: I1202 09:20:57.969509 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8b6p8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxbk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8b6p8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.010371 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8b6p8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxbk6\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8b6p8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:58Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.053751 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8157f9-9d69-41d0-8158-f5d166743724\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a36c467812da990b1678a8465bf25d97c4d8226e88ed27eea452b35a75c5dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c09074e0829703890129b7052956f56e1a93cad8d1bcf9e480b3a7277ae930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f99f962bf9ffb2475be15e4fe8d41c3fe3c870d092a50b96a19fe9c759462ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@
sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc9e488e47baf2860cd0133d661714b47495618ce410ee8ad0fa9292da3cfdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:58Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.090817 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:58Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.128906 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:58Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.177180 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dnkgc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aa44429-873b-48f6-bc13-55745827d8fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"}
,{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dnkgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:58Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.181912 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 
09:20:58.232688 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a3a44e4812364947f0dd8eed3d3f7252d6e37e5057b9f180eca5b8afc472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9866ea1416094cbfe3f1e850261118f82b1d977589862e7f59e13b33fc24295a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:58Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.257358 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.262154 4781 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.267634 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.289944 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e10258da-dad3-4df8-82c2-9d9438493a3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb3264562e09d949d9dcd64e52ee1fc0825681adc0d91f25505fdefd0a807ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c19e02f947d0d75971d14d988b7c3119365b6ba5348290ee0e23e115e14780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-pzntm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:58Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.317142 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.323337 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.362013 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.401120 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c464aa1e840c3a97b63b6c760fca6c21e7a182269a602a688c70525b4bbb39a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:58Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.442438 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.446399 4781 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba607959-82d4-440b-b830-95eb9584db6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08c7d87bc390c31adc6d7412985f5bbfe535ea6dccb586f9be2823a17bce3e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0090e6856a6d66e284f4f4eecb2d193720318f33a0260d470a50af72c84f6719\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0179b61af209d72b1e1a79835f2819b3af773ab99b0c431733beba8a4dcf2eeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\
"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4ac81bbe6c15c4abc298de57f6a660b436587517adc7398afad02d8a189766\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T09:20:50Z\\\",\\\"message\\\":\\\"W1202 09:20:39.613282 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1202 09:20:39.613911 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764667239 cert, and key in /tmp/serving-cert-3143056395/serving-signer.crt, /tmp/serving-cert-3143056395/serving-signer.key\\\\nI1202 09:20:39.835710 1 observer_polling.go:159] Starting file observer\\\\nW1202 09:20:39.838856 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1202 09:20:39.839029 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 09:20:39.840998 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3143056395/tls.crt::/tmp/serving-cert-3143056395/tls.key\\\\\\\"\\\\nF1202 09:20:50.088161 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 09:20:56.232387 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 09:20:56.232507 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 09:20:56.233123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3642890392/tls.crt::/tmp/serving-cert-3642890392/tls.key\\\\\\\"\\\\nI1202 09:20:56.583994 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 09:20:56.586003 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 09:20:56.586021 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 09:20:56.586041 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 
09:20:56.586046 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 09:20:56.590021 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 09:20:56.590050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590054 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 09:20:56.590065 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 09:20:56.590068 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 09:20:56.590073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 09:20:56.590248 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 09:20:56.592236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a99d096eb7cf9386a857100ae0faff8a3b17140eaada581e5511a88aadc0f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-12-02T09:20:58Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.474136 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.476059 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.476089 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.476101 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.476193 4781 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.487564 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kgbn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9440f972-ed59-4852-a180-3d5a2111f966\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eda6361552bc105ca2a6cc3444d96fbf79b10eff612e87aa22c1559fa8c90fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hxs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kgbn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:58Z is after 2025-08-24T17:21:41Z" Dec 02 
09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.498774 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:20:58 crc kubenswrapper[4781]: E1202 09:20:58.498913 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.498911 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 09:20:58 crc kubenswrapper[4781]: E1202 09:20:58.499014 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.542444 4781 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.542750 4781 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.543987 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.544022 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.544033 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.544048 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.544059 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:20:58Z","lastTransitionTime":"2025-12-02T09:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:20:58 crc kubenswrapper[4781]: E1202 09:20:58.559948 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a8aab02d-934e-4a61-8d03-a223ac62150b\\\",\\\"systemUUID\\\":\\\"c8f9d245-0a2e-447a-a09e-ff80f79ba02f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:58Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.563458 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.563498 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.563506 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.563521 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.563530 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:20:58Z","lastTransitionTime":"2025-12-02T09:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:20:58 crc kubenswrapper[4781]: E1202 09:20:58.576284 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a8aab02d-934e-4a61-8d03-a223ac62150b\\\",\\\"systemUUID\\\":\\\"c8f9d245-0a2e-447a-a09e-ff80f79ba02f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:58Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.577799 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5x7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:58Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.579178 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.579204 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.579215 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.579231 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.579242 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:20:58Z","lastTransitionTime":"2025-12-02T09:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.580916 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 02 09:20:58 crc kubenswrapper[4781]: E1202 09:20:58.590063 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a8aab02d-934e-4a61-8d03-a223ac62150b\\\",\\\"systemUUID\\\":\\\"c8f9d245-0a2e-447a-a09e-ff80f79ba02f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:58Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.593128 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.593154 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.593162 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.593174 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.593183 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:20:58Z","lastTransitionTime":"2025-12-02T09:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.601825 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 02 09:20:58 crc kubenswrapper[4781]: E1202 09:20:58.603687 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a8aab02d-934e-4a61-8d03-a223ac62150b\\\",\\\"systemUUID\\\":\\\"c8f9d245-0a2e-447a-a09e-ff80f79ba02f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:58Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.606328 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.606364 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.606375 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.606391 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.606405 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:20:58Z","lastTransitionTime":"2025-12-02T09:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:20:58 crc kubenswrapper[4781]: E1202 09:20:58.621100 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a8aab02d-934e-4a61-8d03-a223ac62150b\\\",\\\"systemUUID\\\":\\\"c8f9d245-0a2e-447a-a09e-ff80f79ba02f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:58Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:58 crc kubenswrapper[4781]: E1202 09:20:58.621215 4781 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.622364 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.622391 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.622399 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.622414 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.622422 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:20:58Z","lastTransitionTime":"2025-12-02T09:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.623262 4781 generic.go:334] "Generic (PLEG): container finished" podID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerID="b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9" exitCode=0 Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.623281 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" event={"ID":"20ba2af9-1f67-4b6d-884a-666ef4f55bf3","Type":"ContainerDied","Data":"b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9"} Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.623311 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" event={"ID":"20ba2af9-1f67-4b6d-884a-666ef4f55bf3","Type":"ContainerStarted","Data":"b8eacffb0f7f6a79c2f334742927c660a6811b6721ff20d186ed2a5024c1209c"} Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.624966 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.627777 4781 scope.go:117] "RemoveContainer" containerID="e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0" Dec 02 09:20:58 crc kubenswrapper[4781]: E1202 09:20:58.628114 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.628123 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8b6p8" event={"ID":"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650","Type":"ContainerStarted","Data":"9bc3d340494b10e89f11eb2dfc63584a9c5660dfe367792a3d422a28dbf5a2e9"} Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.628350 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8b6p8" event={"ID":"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650","Type":"ContainerStarted","Data":"1148f0e0db916e1c90c6eb14aa67abbb4af03c208337feef68eb487ce7a23126"} Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.629074 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"822d2ca8c4fb83c643f704e9acc8e4034235d2d1e4730a9495aa0a6a8df9979e"} Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.630238 4781 generic.go:334] "Generic (PLEG): container finished" podID="3aa44429-873b-48f6-bc13-55745827d8fa" containerID="66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b" exitCode=0 Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.630371 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dnkgc" event={"ID":"3aa44429-873b-48f6-bc13-55745827d8fa","Type":"ContainerDied","Data":"66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b"} Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.651915 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:58Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.689306 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:58Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.703430 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.722540 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.725873 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.726095 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.726217 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.726357 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.726479 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:20:58Z","lastTransitionTime":"2025-12-02T09:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.763348 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.793874 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:58Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.801483 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.820677 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.839280 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.839328 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.839343 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.839364 4781 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.839375 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:20:58Z","lastTransitionTime":"2025-12-02T09:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.869370 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:58Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.880873 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.929102 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8157f9-9d69-41d0-8158-f5d166743724\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a36c467812da990b1678a8465bf25d97c4d8226e88ed27eea452b35a75c5dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c09074e0829703890129b7052956f56e1a93cad8d1bcf9e480b3a7277ae930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"
lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f99f962bf9ffb2475be15e4fe8d41c3fe3c870d092a50b96a19fe9c759462ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc9e488e47baf2860cd0133d661714b47495618ce410ee8ad0fa9292da3cfdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:58Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.942695 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.942733 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.942745 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.942762 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 
09:20:58.942775 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:20:58Z","lastTransitionTime":"2025-12-02T09:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:20:58 crc kubenswrapper[4781]: I1202 09:20:58.970059 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:58Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.017897 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822d2ca8c4fb83c643f704e9acc8e4034235d2d1e4730a9495aa0a6a8df9979e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:59Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.045718 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.045758 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.045768 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.045782 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.045792 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:20:59Z","lastTransitionTime":"2025-12-02T09:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.054143 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dnkgc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aa44429-873b-48f6-bc13-55745827d8fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dnkgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:59Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:59 crc 
kubenswrapper[4781]: I1202 09:20:59.094824 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8b6p8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bc3d340494b10e89f11eb2dfc63584a9c5660dfe367792a3d422a28dbf5a2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxbk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\
"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8b6p8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:59Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.139516 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa24d58e-64e7-4469-9611-3dab9c142594\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7af0caf9bd11c42c48bbf4dcd4e719840d2e2dbce55ec205dd2cc103b71772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7035e3038f33a299c00380bc79d4315ec612bd6f4009d55258f02b5c3ec70e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://274dd3f5c42e9217f2a35c99709567f617ef1def1cbd5110cfdc8b5e88bce7ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"
,\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8beb17df1dddccb32b1d93f03fd20551048bc1d150615e18bf73c1d364d7bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1936331547e3941bdaae5c90f12874b6fda39f3b6b67f678156915473cc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:59Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.147466 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.147512 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.147525 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.147542 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.147553 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:20:59Z","lastTransitionTime":"2025-12-02T09:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.172065 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c464aa1e840c3a97b63b6c760fca6c21e7a182269a602a688c70525b4bbb39a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:59Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.210512 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a3a44e4812364947f0dd8eed3d3f7252d6e37e5057b9f180eca5b8afc472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9866ea1416094cbfe3f1e850261118f82b1d977589862e7f59e13b33fc24295a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:59Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.222185 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.222296 4781 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:20:59 crc kubenswrapper[4781]: E1202 09:20:59.222362 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:21:03.222335242 +0000 UTC m=+26.046209121 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:20:59 crc kubenswrapper[4781]: E1202 09:20:59.222398 4781 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.222443 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:20:59 crc kubenswrapper[4781]: E1202 09:20:59.222447 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 09:21:03.222433375 +0000 UTC m=+26.046307254 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.222483 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.222508 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:20:59 crc kubenswrapper[4781]: E1202 09:20:59.222587 4781 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 09:20:59 crc kubenswrapper[4781]: E1202 09:20:59.222615 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 09:20:59 crc kubenswrapper[4781]: E1202 09:20:59.222647 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 09:20:59 crc kubenswrapper[4781]: E1202 09:20:59.222659 4781 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 09:20:59 crc kubenswrapper[4781]: E1202 09:20:59.222615 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 09:20:59 crc kubenswrapper[4781]: E1202 09:20:59.222723 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 09:20:59 crc kubenswrapper[4781]: E1202 09:20:59.222736 4781 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 09:20:59 crc kubenswrapper[4781]: E1202 09:20:59.222627 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 09:21:03.222620861 +0000 UTC m=+26.046494740 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 09:20:59 crc kubenswrapper[4781]: E1202 09:20:59.222791 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 09:21:03.222774107 +0000 UTC m=+26.046648056 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 09:20:59 crc kubenswrapper[4781]: E1202 09:20:59.222804 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 09:21:03.222797728 +0000 UTC m=+26.046671707 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.250819 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.250879 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.250894 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.250914 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.250952 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:20:59Z","lastTransitionTime":"2025-12-02T09:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.255863 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e10258da-dad3-4df8-82c2-9d9438493a3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb3264562e09d949d9dcd64e52ee1fc0825681adc0d91f25505fdefd0a807ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c19e02f947d0d75971d14d988b7c3119365b6ba5348290ee0e23e115e14780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pzntm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:59Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.292296 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba607959-82d4-440b-b830-95eb9584db6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08c7d87bc390c31adc6d7412985f5bbfe535ea6dccb586f9be2823a17bce3e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0090e6856a6d66e284f4f4eecb2d193720318f33a0260d470a50af72c84f6719\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0179b61af209d72b1e1a79835f2819b3af773ab99b0c431733beba8a4dcf2eeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2
7753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 09:20:56.232387 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 09:20:56.232507 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 09:20:56.233123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3642890392/tls.crt::/tmp/serving-cert-3642890392/tls.key\\\\\\\"\\\\nI1202 09:20:56.583994 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 09:20:56.586003 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 09:20:56.586021 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 09:20:56.586041 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 09:20:56.586046 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 09:20:56.590021 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 09:20:56.590050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590054 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 09:20:56.590065 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 09:20:56.590068 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 09:20:56.590073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 09:20:56.590248 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 09:20:56.592236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a99d096eb7cf9386a857100ae0faff8a3b17140eaada581e5511a88aadc0f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:59Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.327879 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kgbn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9440f972-ed59-4852-a180-3d5a2111f966\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eda6361552bc105ca2a6cc3444d96fbf79b10eff612e87aa22c1559fa8c90fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hxs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kgbn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:59Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.353385 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.353437 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.353455 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.353472 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.353483 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:20:59Z","lastTransitionTime":"2025-12-02T09:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.373770 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"
ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"c
ri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5x7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:59Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.455720 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.455772 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.455783 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.455804 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.455816 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:20:59Z","lastTransitionTime":"2025-12-02T09:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.499369 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:20:59 crc kubenswrapper[4781]: E1202 09:20:59.499497 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.558706 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.558770 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.558785 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.558809 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.558821 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:20:59Z","lastTransitionTime":"2025-12-02T09:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.617893 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-ffk9s"] Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.618472 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-ffk9s" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.620680 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.621738 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.622248 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.622518 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.628845 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.634055 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822d2ca8c4fb83c643f704e9acc8e4034235d2d1e4730a9495aa0a6a8df9979e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:59Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.636403 4781 generic.go:334] "Generic (PLEG): container finished" podID="3aa44429-873b-48f6-bc13-55745827d8fa" containerID="0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c" exitCode=0 Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.636523 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dnkgc" event={"ID":"3aa44429-873b-48f6-bc13-55745827d8fa","Type":"ContainerDied","Data":"0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c"} Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.643609 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" event={"ID":"20ba2af9-1f67-4b6d-884a-666ef4f55bf3","Type":"ContainerStarted","Data":"e5c85414a97917ad2db6270144c476b16b50ad3256bf6c694c228b8756c379dd"} Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.643686 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" event={"ID":"20ba2af9-1f67-4b6d-884a-666ef4f55bf3","Type":"ContainerStarted","Data":"a6086fa7a75ccf19fcc7b89e83a2dbe12ed15c729bb8a4209c18acd54c2070a1"} Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.643707 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" 
event={"ID":"20ba2af9-1f67-4b6d-884a-666ef4f55bf3","Type":"ContainerStarted","Data":"7c8c9c60e0354d49255e6b14da6808152f6528f0a51ce992709a7014b9f64d61"} Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.643722 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" event={"ID":"20ba2af9-1f67-4b6d-884a-666ef4f55bf3","Type":"ContainerStarted","Data":"506fd3b326fdaebe64874c208bb15c9cbf5aff36602f221f27f87fe0214ce6f7"} Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.643735 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" event={"ID":"20ba2af9-1f67-4b6d-884a-666ef4f55bf3","Type":"ContainerStarted","Data":"865e78637e3692046c9c8a24f5238eb9636bc7e388cea6e0fec3c5c68d21cf91"} Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.643750 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" event={"ID":"20ba2af9-1f67-4b6d-884a-666ef4f55bf3","Type":"ContainerStarted","Data":"458448bab8d0e21295e859a857d3a3273cc4f430f2fc0fdfb3e10b9a9ab813ec"} Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.644469 4781 scope.go:117] "RemoveContainer" containerID="e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0" Dec 02 09:20:59 crc kubenswrapper[4781]: E1202 09:20:59.644617 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.662394 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.662451 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.662469 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.662497 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.662515 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:20:59Z","lastTransitionTime":"2025-12-02T09:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.670524 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dnkgc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aa44429-873b-48f6-bc13-55745827d8fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets
/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dnkgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:59Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.691472 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8b6p8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bc3d340494b10e89f11eb2dfc63584a9c5660dfe367792a3d422a28dbf5a2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"m
ultus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxbk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8b6p8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:59Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.708289 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8157f9-9d69-41d0-8158-f5d166743724\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a36c467812da990b1678a8465bf25d97c4d8226e88ed27eea452b35a75c5dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c09074e0829703890129b7052956f56e1a93cad8d1bcf9e480b3a7277ae930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f99f962bf9ffb2475be15e4fe8d41c3fe3c870d092a50b96a19fe9c759462ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc9e488e47baf2860cd0133d661714b47495618ce410ee8ad0fa9292da3cfdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:59Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.723363 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:59Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.728646 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/04fe11ee-efe6-4b10-a638-021e53367e2d-host\") pod \"node-ca-ffk9s\" (UID: \"04fe11ee-efe6-4b10-a638-021e53367e2d\") " pod="openshift-image-registry/node-ca-ffk9s" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.728676 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/04fe11ee-efe6-4b10-a638-021e53367e2d-serviceca\") pod \"node-ca-ffk9s\" (UID: \"04fe11ee-efe6-4b10-a638-021e53367e2d\") " pod="openshift-image-registry/node-ca-ffk9s" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.728758 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46c6d\" (UniqueName: \"kubernetes.io/projected/04fe11ee-efe6-4b10-a638-021e53367e2d-kube-api-access-46c6d\") pod \"node-ca-ffk9s\" (UID: \"04fe11ee-efe6-4b10-a638-021e53367e2d\") " pod="openshift-image-registry/node-ca-ffk9s" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.735286 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c464aa1e840c3a97b63b6c760fca6c21e7a182269a602a688c70525b4bbb39a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:59Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.749084 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a3a44e4812364947f0dd8eed3d3f7252d6e37e5057b9f180eca5b8afc472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9866ea1416094cbfe3f1e850261118f82b1d977589862e7f59e13b33fc24295a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:59Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.765036 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.765084 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.765099 4781 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.765119 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.765135 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:20:59Z","lastTransitionTime":"2025-12-02T09:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.770223 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e10258da-dad3-4df8-82c2-9d9438493a3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb3264562e09d949d9dcd64e52ee1fc0825681adc0d91f25505fdefd0a807ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c19e02f947d0d75971d14d988b7c3119365b6ba5348290ee0e23e115e14780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pzntm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:59Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.808508 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ffk9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04fe11ee-efe6-4b10-a638-021e53367e2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46c6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ffk9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:59Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:59 crc 
kubenswrapper[4781]: I1202 09:20:59.829225 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46c6d\" (UniqueName: \"kubernetes.io/projected/04fe11ee-efe6-4b10-a638-021e53367e2d-kube-api-access-46c6d\") pod \"node-ca-ffk9s\" (UID: \"04fe11ee-efe6-4b10-a638-021e53367e2d\") " pod="openshift-image-registry/node-ca-ffk9s" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.829263 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/04fe11ee-efe6-4b10-a638-021e53367e2d-host\") pod \"node-ca-ffk9s\" (UID: \"04fe11ee-efe6-4b10-a638-021e53367e2d\") " pod="openshift-image-registry/node-ca-ffk9s" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.829278 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/04fe11ee-efe6-4b10-a638-021e53367e2d-serviceca\") pod \"node-ca-ffk9s\" (UID: \"04fe11ee-efe6-4b10-a638-021e53367e2d\") " pod="openshift-image-registry/node-ca-ffk9s" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.829625 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/04fe11ee-efe6-4b10-a638-021e53367e2d-host\") pod \"node-ca-ffk9s\" (UID: \"04fe11ee-efe6-4b10-a638-021e53367e2d\") " pod="openshift-image-registry/node-ca-ffk9s" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.830488 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/04fe11ee-efe6-4b10-a638-021e53367e2d-serviceca\") pod \"node-ca-ffk9s\" (UID: \"04fe11ee-efe6-4b10-a638-021e53367e2d\") " pod="openshift-image-registry/node-ca-ffk9s" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.858033 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa24d58e-64e7-4469-9611-3dab9c142594\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7af0caf9bd11c42c48bbf4dcd4e719840d2e2dbce55ec205dd2cc103b71772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7035e3038f33a299c00380bc79d4315ec612bd6f4009d55258f02b5c3ec70e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://274dd3f5c42e9217f2a35c99709567f617ef1def1cbd5110cfdc8b5e88bce7ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8beb17df1dddccb32b1d93f03fd20551048bc1
d150615e18bf73c1d364d7bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1936331547e3941bdaae5c90f12874b6fda39f3b6b67f678156915473cc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:59Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.867480 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.867518 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.867526 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.867540 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.867560 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:20:59Z","lastTransitionTime":"2025-12-02T09:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.879005 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46c6d\" (UniqueName: \"kubernetes.io/projected/04fe11ee-efe6-4b10-a638-021e53367e2d-kube-api-access-46c6d\") pod \"node-ca-ffk9s\" (UID: \"04fe11ee-efe6-4b10-a638-021e53367e2d\") " pod="openshift-image-registry/node-ca-ffk9s" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.916089 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5x7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:59Z 
is after 2025-08-24T17:21:41Z" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.944218 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-ffk9s" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.954671 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba607959-82d4-440b-b830-95eb9584db6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08c7d87bc390c31adc6d7412985f5bbfe535ea6dccb586f9be2823a17bce3e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0090e6856a6d66e284f4f4eecb2d193720318f33a0260d470a50af72c84f6719\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0179b61af209d72b1e1a79835f2819b3af773ab99b0c431733beba8a4dcf2eeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operato
r@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 09:20:56.232387 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 09:20:56.232507 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 09:20:56.233123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3642890392/tls.crt::/tmp/serving-cert-3642890392/tls.key\\\\\\\"\\\\nI1202 09:20:56.583994 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 09:20:56.586003 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 09:20:56.586021 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 09:20:56.586041 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 09:20:56.586046 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 09:20:56.590021 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 09:20:56.590050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590054 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 09:20:56.590065 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 09:20:56.590068 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 09:20:56.590073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 09:20:56.590248 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 09:20:56.592236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting 
failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a99d096eb7cf9386a857100ae0faff8a3b17140eaada581e5511a88aadc0f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:59Z is after 2025-08-24T17:21:41Z" Dec 02 09:20:59 crc kubenswrapper[4781]: W1202 09:20:59.957618 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04fe11ee_efe6_4b10_a638_021e53367e2d.slice/crio-1d764afaa1212a4c87d43c46bd0223afd21cfa17d38e97d8303fc6a463acadfa WatchSource:0}: Error finding container 1d764afaa1212a4c87d43c46bd0223afd21cfa17d38e97d8303fc6a463acadfa: Status 404 returned error can't find the container with id 1d764afaa1212a4c87d43c46bd0223afd21cfa17d38e97d8303fc6a463acadfa Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.970718 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.970768 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.970800 4781 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.970823 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.970841 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:20:59Z","lastTransitionTime":"2025-12-02T09:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:20:59 crc kubenswrapper[4781]: I1202 09:20:59.986219 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kgbn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9440f972-ed59-4852-a180-3d5a2111f966\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eda6361552bc105ca2a6cc3444d96fbf79b10eff612e87aa22c1559fa8c90fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hxs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kgbn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:20:59Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.029386 4781 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:00Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.072539 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:00Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.073517 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.073550 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.073559 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.073572 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.073581 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:00Z","lastTransitionTime":"2025-12-02T09:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.119407 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa24d58e-64e7-4469-9611-3dab9c142594\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7af0caf9bd11c42c48bbf4dcd4e719840d2e2dbce55ec205dd2cc103b71772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7035e3038f33a299c00380bc79d4315ec612bd6f4009d55258f02b5c3ec70e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://274dd3f5c42e9217f2a35c99709567f617ef1def1cbd5110cfdc8b5e88bce7ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8beb17df1dddccb32b1d93f03fd20551048bc1d150615e18bf73c1d364d7bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1936331547e3941bdaae5c90f12874b6fda39f3b6b67f678156915473cc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:00Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.165761 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c464aa1e840c3a97b63b6c760fca6c21e7a182269a602a688c70525b4bbb39a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:00Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.176047 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.176082 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.176092 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.176107 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.176118 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:00Z","lastTransitionTime":"2025-12-02T09:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.194024 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a3a44e4812364947f0dd8eed3d3f7252d6e37e5057b9f180eca5b8afc472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9866ea1416094cbfe3f1e850261118f82b1d977589862e7f59e13b33fc24295a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:00Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.229673 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e10258da-dad3-4df8-82c2-9d9438493a3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb3264562e09d949d9dcd64e52ee1fc0825681adc0d91f25505fdefd0a807ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c19e02f947d0d75971d14d988b7c3119365b6ba5348290ee0e23e115e14780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pzntm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:00Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.268884 4781 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-ffk9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04fe11ee-efe6-4b10-a638-021e53367e2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46c6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ffk9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:00Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.278417 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.278449 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.278459 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.278475 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.278487 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:00Z","lastTransitionTime":"2025-12-02T09:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.309802 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba607959-82d4-440b-b830-95eb9584db6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08c7d87bc390c31adc6d7412985f5bbfe535ea6dccb586f9be2823a17bce3e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0090e6856a6d66e284f4f4eecb2d193720318f33a0260d470a50af72c84f6719\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0179b61af209d72b1e1a79835f2819b3af773ab99b0c431733beba8a4dcf2eeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apis
erver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 09:20:56.232387 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 09:20:56.232507 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 09:20:56.233123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3642890392/tls.crt::/tmp/serving-cert-3642890392/tls.key\\\\\\\"\\\\nI1202 09:20:56.583994 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 09:20:56.586003 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 09:20:56.586021 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 09:20:56.586041 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 09:20:56.586046 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 09:20:56.590021 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 09:20:56.590050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590054 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 09:20:56.590065 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 09:20:56.590068 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 09:20:56.590073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 09:20:56.590248 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 09:20:56.592236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a99d096eb7cf9386a857100ae0faff8a3b17140eaada581e5511a88aadc0f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:00Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.351068 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kgbn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9440f972-ed59-4852-a180-3d5a2111f966\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eda6361552bc105ca2a6cc3444d96fbf79b10eff612e87aa22c1559fa8c90fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hxs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kgbn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:00Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.381555 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.381607 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.381619 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.381635 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.381646 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:00Z","lastTransitionTime":"2025-12-02T09:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.394495 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"
ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"c
ri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5x7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:00Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.429478 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:00Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.469563 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:00Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.483152 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.483197 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.483209 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.483226 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.483240 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:00Z","lastTransitionTime":"2025-12-02T09:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.499511 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.499513 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 09:21:00 crc kubenswrapper[4781]: E1202 09:21:00.499684 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 09:21:00 crc kubenswrapper[4781]: E1202 09:21:00.499622 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.510284 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8157f9-9d69-41d0-8158-f5d166743724\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a36c467812da990b1678a8465bf25d97c4d8226e88ed27eea452b35a75c5dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c09074e0829703890129b7052956f56e1a93cad8d1bcf9e480b3a7277ae930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f99f962bf9ffb2475be15e4fe8d41c3fe3c870d092a50b96a19fe9c759462ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/k
ubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc9e488e47baf2860cd0133d661714b47495618ce410ee8ad0fa9292da3cfdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:00Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.550368 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:00Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.585449 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.585477 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.585485 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.585498 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.585508 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:00Z","lastTransitionTime":"2025-12-02T09:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.590331 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822d2ca8c4fb83c643f704e9acc8e4034235d2d1e4730a9495aa0a6a8df9979e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:00Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.631392 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dnkgc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aa44429-873b-48f6-bc13-55745827d8fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/r
un/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dnkgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:00Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.646993 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ffk9s" event={"ID":"04fe11ee-efe6-4b10-a638-021e53367e2d","Type":"ContainerStarted","Data":"2b6a45fdfe3335bc38eb30705eca6aaf887b5cd9fac235b5350832bff6df8cb4"} Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.647043 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ffk9s" event={"ID":"04fe11ee-efe6-4b10-a638-021e53367e2d","Type":"ContainerStarted","Data":"1d764afaa1212a4c87d43c46bd0223afd21cfa17d38e97d8303fc6a463acadfa"} Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.648788 4781 generic.go:334] "Generic (PLEG): container finished" podID="3aa44429-873b-48f6-bc13-55745827d8fa" containerID="0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f" exitCode=0 Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.648822 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dnkgc" event={"ID":"3aa44429-873b-48f6-bc13-55745827d8fa","Type":"ContainerDied","Data":"0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f"} Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.672522 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8b6p8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bc3d340494b10e89f11eb2dfc63584a9c5660dfe367792a3d422a28dbf5a2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxbk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8b6p8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:00Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.687550 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.687587 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.687600 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.687616 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.687627 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:00Z","lastTransitionTime":"2025-12-02T09:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.709853 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:00Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.749811 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:00Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.790230 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.790272 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.790283 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.790321 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.790332 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:00Z","lastTransitionTime":"2025-12-02T09:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.790476 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:00Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.830365 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822d2ca8c4fb83c643f704e9acc8e4034235d2d1e4730a9495aa0a6a8df9979e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:00Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.871188 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dnkgc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aa44429-873b-48f6-bc13-55745827d8fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dnkgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:00Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.892809 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.892853 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.892865 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.892879 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.892890 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:00Z","lastTransitionTime":"2025-12-02T09:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.910035 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8b6p8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bc3d340494b10e89f11eb2dfc63584a9c5660dfe367792a3d422a28dbf5a2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxbk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8b6p8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:00Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.950043 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8157f9-9d69-41d0-8158-f5d166743724\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a36c467812da990b1678a8465bf25d97c4d8226e88ed27eea452b35a75c5dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c09074e0829703890129b7052956f56e1a93cad8d1bcf9e480b3a7277ae930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f99f962bf9ffb2475be15e4fe8d41c3fe3c870d092a50b96a19fe9c759462ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-oper
ator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc9e488e47baf2860cd0133d661714b47495618ce410ee8ad0fa9292da3cfdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:00Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.989540 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c464aa1e840c3a97b63b6c760fca6c21e7a182269a602a688c70525b4bbb39a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:00Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.995300 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.995347 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.995358 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.995374 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:00 crc kubenswrapper[4781]: I1202 09:21:00.995387 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:00Z","lastTransitionTime":"2025-12-02T09:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.029402 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a3a44e4812364947f0dd8eed3d3f7252d6e37e5057b9f180eca5b8afc472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9866ea1416094cbfe3f1e850261118f82b1d977589862e7f59e13b33fc24295a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:01Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.071157 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e10258da-dad3-4df8-82c2-9d9438493a3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb3264562e09d949d9dcd64e52ee1fc0825681adc0d91f25505fdefd0a807ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c19e02f947d0d75971d14d988b7c3119365b6ba5348290ee0e23e115e14780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pzntm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:01Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.098249 4781 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.098354 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.098372 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.098430 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.098448 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:01Z","lastTransitionTime":"2025-12-02T09:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.109015 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ffk9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04fe11ee-efe6-4b10-a638-021e53367e2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6a45fdfe3335bc38eb30705eca6aaf887b5cd9fac235b5350832bff6df8cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46c6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ffk9s\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:01Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.157447 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa24d58e-64e7-4469-9611-3dab9c142594\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7af0caf9bd11c42c48bbf4dcd4e719840d2e2dbce55ec205dd2cc103b71772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7035e3038f33a299c00380bc79d4315ec612bd6f4009d55258f02b5c3ec70e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://274dd3f5c42e9217f2a35c99709567f617ef1def1cbd5110cfdc8b5e88bce7ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8beb17df1dddccb32b1d93f03fd20551048bc1d150615e18bf73c1d364d7bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1936331547e3941bdaae5c90f12874b6fda39f3b6b67f678156915473cc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:01Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.188466 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kgbn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9440f972-ed59-4852-a180-3d5a2111f966\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eda6361552bc105ca2a6cc3444d96fbf79b10eff612e87aa22c1559fa8c90fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hxs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kgbn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:01Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.201603 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.201637 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.201648 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.201661 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.201672 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:01Z","lastTransitionTime":"2025-12-02T09:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.235253 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"
ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"c
ri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5x7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:01Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.276827 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba607959-82d4-440b-b830-95eb9584db6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08c7d87bc390c31adc6d7412985f5bbfe535ea6dccb586f9be2823a17bce3e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0090e6856a6d66e284f4f4eecb2d193720318f33a0260d470a50af72c84f6719\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0179b61af209d72b1e1a79835f2819b3af773ab99b0c431733beba8a4dcf2eeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 09:20:56.232387 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 09:20:56.232507 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 09:20:56.233123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3642890392/tls.crt::/tmp/serving-cert-3642890392/tls.key\\\\\\\"\\\\nI1202 09:20:56.583994 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 09:20:56.586003 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 09:20:56.586021 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 09:20:56.586041 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 09:20:56.586046 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 09:20:56.590021 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 09:20:56.590050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590054 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 09:20:56.590065 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 09:20:56.590068 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 09:20:56.590073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 09:20:56.590248 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 09:20:56.592236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a99d096eb7cf9386a857100ae0faff8a3b17140eaada581e5511a88aadc0f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:01Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.304370 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.304405 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.304413 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.304426 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.304435 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:01Z","lastTransitionTime":"2025-12-02T09:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.406678 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.406720 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.406729 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.406743 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.406752 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:01Z","lastTransitionTime":"2025-12-02T09:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.499072 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:21:01 crc kubenswrapper[4781]: E1202 09:21:01.499219 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.508718 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.508758 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.508767 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.508779 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.508789 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:01Z","lastTransitionTime":"2025-12-02T09:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.610736 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.610793 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.610817 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.610839 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.610855 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:01Z","lastTransitionTime":"2025-12-02T09:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.653232 4781 generic.go:334] "Generic (PLEG): container finished" podID="3aa44429-873b-48f6-bc13-55745827d8fa" containerID="a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4" exitCode=0 Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.653272 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dnkgc" event={"ID":"3aa44429-873b-48f6-bc13-55745827d8fa","Type":"ContainerDied","Data":"a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4"} Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.656554 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" event={"ID":"20ba2af9-1f67-4b6d-884a-666ef4f55bf3","Type":"ContainerStarted","Data":"17cbd533b4320d178a3ee1bb3d666d5a1ad0a1d7e9f38e58735acac86c590726"} Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.674074 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba607959-82d4-440b-b830-95eb9584db6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08c7d87bc390c31adc6d7412985f5bbfe535ea6dccb586f9be2823a17bce3e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0090e6856a6d66e284f4f4eecb2d193720318f33a0260d470a50af72c84f6719\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0179b61af209d72b1e1a79835f2819b3af773ab99b0c431733beba8a4dcf2eeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 09:20:56.232387 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 09:20:56.232507 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 09:20:56.233123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3642890392/tls.crt::/tmp/serving-cert-3642890392/tls.key\\\\\\\"\\\\nI1202 09:20:56.583994 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 09:20:56.586003 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 09:20:56.586021 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 09:20:56.586041 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 09:20:56.586046 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 09:20:56.590021 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 09:20:56.590050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590054 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 09:20:56.590065 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 09:20:56.590068 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 09:20:56.590073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 09:20:56.590248 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 09:20:56.592236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a99d096eb7cf9386a857100ae0faff8a3b17140eaada581e5511a88aadc0f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:01Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.686136 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kgbn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9440f972-ed59-4852-a180-3d5a2111f966\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eda6361552bc105ca2a6cc3444d96fbf79b10eff612e87aa22c1559fa8c90fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hxs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kgbn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:01Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.710025 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5x7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:01Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.712579 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.712642 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.712657 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.712678 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.712693 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:01Z","lastTransitionTime":"2025-12-02T09:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.723303 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:01Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.735771 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:01Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.748589 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8157f9-9d69-41d0-8158-f5d166743724\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a36c467812da990b1678a8465bf25d97c4d8226e88ed27eea452b35a75c5dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c09074e0829703890129b7052956f56e1a93cad8d1bcf9e480b3a7277ae930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f99f962bf9ffb2475be15e4fe8d41c3fe3c870d092a50b96a19fe9c759462ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc9e488e47baf2860cd0133d661714b47495618ce410ee8ad0fa9292da3cfdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:01Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.760123 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:01Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.773580 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822d2ca8c4fb83c643f704e9acc8e4034235d2d1e4730a9495aa0a6a8df9979e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:01Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.790446 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dnkgc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aa44429-873b-48f6-bc13-55745827d8fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dnkgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:01Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.803994 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8b6p8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bc3d340494b10e89f11eb2dfc63584a9c5660dfe367792a3d422a28dbf5a2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8
s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxbk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8b6p8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:01Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.814990 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.815029 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.815041 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.815061 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.815073 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:01Z","lastTransitionTime":"2025-12-02T09:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.821484 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa24d58e-64e7-4469-9611-3dab9c142594\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7af0caf9bd11c42c48bbf4dcd4e719840d2e2dbce55ec205dd2cc103b71772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7035e3038f33a299c00380bc79d4315ec612bd6f4009d55258f02b5c3ec70e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://274dd3f5c42e9217f2a35c99709567f617ef1def1cbd5110cfdc8b5e88bce7ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8beb17df1dddccb32b1d93f03fd20551048bc1d150615e18bf73c1d364d7bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1936331547e3941bdaae5c90f12874b6fda39f3b6b67f678156915473cc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:01Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.833083 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c464aa1e840c3a97b63b6c760fca6c21e7a182269a602a688c70525b4bbb39a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:01Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.844823 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a3a44e4812364947f0dd8eed3d3f7252d6e37e5057b9f180eca5b8afc472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9866ea1416094cbfe3f1e850261118f82b1d977589862e7f59e13b33fc24295a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:01Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.855345 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e10258da-dad3-4df8-82c2-9d9438493a3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb3264562e09d949d9dcd64e52ee1fc0825681adc0d91f25505fdefd0a807ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c19e02f947d0d75971d14d988b7c3119365b6ba5348290ee0e23e115e14780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pzntm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:01Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.872748 4781 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-ffk9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04fe11ee-efe6-4b10-a638-021e53367e2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6a45fdfe3335bc38eb30705eca6aaf887b5cd9fac235b5350832bff6df8cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46c6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ffk9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:01Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.918514 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.918571 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.918583 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.918601 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:01 crc kubenswrapper[4781]: I1202 09:21:01.918613 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:01Z","lastTransitionTime":"2025-12-02T09:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.021040 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.021084 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.021101 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.021118 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.021133 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:02Z","lastTransitionTime":"2025-12-02T09:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.123343 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.123707 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.123716 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.123735 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.123745 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:02Z","lastTransitionTime":"2025-12-02T09:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.226266 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.226327 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.226344 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.226366 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.226383 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:02Z","lastTransitionTime":"2025-12-02T09:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.329283 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.329350 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.329365 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.329395 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.329412 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:02Z","lastTransitionTime":"2025-12-02T09:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.432614 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.432658 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.432670 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.432687 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.432699 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:02Z","lastTransitionTime":"2025-12-02T09:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.499000 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:21:02 crc kubenswrapper[4781]: E1202 09:21:02.499266 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.500160 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 09:21:02 crc kubenswrapper[4781]: E1202 09:21:02.500319 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.534830 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.534918 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.534963 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.534984 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.534999 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:02Z","lastTransitionTime":"2025-12-02T09:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.637467 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.637527 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.637548 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.637575 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.637598 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:02Z","lastTransitionTime":"2025-12-02T09:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.662149 4781 generic.go:334] "Generic (PLEG): container finished" podID="3aa44429-873b-48f6-bc13-55745827d8fa" containerID="ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be" exitCode=0 Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.662202 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dnkgc" event={"ID":"3aa44429-873b-48f6-bc13-55745827d8fa","Type":"ContainerDied","Data":"ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be"} Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.675117 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:02Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.689008 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:02Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.706067 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8b6p8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bc3d340494b10e89f11eb2dfc63584a9c5660dfe367792a3d422a28dbf5a2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxbk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8b6p8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:02Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.723039 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8157f9-9d69-41d0-8158-f5d166743724\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a36c467812da990b1678a8465bf25d97c4d8226e88ed27eea452b35a75c5dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c09074e0829703890129b7052956f56e1a93cad8d1bcf9e480b3a7277ae930\\\",\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f99f962bf9ffb2475be15e4fe8d41c3fe3c870d092a50b96a19fe9c759462ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc9e488e47baf2860cd0133d661714b47495618ce410ee8ad0fa9292da3cfdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:02Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.740242 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:02Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.740506 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.740544 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.740559 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.740575 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.740585 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:02Z","lastTransitionTime":"2025-12-02T09:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.751945 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822d2ca8c4fb83c643f704e9acc8e4034235d2d1e4730a9495aa0a6a8df9979e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:02Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.766359 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dnkgc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aa44429-873b-48f6-bc13-55745827d8fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea7
0781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dnkgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:02Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.782477 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a3a44e4812364947f0dd8eed3d3f7252d6e37e5057b9f180eca5b8afc472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9866ea1416094cbfe3f1e850261118f82b1d977589862e7f59e13b33fc24295a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:02Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.798750 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e10258da-dad3-4df8-82c2-9d9438493a3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb3264562e09d949d9dcd64e52ee1fc0825681adc0d91f25505fdefd0a807ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c19e02f947d0d75971d14d988b7c3119365b6ba5348290ee0e23e115e14780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pzntm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:02Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.812888 4781 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-ffk9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04fe11ee-efe6-4b10-a638-021e53367e2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6a45fdfe3335bc38eb30705eca6aaf887b5cd9fac235b5350832bff6df8cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46c6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ffk9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:02Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.833503 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa24d58e-64e7-4469-9611-3dab9c142594\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7af0caf9bd11c42c48bbf4dcd4e719840d2e2dbce55ec205dd2cc103b71772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7035e3038f33a299c00380bc79d4315ec612bd6f4009d55258f02b5c3ec70e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://274dd3f5c42e9217f2a35c99709567f617ef1def1cbd5110cfdc8b5e88bce7ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8beb17df1dddccb32b1d93f03fd20551048bc1
d150615e18bf73c1d364d7bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1936331547e3941bdaae5c90f12874b6fda39f3b6b67f678156915473cc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:02Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.843066 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.843144 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.843163 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.843194 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.843211 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:02Z","lastTransitionTime":"2025-12-02T09:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.851091 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c464aa1e840c3a97b63b6c760fca6c21e7a182269a602a688c70525b4bbb39a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:02Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.868052 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba607959-82d4-440b-b830-95eb9584db6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08c7d87bc390c31adc6d7412985f5bbfe535ea6dccb586f9be2823a17bce3e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0090e6856a6d66e284f4f4eecb2d193720318f33a0260d470a50af72c84f6719\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0179b61af209d72b1e1a79835f2819b3af773ab99b0c431733beba8a4dcf2eeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 09:20:56.232387 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 09:20:56.232507 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 09:20:56.233123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3642890392/tls.crt::/tmp/serving-cert-3642890392/tls.key\\\\\\\"\\\\nI1202 09:20:56.583994 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 09:20:56.586003 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 09:20:56.586021 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 09:20:56.586041 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 09:20:56.586046 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 09:20:56.590021 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 09:20:56.590050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590054 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 09:20:56.590065 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 09:20:56.590068 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 09:20:56.590073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 09:20:56.590248 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 09:20:56.592236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a99d096eb7cf9386a857100ae0faff8a3b17140eaada581e5511a88aadc0f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:02Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.883972 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kgbn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9440f972-ed59-4852-a180-3d5a2111f966\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eda6361552bc105ca2a6cc3444d96fbf79b10eff612e87aa22c1559fa8c90fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hxs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kgbn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:02Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.914331 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5x7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:02Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.946238 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.946304 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.946320 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.946348 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:02 crc kubenswrapper[4781]: I1202 09:21:02.946364 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:02Z","lastTransitionTime":"2025-12-02T09:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.049161 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.049209 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.049226 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.049248 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.049263 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:03Z","lastTransitionTime":"2025-12-02T09:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.152559 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.152602 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.152613 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.152629 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.152641 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:03Z","lastTransitionTime":"2025-12-02T09:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.255764 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.255829 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.255846 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.255873 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.255891 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:03Z","lastTransitionTime":"2025-12-02T09:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.264825 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:21:03 crc kubenswrapper[4781]: E1202 09:21:03.265129 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:21:11.26509669 +0000 UTC m=+34.088970579 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.265245 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.265340 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.265415 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.265490 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 09:21:03 crc kubenswrapper[4781]: E1202 09:21:03.265424 4781 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 09:21:03 crc kubenswrapper[4781]: E1202 09:21:03.265550 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 09:21:03 crc kubenswrapper[4781]: E1202 09:21:03.265569 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 09:21:03 crc kubenswrapper[4781]: E1202 09:21:03.265581 4781 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 09:21:03 crc kubenswrapper[4781]: E1202 09:21:03.265514 4781 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 09:21:03 crc 
kubenswrapper[4781]: E1202 09:21:03.265605 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 09:21:11.265584036 +0000 UTC m=+34.089457995 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 09:21:03 crc kubenswrapper[4781]: E1202 09:21:03.265626 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 09:21:11.265616967 +0000 UTC m=+34.089490976 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 09:21:03 crc kubenswrapper[4781]: E1202 09:21:03.265646 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 09:21:11.265637878 +0000 UTC m=+34.089511887 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 09:21:03 crc kubenswrapper[4781]: E1202 09:21:03.265697 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 09:21:03 crc kubenswrapper[4781]: E1202 09:21:03.265738 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 09:21:03 crc kubenswrapper[4781]: E1202 09:21:03.265763 4781 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 09:21:03 crc kubenswrapper[4781]: E1202 09:21:03.265863 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 09:21:11.265831974 +0000 UTC m=+34.089705893 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.358453 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.358504 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.358514 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.358531 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.358543 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:03Z","lastTransitionTime":"2025-12-02T09:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.461196 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.461230 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.461238 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.461250 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.461259 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:03Z","lastTransitionTime":"2025-12-02T09:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.498846 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:21:03 crc kubenswrapper[4781]: E1202 09:21:03.498981 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.563503 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.563540 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.563551 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.563567 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.563576 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:03Z","lastTransitionTime":"2025-12-02T09:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.667405 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.667472 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.667486 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.667507 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.667521 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:03Z","lastTransitionTime":"2025-12-02T09:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.670468 4781 generic.go:334] "Generic (PLEG): container finished" podID="3aa44429-873b-48f6-bc13-55745827d8fa" containerID="ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca" exitCode=0 Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.670520 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dnkgc" event={"ID":"3aa44429-873b-48f6-bc13-55745827d8fa","Type":"ContainerDied","Data":"ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca"} Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.688407 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:03Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.703410 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:03Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.716139 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822d2ca8c4fb83c643f704e9acc8e4034235d2d1e4730a9495aa0a6a8df9979e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:03Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.730823 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dnkgc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aa44429-873b-48f6-bc13-55745827d8fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dnkgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:03Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.744894 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8b6p8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bc3d340494b10e89f11eb2dfc63584a9c5660dfe367792a3d422a28dbf5a2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxbk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8b6p8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:03Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.758530 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8157f9-9d69-41d0-8158-f5d166743724\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a36c467812da990b1678a8465bf25d97c4d8226e88ed27eea452b35a75c5dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c09074e0829703890129b7052956f56e1a93cad8d1bcf9e480b3a7277ae930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f99f962bf9ffb2475be15e4fe8d41c3fe3c870d092a50b96a19fe9c759462ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc9e488e47baf2860cd0133d661714b47495618ce410ee8ad0fa9292da3cfdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:03Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.769511 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.769574 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.769587 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.769630 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.769648 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:03Z","lastTransitionTime":"2025-12-02T09:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.774833 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:03Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.792404 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c464aa1e840c3a97b63b6c760fca6c21e7a182269a602a688c70525b4bbb39a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:03Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.806624 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a3a44e4812364947f0dd8eed3d3f7252d6e37e5057b9f180eca5b8afc472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9866ea1416094cbfe3f1e850261118f82b1d977589862e7f59e13b33fc24295a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:03Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.818727 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e10258da-dad3-4df8-82c2-9d9438493a3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb3264562e09d949d9dcd64e52ee1fc0825681adc0d91f25505fdefd0a807ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c19e02f947d0d75971d14d988b7c3119365b6ba5348290ee0e23e115e14780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pzntm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:03Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.829033 4781 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-ffk9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04fe11ee-efe6-4b10-a638-021e53367e2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6a45fdfe3335bc38eb30705eca6aaf887b5cd9fac235b5350832bff6df8cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46c6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ffk9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:03Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.848906 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa24d58e-64e7-4469-9611-3dab9c142594\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7af0caf9bd11c42c48bbf4dcd4e719840d2e2dbce55ec205dd2cc103b71772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7035e3038f33a299c00380bc79d4315ec612bd6f4009d55258f02b5c3ec70e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://274dd3f5c42e9217f2a35c99709567f617ef1def1cbd5110cfdc8b5e88bce7ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8beb17df1dddccb32b1d93f03fd20551048bc1
d150615e18bf73c1d364d7bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1936331547e3941bdaae5c90f12874b6fda39f3b6b67f678156915473cc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:03Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.871972 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5x7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:03Z 
is after 2025-08-24T17:21:41Z" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.872961 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.873008 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.873020 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.873040 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.873053 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:03Z","lastTransitionTime":"2025-12-02T09:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.888047 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba607959-82d4-440b-b830-95eb9584db6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08c7d87bc390c31adc6d7412985f5bbfe535ea6dccb586f9be2823a17bce3e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0090e6856a6d66e284f4f4eecb2d193720318f33a0260d470a50af72c84f6719\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0179b61af209d72b1e1a79835f2819b3af773ab99b0c431733beba8a4dcf2eeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 09:20:56.232387 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 09:20:56.232507 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 09:20:56.233123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3642890392/tls.crt::/tmp/serving-cert-3642890392/tls.key\\\\\\\"\\\\nI1202 09:20:56.583994 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 09:20:56.586003 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 09:20:56.586021 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 09:20:56.586041 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 09:20:56.586046 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 09:20:56.590021 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 09:20:56.590050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590054 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 09:20:56.590065 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 09:20:56.590068 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 09:20:56.590073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 09:20:56.590248 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 09:20:56.592236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a99d096eb7cf9386a857100ae0faff8a3b17140eaada581e5511a88aadc0f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:03Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.900903 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kgbn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9440f972-ed59-4852-a180-3d5a2111f966\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eda6361552bc105ca2a6cc3444d96fbf79b10eff612e87aa22c1559fa8c90fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hxs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kgbn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:03Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.975082 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.975117 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.975127 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.975141 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:03 crc kubenswrapper[4781]: I1202 09:21:03.975150 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:03Z","lastTransitionTime":"2025-12-02T09:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.076992 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.077031 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.077039 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.077054 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.077062 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:04Z","lastTransitionTime":"2025-12-02T09:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.180047 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.180285 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.180293 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.180307 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.180318 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:04Z","lastTransitionTime":"2025-12-02T09:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.282994 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.283031 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.283045 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.283063 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.283076 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:04Z","lastTransitionTime":"2025-12-02T09:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.386193 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.386253 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.386270 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.386293 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.386311 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:04Z","lastTransitionTime":"2025-12-02T09:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.489230 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.489272 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.489285 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.489301 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.489310 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:04Z","lastTransitionTime":"2025-12-02T09:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.498581 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.498616 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 09:21:04 crc kubenswrapper[4781]: E1202 09:21:04.498698 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 09:21:04 crc kubenswrapper[4781]: E1202 09:21:04.498834 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.591689 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.591734 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.591749 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.591769 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.591785 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:04Z","lastTransitionTime":"2025-12-02T09:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.677125 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" event={"ID":"20ba2af9-1f67-4b6d-884a-666ef4f55bf3","Type":"ContainerStarted","Data":"33484f154354873117c207446d695a79f3b9dac0e9772306330a8dec40414768"} Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.677790 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.680451 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dnkgc" event={"ID":"3aa44429-873b-48f6-bc13-55745827d8fa","Type":"ContainerStarted","Data":"c1586e0f941724ff8afcd4030d43ebb517e0782ece160fe634ed828af68b1965"} Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.692030 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:04Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.693730 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.693828 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.693901 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.693997 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.694070 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:04Z","lastTransitionTime":"2025-12-02T09:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.705607 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:04Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.723902 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8157f9-9d69-41d0-8158-f5d166743724\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a36c467812da990b1678a8465bf25d97c4d8226e88ed27eea452b35a75c5dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c09074e0829703890129b7052956f56e1a93cad8d1bcf9e480b3a7277ae930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f99f962bf9ffb2475be15e4fe8d41c3fe3c870d092a50b96a19fe9c759462ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc9e488e47baf2860cd0133d661714b47495618ce410ee8ad0fa9292da3cfdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:04Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.738060 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:04Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.746946 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.749446 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822d2ca8c4fb83c643f704e9acc8e4034235d2d1e4730a9495aa0a6a8df9979e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:04Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.765535 4781 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-dnkgc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aa44429-873b-48f6-bc13-55745827d8fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2da
ed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\
",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dnkgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:04Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.778116 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8b6p8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bc3d340494b10e89f11eb2dfc63584a9c5660dfe367792a3d422a28dbf5a2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxbk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8b6p8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:04Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.797504 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.797556 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.797565 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.797584 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.797595 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:04Z","lastTransitionTime":"2025-12-02T09:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.799145 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa24d58e-64e7-4469-9611-3dab9c142594\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7af0caf9bd11c42c48bbf4dcd4e719840d2e2dbce55ec205dd2cc103b71772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7035e3038f33a299c00380bc79d4315ec612bd6f4009d55
258f02b5c3ec70e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://274dd3f5c42e9217f2a35c99709567f617ef1def1cbd5110cfdc8b5e88bce7ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8beb17df1dddccb32b1d93f03fd20551048bc1d150615e18bf73c1d364d7bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1936331547e3941bdaae5c90f12874b6fda39f3b6b67f678156915473cc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:04Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.811510 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c464aa1e840c3a97b63b6c760fca6c21e7a182269a602a688c70525b4bbb39a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:04Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.821255 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a3a44e4812364947f0dd8eed3d3f7252d6e37e5057b9f180eca5b8afc472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9866ea1416094cbfe3f1e850261118f82b1d977589862e7f59e13b33fc24295a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:04Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.830028 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e10258da-dad3-4df8-82c2-9d9438493a3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb3264562e09d949d9dcd64e52ee1fc0825681adc0d91f25505fdefd0a807ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c19e02f947d0d75971d14d988b7c3119365b6ba5348290ee0e23e115e14780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pzntm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:04Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.838178 4781 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-ffk9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04fe11ee-efe6-4b10-a638-021e53367e2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6a45fdfe3335bc38eb30705eca6aaf887b5cd9fac235b5350832bff6df8cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46c6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ffk9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:04Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.852285 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba607959-82d4-440b-b830-95eb9584db6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08c7d87bc390c31adc6d7412985f5bbfe535ea6dccb586f9be2823a17bce3e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0090e6856a6d66e284f4f4eecb2d193720318f33a0260d470a50af72c84f6719\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0179b61af209d72b1e1a79835f2819b3af773ab99b0c431733beba8a4dcf2eeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 09:20:56.232387 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 09:20:56.232507 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 09:20:56.233123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3642890392/tls.crt::/tmp/serving-cert-3642890392/tls.key\\\\\\\"\\\\nI1202 09:20:56.583994 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 09:20:56.586003 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 09:20:56.586021 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 09:20:56.586041 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 09:20:56.586046 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 09:20:56.590021 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 09:20:56.590050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590054 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 09:20:56.590065 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 09:20:56.590068 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 09:20:56.590073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 09:20:56.590248 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 09:20:56.592236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a99d096eb7cf9386a857100ae0faff8a3b17140eaada581e5511a88aadc0f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:04Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.862389 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kgbn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9440f972-ed59-4852-a180-3d5a2111f966\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eda6361552bc105ca2a6cc3444d96fbf79b10eff612e87aa22c1559fa8c90fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hxs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kgbn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:04Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.882131 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://506fd3b326fdaebe64874c208bb15c9cbf5aff36602f221f27f87fe0214ce6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c8c9c60e0354d49255e6b14da6808152f6528f0a51ce992709a7014b9f64d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5c85414a97917ad2db6270144c476b16b50ad3256bf6c694c228b8756c379dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6086fa7a75ccf19fcc7b89e83a2dbe12ed15c729bb8a4209c18acd54c2070a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865e78637e3692046c9c8a24f5238eb9636bc7e388cea6e0fec3c5c68d21cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://458448bab8d0e21295e859a857d3a3273cc4f430f2fc0fdfb3e10b9a9ab813ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log
-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33484f154354873117c207446d695a79f3b9dac0e9772306330a8dec40414768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cbd533b4320d178a3ee1bb3d666d5a1ad0a1d7e9f38e58735acac86c590726\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\
\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5x7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:04Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.899271 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.899302 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.899313 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.899325 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.899336 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:04Z","lastTransitionTime":"2025-12-02T09:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.899294 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa24d58e-64e7-4469-9611-3dab9c142594\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7af0caf9bd11c42c48bbf4dcd4e719840d2e2dbce55ec205dd2cc103b71772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7035e3038f33a299c00380bc79d4315ec612bd6f4009d55258f02b5c3ec70e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://274dd3f5c42e9217f2a35c99709567f617ef1def1cbd5110cfdc8b5e88bce7ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8beb17df1dddccb32b1d93f03fd20551048bc1d150615e18bf73c1d364d7bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1936331547e3941bdaae5c90f12874b6fda39f3b6b67f678156915473cc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:04Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.910676 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c464aa1e840c3a97b63b6c760fca6c21e7a182269a602a688c70525b4bbb39a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:04Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.920861 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a3a44e4812364947f0dd8eed3d3f7252d6e37e5057b9f180eca5b8afc472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9866ea1416094cbfe3f1e850261118f82b1d977589862e7f59e13b33fc24295a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:04Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.931144 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e10258da-dad3-4df8-82c2-9d9438493a3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb3264562e09d949d9dcd64e52ee1fc0825681adc0d91f25505fdefd0a807ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c19e02f947d0d75971d14d988b7c3119365b6ba5348290ee0e23e115e14780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pzntm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:04Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.939315 4781 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-ffk9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04fe11ee-efe6-4b10-a638-021e53367e2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6a45fdfe3335bc38eb30705eca6aaf887b5cd9fac235b5350832bff6df8cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46c6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ffk9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:04Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.950231 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba607959-82d4-440b-b830-95eb9584db6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08c7d87bc390c31adc6d7412985f5bbfe535ea6dccb586f9be2823a17bce3e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0090e6856a6d66e284f4f4eecb2d193720318f33a0260d470a50af72c84f6719\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0179b61af209d72b1e1a79835f2819b3af773ab99b0c431733beba8a4dcf2eeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 09:20:56.232387 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 09:20:56.232507 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 09:20:56.233123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3642890392/tls.crt::/tmp/serving-cert-3642890392/tls.key\\\\\\\"\\\\nI1202 09:20:56.583994 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 09:20:56.586003 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 09:20:56.586021 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 09:20:56.586041 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 09:20:56.586046 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 09:20:56.590021 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 09:20:56.590050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590054 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 09:20:56.590065 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 09:20:56.590068 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 09:20:56.590073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 09:20:56.590248 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 09:20:56.592236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a99d096eb7cf9386a857100ae0faff8a3b17140eaada581e5511a88aadc0f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:04Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.958649 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kgbn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9440f972-ed59-4852-a180-3d5a2111f966\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eda6361552bc105ca2a6cc3444d96fbf79b10eff612e87aa22c1559fa8c90fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hxs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kgbn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:04Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.974087 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://506fd3b326fdaebe64874c208bb15c9cbf5aff36602f221f27f87fe0214ce6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c8c9c60e0354d49255e6b14da6808152f6528f0a51ce992709a7014b9f64d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5c85414a97917ad2db6270144c476b16b50ad3256bf6c694c228b8756c379dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6086fa7a75ccf19fcc7b89e83a2dbe12ed15c729bb8a4209c18acd54c2070a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865e78637e3692046c9c8a24f5238eb9636bc7e388cea6e0fec3c5c68d21cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://458448bab8d0e21295e859a857d3a3273cc4f430f2fc0fdfb3e10b9a9ab813ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-sock
et\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33484f154354873117c207446d695a79f3b9dac0e9772306330a8dec40414768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cbd533b4320d178a3ee1bb3d666d5a1ad0a1d7e9f38e58735acac86c590726\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5x7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:04Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.983962 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:04Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:04 crc kubenswrapper[4781]: I1202 09:21:04.993457 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:04Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.002008 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.002054 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.002066 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.002079 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.002087 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:05Z","lastTransitionTime":"2025-12-02T09:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.005651 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8157f9-9d69-41d0-8158-f5d166743724\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a36c467812da990b1678a8465bf25d97c4d8226e88ed27eea452b35a75c5dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c09074e0829703890129b7052956f56e1a93cad8d1bcf9e480b3a7277ae930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f99f962bf9ffb2475be15e4fe8d41c3fe3c870d092a50b96a19fe9c759462ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc9e488e47baf2860cd0133d661714b47495618ce410ee8ad0fa9292da3cfdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:05Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.014695 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:05Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.023850 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822d2ca8c4fb83c643f704e9acc8e4034235d2d1e4730a9495aa0a6a8df9979e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:05Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.035898 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dnkgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aa44429-873b-48f6-bc13-55745827d8fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1586e0f941724ff8afcd4030d43ebb517e0782ece160fe634ed828af68b1965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dnkgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:05Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.046653 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8b6p8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bc3d340494b10e89f11eb2dfc63584a9c5660dfe367792a3d422a28dbf5a2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxbk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8b6p8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:05Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.104611 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.104646 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.104657 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.104672 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.104686 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:05Z","lastTransitionTime":"2025-12-02T09:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.206623 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.206653 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.206664 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.206676 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.206684 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:05Z","lastTransitionTime":"2025-12-02T09:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.308507 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.308556 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.308570 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.308589 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.308601 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:05Z","lastTransitionTime":"2025-12-02T09:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.411126 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.411166 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.411177 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.411192 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.411201 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:05Z","lastTransitionTime":"2025-12-02T09:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.498647 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:21:05 crc kubenswrapper[4781]: E1202 09:21:05.498773 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.513566 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.513603 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.513613 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.513628 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.513639 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:05Z","lastTransitionTime":"2025-12-02T09:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.616116 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.616160 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.616169 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.616185 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.616194 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:05Z","lastTransitionTime":"2025-12-02T09:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.684597 4781 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.685902 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.709285 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.719540 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.719593 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.719608 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.719633 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.719651 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:05Z","lastTransitionTime":"2025-12-02T09:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.725018 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:05Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.742162 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:05Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.757325 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:05Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.771230 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822d2ca8c4fb83c643f704e9acc8e4034235d2d1e4730a9495aa0a6a8df9979e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:05Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.785895 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dnkgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aa44429-873b-48f6-bc13-55745827d8fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1586e0f941724ff8afcd4030d43ebb517e0782ece160fe634ed828af68b1965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dnkgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:05Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.798230 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8b6p8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bc3d340494b10e89f11eb2dfc63584a9c5660dfe367792a3d422a28dbf5a2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxbk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8b6p8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:05Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.808179 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8157f9-9d69-41d0-8158-f5d166743724\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a36c467812da990b1678a8465bf25d97c4d8226e88ed27eea452b35a75c5dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c09074e0829703890129b7052956f56e1a93cad8d1bcf9e480b3a7277ae930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f99f962bf9ffb2475be15e4fe8d41c3fe3c870d092a50b96a19fe9c759462ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc9e488e47baf2860cd0133d661714b47495618ce410ee8ad0fa9292da3cfdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:05Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.821709 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.821734 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.821743 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.821755 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.821773 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:05Z","lastTransitionTime":"2025-12-02T09:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.854983 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c464aa1e840c3a97b63b6c760fca6c21e7a182269a602a688c70525b4bbb39a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:05Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.896615 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a3a44e4812364947f0dd8eed3d3f7252d6e37e5057b9f180eca5b8afc472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9866ea1416094cbfe3f1e850261118f82b1d977589862e7f59e13b33fc24295a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:05Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.907679 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e10258da-dad3-4df8-82c2-9d9438493a3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb3264562e09d949d9dcd64e52ee1fc0825681adc0d91f25505fdefd0a807ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c19e02f947d0d75971d14d988b7c3119365b6ba5348290ee0e23e115e14780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pzntm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:05Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.916682 4781 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-ffk9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04fe11ee-efe6-4b10-a638-021e53367e2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6a45fdfe3335bc38eb30705eca6aaf887b5cd9fac235b5350832bff6df8cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46c6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ffk9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:05Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.923217 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.923236 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.923246 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.923258 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.923268 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:05Z","lastTransitionTime":"2025-12-02T09:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.935196 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa24d58e-64e7-4469-9611-3dab9c142594\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7af0caf9bd11c42c48bbf4dcd4e719840d2e2dbce55ec205dd2cc103b71772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7035e3038f33a299c00380bc79d4315ec612bd6f4009d55258f02b5c3ec70e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://274dd3f5c42e9217f2a35c99709567f617ef1def1cbd5110cfdc8b5e88bce7ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8beb17df1dddccb32b1d93f03fd20551048bc1d150615e18bf73c1d364d7bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1936331547e3941bdaae5c90f12874b6fda39f3b6b67f678156915473cc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:05Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.944547 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kgbn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9440f972-ed59-4852-a180-3d5a2111f966\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eda6361552bc105ca2a6cc3444d96fbf79b10eff612e87aa22c1559fa8c90fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hxs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kgbn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:05Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.960258 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://506fd3b326fdaebe64874c208bb15c9cbf5aff36602f221f27f87fe0214ce6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c8c9c60e0354d49255e6b14da6808152f6528f0a51ce992709a7014b9f64d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5c85414a97917ad2db6270144c476b16b50ad3256bf6c694c228b8756c379dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6086fa7a75ccf19fcc7b89e83a2dbe12ed15c729bb8a4209c18acd54c2070a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865e78637e3692046c9c8a24f5238eb9636bc7e388cea6e0fec3c5c68d21cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://458448bab8d0e21295e859a857d3a3273cc4f430f2fc0fdfb3e10b9a9ab813ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33484f154354873117c207446d695a79f3b9dac0e9772306330a8dec40414768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cbd533b4320d178a3ee1bb3d666d5a1ad0a1d7e9f38e58735acac86c590726\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPat
h\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5x7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:05Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:05 crc kubenswrapper[4781]: I1202 09:21:05.973149 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba607959-82d4-440b-b830-95eb9584db6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08c7d87bc390c31adc6d7412985f5bbfe535ea6dccb586f9be2823a17bce3e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0090e6856a6d66e284f4f4eecb2d193720318f33a0260d470a50af72c84f6719\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0179b61af209d72b1e1a79835f2819b3af773ab99b0c431733beba8a4dcf2eeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 09:20:56.232387 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 09:20:56.232507 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 09:20:56.233123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3642890392/tls.crt::/tmp/serving-cert-3642890392/tls.key\\\\\\\"\\\\nI1202 09:20:56.583994 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 09:20:56.586003 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 09:20:56.586021 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 09:20:56.586041 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 09:20:56.586046 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 09:20:56.590021 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 09:20:56.590050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590054 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 09:20:56.590065 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 09:20:56.590068 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 09:20:56.590073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 09:20:56.590248 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 09:20:56.592236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a99d096eb7cf9386a857100ae0faff8a3b17140eaada581e5511a88aadc0f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:05Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.025789 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.025832 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.025844 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.025861 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.025882 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:06Z","lastTransitionTime":"2025-12-02T09:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.128502 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.128534 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.128542 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.128555 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.128565 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:06Z","lastTransitionTime":"2025-12-02T09:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.230679 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.230717 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.230727 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.230742 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.230753 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:06Z","lastTransitionTime":"2025-12-02T09:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.333439 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.333492 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.333508 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.333530 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.333546 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:06Z","lastTransitionTime":"2025-12-02T09:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.436304 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.436363 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.436373 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.436390 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.436405 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:06Z","lastTransitionTime":"2025-12-02T09:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.498935 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.498976 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 02 09:21:06 crc kubenswrapper[4781]: E1202 09:21:06.499053 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 02 09:21:06 crc kubenswrapper[4781]: E1202 09:21:06.499273 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.539497 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.539533 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.539545 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.539561 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.539571 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:06Z","lastTransitionTime":"2025-12-02T09:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.642253 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.642309 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.642325 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.642348 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.642364 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:06Z","lastTransitionTime":"2025-12-02T09:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.689526 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5x7g_20ba2af9-1f67-4b6d-884a-666ef4f55bf3/ovnkube-controller/0.log"
Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.692436 4781 generic.go:334] "Generic (PLEG): container finished" podID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerID="33484f154354873117c207446d695a79f3b9dac0e9772306330a8dec40414768" exitCode=1
Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.692501 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" event={"ID":"20ba2af9-1f67-4b6d-884a-666ef4f55bf3","Type":"ContainerDied","Data":"33484f154354873117c207446d695a79f3b9dac0e9772306330a8dec40414768"}
Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.693295 4781 scope.go:117] "RemoveContainer" containerID="33484f154354873117c207446d695a79f3b9dac0e9772306330a8dec40414768"
Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.711311 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:06Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.729618 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:06Z is after 2025-08-24T17:21:41Z"
Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.745206 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.745254 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.745271 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.745294 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.745315 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:06Z","lastTransitionTime":"2025-12-02T09:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.747050 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8157f9-9d69-41d0-8158-f5d166743724\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a36c467812da990b1678a8465bf25d97c4d8226e88ed27eea452b35a75c5dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c09074e0829703890129b7052956f56e1a93cad8d1bcf9e480b3a7277ae930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f99f962bf9ffb2475be15e4fe8d41c3fe3c870d092a50b96a19fe9c759462ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc9e488e47baf2860cd0133d661714b47495618ce410ee8ad0fa9292da3cfdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:06Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.763196 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:06Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.780318 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822d2ca8c4fb83c643f704e9acc8e4034235d2d1e4730a9495aa0a6a8df9979e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:06Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.800726 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dnkgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aa44429-873b-48f6-bc13-55745827d8fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1586e0f941724ff8afcd4030d43ebb517e0782ece160fe634ed828af68b1965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dnkgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:06Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.819978 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8b6p8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bc3d340494b10e89f11eb2dfc63584a9c5660dfe367792a3d422a28dbf5a2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxbk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8b6p8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:06Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.845382 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa24d58e-64e7-4469-9611-3dab9c142594\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7af0caf9bd11c42c48bbf4dcd4e719840d2e2dbce55ec205dd2cc103b71772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7035e3038f33a299c00380bc79d4315ec612bd6f4009d55258f02b5c3ec70e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://274dd3f5c42e9217f2a35c99709567f617ef1def1cbd5110cfdc8b5e88bce7ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8beb17df1dddccb32b1d93f03fd20551048bc1d150615e18bf73c1d364d7bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1936331547e3941bdaae5c90f12874b6fda39f3b6b67f678156915473cc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:06Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.847488 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.847528 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.847538 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.847555 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.847564 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:06Z","lastTransitionTime":"2025-12-02T09:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.859579 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c464aa1e840c3a97b63b6c760fca6c21e7a182269a602a688c70525b4bbb39a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:06Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.875135 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a3a44e4812364947f0dd8eed3d3f7252d6e37e5057b9f180eca5b8afc472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9866ea1416094cbfe3f1e850261118f82b1d977589862e7f59e13b33fc24295a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:06Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.887630 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e10258da-dad3-4df8-82c2-9d9438493a3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb3264562e09d949d9dcd64e52ee1fc0825681adc0d91f25505fdefd0a807ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c19e02f947d0d75971d14d988b7c3119365b6ba5348290ee0e23e115e14780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pzntm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:06Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.897503 4781 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-ffk9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04fe11ee-efe6-4b10-a638-021e53367e2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6a45fdfe3335bc38eb30705eca6aaf887b5cd9fac235b5350832bff6df8cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46c6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ffk9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:06Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.912672 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba607959-82d4-440b-b830-95eb9584db6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08c7d87bc390c31adc6d7412985f5bbfe535ea6dccb586f9be2823a17bce3e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0090e6856a6d66e284f4f4eecb2d193720318f33a0260d470a50af72c84f6719\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0179b61af209d72b1e1a79835f2819b3af773ab99b0c431733beba8a4dcf2eeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 09:20:56.232387 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 09:20:56.232507 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 09:20:56.233123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3642890392/tls.crt::/tmp/serving-cert-3642890392/tls.key\\\\\\\"\\\\nI1202 09:20:56.583994 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 09:20:56.586003 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 09:20:56.586021 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 09:20:56.586041 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 09:20:56.586046 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 09:20:56.590021 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 09:20:56.590050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590054 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 09:20:56.590065 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 09:20:56.590068 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 09:20:56.590073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 09:20:56.590248 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 09:20:56.592236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a99d096eb7cf9386a857100ae0faff8a3b17140eaada581e5511a88aadc0f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:06Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.923769 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kgbn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9440f972-ed59-4852-a180-3d5a2111f966\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eda6361552bc105ca2a6cc3444d96fbf79b10eff612e87aa22c1559fa8c90fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hxs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kgbn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:06Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.942198 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://506fd3b326fdaebe64874c208bb15c9cbf5aff36602f221f27f87fe0214ce6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c8c9c60e0354d49255e6b14da6808152f6528f0a51ce992709a7014b9f64d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5c85414a97917ad2db6270144c476b16b50ad3256bf6c694c228b8756c379dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6086fa7a75ccf19fcc7b89e83a2dbe12ed15c729bb8a4209c18acd54c2070a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865e78637e3692046c9c8a24f5238eb9636bc7e388cea6e0fec3c5c68d21cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://458448bab8d0e21295e859a857d3a3273cc4f430f2fc0fdfb3e10b9a9ab813ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33484f154354873117c207446d695a79f3b9dac0e9772306330a8dec40414768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33484f154354873117c207446d695a79f3b9dac0e9772306330a8dec40414768\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T09:21:06Z\\\",\\\"message\\\":\\\"1202 09:21:05.979638 6072 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 09:21:05.979892 6072 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 09:21:05.979972 6072 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 09:21:05.980061 6072 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 09:21:05.980401 6072 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 09:21:05.980450 6072 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1202 09:21:05.980457 6072 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1202 09:21:05.980477 6072 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1202 09:21:05.980489 6072 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 09:21:05.980505 6072 factory.go:656] Stopping watch factory\\\\nI1202 09:21:05.980512 6072 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1202 09:21:05.980510 6072 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1202 09:21:05.980518 6072 ovnkube.go:599] Stopped 
ovnkube\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cbd533b4320d178a3ee1bb3d666d5a1ad0a1d7e9f38e58735acac86c590726\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5x7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:06Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.949402 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.949441 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.949454 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.949469 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:06 crc kubenswrapper[4781]: I1202 09:21:06.949481 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:06Z","lastTransitionTime":"2025-12-02T09:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.040379 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.052516 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.052559 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.052574 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.052592 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.052609 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:07Z","lastTransitionTime":"2025-12-02T09:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.154730 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.154769 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.154779 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.154796 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.154808 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:07Z","lastTransitionTime":"2025-12-02T09:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.257224 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.257273 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.257283 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.257297 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.257306 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:07Z","lastTransitionTime":"2025-12-02T09:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.360475 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.360553 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.360581 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.360613 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.360635 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:07Z","lastTransitionTime":"2025-12-02T09:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.463558 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.463613 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.463629 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.463650 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.463665 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:07Z","lastTransitionTime":"2025-12-02T09:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.498760 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:21:07 crc kubenswrapper[4781]: E1202 09:21:07.498879 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.514131 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c464aa1e840c3a97b63b6c760fca6c21e7a182269a602a688c70525b4bbb39a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:07Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.530261 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a3a44e4812364947f0dd8eed3d3f7252d6e37e5057b9f180eca5b8afc472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9866ea1416094cbfe3f1e850261118f82b1d977589862e7f59e13b33fc24295a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:07Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.543028 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e10258da-dad3-4df8-82c2-9d9438493a3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb3264562e09d949d9dcd64e52ee1fc0825681adc0d91f25505fdefd0a807ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c19e02f947d0d75971d14d988b7c3119365b6ba5348290ee0e23e115e14780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pzntm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:07Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.554953 4781 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-ffk9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04fe11ee-efe6-4b10-a638-021e53367e2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6a45fdfe3335bc38eb30705eca6aaf887b5cd9fac235b5350832bff6df8cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46c6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ffk9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:07Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.566465 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.566505 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.566514 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.566541 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.566554 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:07Z","lastTransitionTime":"2025-12-02T09:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.575168 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa24d58e-64e7-4469-9611-3dab9c142594\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7af0caf9bd11c42c48bbf4dcd4e719840d2e2dbce55ec205dd2cc103b71772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7035e3038f33a299c00380bc79d4315ec612bd6f4009d55258f02b5c3ec70e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://274dd3f5c42e9217f2a35c99709567f617ef1def1cbd5110cfdc8b5e88bce7ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8beb17df1dddccb32b1d93f03fd20551048bc1d150615e18bf73c1d364d7bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1936331547e3941bdaae5c90f12874b6fda39f3b6b67f678156915473cc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:07Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.593752 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://506fd3b326fdaebe64874c208bb15c9cbf5aff36602f221f27f87fe0214ce6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c8c9c60e0354d49255e6b14da6808152f6528f0a51ce992709a7014b9f64d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5c85414a97917ad2db6270144c476b16b50ad3256bf6c694c228b8756c379dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6086fa7a75ccf19fcc7b89e83a2dbe12ed15c729bb8a4209c18acd54c2070a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865e78637e3692046c9c8a24f5238eb9636bc7e388cea6e0fec3c5c68d21cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://458448bab8d0e21295e859a857d3a3273cc4f430f2fc0fdfb3e10b9a9ab813ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33484f154354873117c207446d695a79f3b9dac0
e9772306330a8dec40414768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33484f154354873117c207446d695a79f3b9dac0e9772306330a8dec40414768\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T09:21:06Z\\\",\\\"message\\\":\\\"1202 09:21:05.979638 6072 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 09:21:05.979892 6072 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 09:21:05.979972 6072 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 09:21:05.980061 6072 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 09:21:05.980401 6072 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 09:21:05.980450 6072 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1202 09:21:05.980457 6072 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1202 09:21:05.980477 6072 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1202 09:21:05.980489 6072 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 09:21:05.980505 6072 factory.go:656] Stopping watch factory\\\\nI1202 09:21:05.980512 6072 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1202 09:21:05.980510 6072 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1202 09:21:05.980518 6072 ovnkube.go:599] Stopped 
ovnkube\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cbd533b4320d178a3ee1bb3d666d5a1ad0a1d7e9f38e58735acac86c590726\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5x7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:07Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.615832 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba607959-82d4-440b-b830-95eb9584db6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08c7d87bc390c31adc6d7412985f5bbfe535ea6dccb586f9be2823a17bce3e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0090e6856a6d66e284f4f4eecb2d193720318f33a0260d470a50af72c84f6719\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0179b61af209d72b1e1a79835f2819b3af773ab99b0c431733beba8a4dcf2eeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 09:20:56.232387 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 09:20:56.232507 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 09:20:56.233123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3642890392/tls.crt::/tmp/serving-cert-3642890392/tls.key\\\\\\\"\\\\nI1202 09:20:56.583994 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 09:20:56.586003 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 09:20:56.586021 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 09:20:56.586041 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 09:20:56.586046 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 09:20:56.590021 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 09:20:56.590050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590054 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 09:20:56.590065 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 09:20:56.590068 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 09:20:56.590073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 09:20:56.590248 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 09:20:56.592236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a99d096eb7cf9386a857100ae0faff8a3b17140eaada581e5511a88aadc0f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:07Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.626881 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kgbn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9440f972-ed59-4852-a180-3d5a2111f966\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eda6361552bc105ca2a6cc3444d96fbf79b10eff612e87aa22c1559fa8c90fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hxs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kgbn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:07Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.638045 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:07Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.650897 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:07Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.670408 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.670468 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.670481 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.670498 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.670510 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:07Z","lastTransitionTime":"2025-12-02T09:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.670760 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822d2ca8c4fb83c643f704e9acc8e4034235d2d1e4730a9495aa0a6a8df9979e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:07Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.683131 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dnkgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aa44429-873b-48f6-bc13-55745827d8fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1586e0f941724ff8afcd4030d43ebb517e0782ece160fe634ed828af68b1965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dnkgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:07Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.694823 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8b6p8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bc3d340494b10e89f11eb2dfc63584a9c5660dfe367792a3d422a28dbf5a2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxbk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8b6p8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:07Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.696866 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5x7g_20ba2af9-1f67-4b6d-884a-666ef4f55bf3/ovnkube-controller/0.log" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.699572 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" event={"ID":"20ba2af9-1f67-4b6d-884a-666ef4f55bf3","Type":"ContainerStarted","Data":"eb6d74bcbac3aadff08205392d325737c0898c1ec4a8165e3a3ff0b9e51223f5"} Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.699933 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.707639 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8157f9-9d69-41d0-8158-f5d166743724\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a36c467812da990b1678a8465bf25d97c4d8226e88ed27eea452b35a75c5dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c09074e0829703890129b7052956f56e1a93cad8d1bcf9e480b3a7277ae930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f99f962bf9ffb2475be15e4fe8d41c3fe3c870d092a50b96a19fe9c759462ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc9e488e47baf2860cd0133d661714b47495618ce410ee8ad0fa9292da3cfdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:07Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.718626 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:07Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.726205 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ffk9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04fe11ee-efe6-4b10-a638-021e53367e2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6a45fdfe3335bc38eb30705eca6aaf887b5cd9fac235b5350832bff6df8cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46c6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\
\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ffk9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:07Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.742756 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa24d58e-64e7-4469-9611-3dab9c142594\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7af0caf9bd11c42c48bbf4dcd4e719840d2e2dbce55ec205dd2cc103b71772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7035e3038f33a299c00380bc79d4315ec612bd6f4009d55258f02b5c3ec70e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://274
dd3f5c42e9217f2a35c99709567f617ef1def1cbd5110cfdc8b5e88bce7ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8beb17df1dddccb32b1d93f03fd20551048bc1d150615e18bf73c1d364d7bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1936331547e3941bdaae5c90f12874b6fda39f3b6b67f678156915473cc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a
0a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:07Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.755282 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c464aa1e840c3a97b63b6c760fca6c21e7a182269a602a688c70525b4bbb39a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:07Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.770165 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a3a44e4812364947f0dd8eed3d3f7252d6e37e5057b9f180eca5b8afc472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9866ea1416094cbfe3f1e850261118f82b1d977589862e7f59e13b33fc24295a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:07Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.772748 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.772790 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.772801 4781 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.772818 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.772830 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:07Z","lastTransitionTime":"2025-12-02T09:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.783906 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e10258da-dad3-4df8-82c2-9d9438493a3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb3264562e09d949d9dcd64e52ee1fc0825681adc0d91f25505fdefd0a807ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c19e02f947d0d75971d14d988b7c3119365b6ba5348290ee0e23e115e14780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pzntm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:07Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.801209 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba607959-82d4-440b-b830-95eb9584db6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08c7d87bc390c31adc6d7412985f5bbfe535ea6dccb586f9be2823a17bce3e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0090e6856a6d66e284f4f4eecb2d193720318f33a0260d470a50af72c84f6719\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0179b61af209d72b1e1a79835f2819b3af773ab99b0c431733beba8a4dcf2eeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 09:20:56.232387 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 09:20:56.232507 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 09:20:56.233123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3642890392/tls.crt::/tmp/serving-cert-3642890392/tls.key\\\\\\\"\\\\nI1202 09:20:56.583994 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 09:20:56.586003 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 09:20:56.586021 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 09:20:56.586041 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 09:20:56.586046 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 09:20:56.590021 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 09:20:56.590050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590054 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 09:20:56.590065 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 09:20:56.590068 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 09:20:56.590073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 09:20:56.590248 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 09:20:56.592236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a99d096eb7cf9386a857100ae0faff8a3b17140eaada581e5511a88aadc0f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:07Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.811196 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kgbn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9440f972-ed59-4852-a180-3d5a2111f966\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eda6361552bc105ca2a6cc3444d96fbf79b10eff612e87aa22c1559fa8c90fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hxs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kgbn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:07Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.828189 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://506fd3b326fdaebe64874c208bb15c9cbf5aff36602f221f27f87fe0214ce6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c8c9c60e0354d49255e6b14da6808152f6528f0a51ce992709a7014b9f64d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5c85414a97917ad2db6270144c476b16b50ad3256bf6c694c228b8756c379dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6086fa7a75ccf19fcc7b89e83a2dbe12ed15c729bb8a4209c18acd54c2070a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865e78637e3692046c9c8a24f5238eb9636bc7e388cea6e0fec3c5c68d21cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://458448bab8d0e21295e859a857d3a3273cc4f430f2fc0fdfb3e10b9a9ab813ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb6d74bcbac3aadff08205392d325737c0898c1ec4a8165e3a3ff0b9e51223f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33484f154354873117c207446d695a79f3b9dac0e9772306330a8dec40414768\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T09:21:06Z\\\",\\\"message\\\":\\\"1202 09:21:05.979638 6072 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 09:21:05.979892 6072 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 09:21:05.979972 6072 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 09:21:05.980061 6072 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 09:21:05.980401 6072 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 09:21:05.980450 6072 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1202 09:21:05.980457 6072 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1202 09:21:05.980477 6072 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1202 09:21:05.980489 6072 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 09:21:05.980505 6072 factory.go:656] Stopping watch factory\\\\nI1202 09:21:05.980512 6072 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1202 09:21:05.980510 6072 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1202 09:21:05.980518 6072 ovnkube.go:599] Stopped 
ovnkube\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cbd533b4320d178a3ee1bb3d666d5a1ad0a1d7e9f38e58735acac86c590726\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\
\":[{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5x7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:07Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.845369 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:07Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.857351 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:07Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.869280 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8157f9-9d69-41d0-8158-f5d166743724\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a36c467812da990b1678a8465bf25d97c4d8226e88ed27eea452b35a75c5dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c09074e0829703890129b7052956f56e1a93cad8d1bcf9e480b3a7277ae930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f99f962bf9ffb2475be15e4fe8d41c3fe3c870d092a50b96a19fe9c759462ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc9e488e47baf2860cd0133d661714b47495618ce410ee8ad0fa9292da3cfdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:07Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.875465 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.875501 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.875513 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.875533 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.875544 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:07Z","lastTransitionTime":"2025-12-02T09:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.882264 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:07Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.895555 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822d2ca8c4fb83c643f704e9acc8e4034235d2d1e4730a9495aa0a6a8df9979e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:07Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.915439 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dnkgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aa44429-873b-48f6-bc13-55745827d8fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1586e0f941724ff8afcd4030d43ebb517e0782ece160fe634ed828af68b1965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dnkgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:07Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.927535 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8b6p8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bc3d340494b10e89f11eb2dfc63584a9c5660dfe367792a3d422a28dbf5a2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxbk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8b6p8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:07Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.978421 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.978458 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.978469 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.978486 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:07 crc kubenswrapper[4781]: I1202 09:21:07.978498 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:07Z","lastTransitionTime":"2025-12-02T09:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.080728 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.080769 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.080781 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.080796 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.080805 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:08Z","lastTransitionTime":"2025-12-02T09:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.183580 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.183624 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.183638 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.183656 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.183671 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:08Z","lastTransitionTime":"2025-12-02T09:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.287079 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.287120 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.287157 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.287174 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.287185 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:08Z","lastTransitionTime":"2025-12-02T09:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.390273 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.390331 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.390349 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.390373 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.390390 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:08Z","lastTransitionTime":"2025-12-02T09:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.493719 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.493797 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.493818 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.493848 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.493872 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:08Z","lastTransitionTime":"2025-12-02T09:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.499198 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.499201 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 09:21:08 crc kubenswrapper[4781]: E1202 09:21:08.499542 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 09:21:08 crc kubenswrapper[4781]: E1202 09:21:08.499382 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.596883 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.596933 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.596948 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.596964 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.596973 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:08Z","lastTransitionTime":"2025-12-02T09:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.699387 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.699447 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.699457 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.699472 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.699481 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:08Z","lastTransitionTime":"2025-12-02T09:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.703146 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5x7g_20ba2af9-1f67-4b6d-884a-666ef4f55bf3/ovnkube-controller/1.log" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.703626 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5x7g_20ba2af9-1f67-4b6d-884a-666ef4f55bf3/ovnkube-controller/0.log" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.705863 4781 generic.go:334] "Generic (PLEG): container finished" podID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerID="eb6d74bcbac3aadff08205392d325737c0898c1ec4a8165e3a3ff0b9e51223f5" exitCode=1 Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.705897 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" event={"ID":"20ba2af9-1f67-4b6d-884a-666ef4f55bf3","Type":"ContainerDied","Data":"eb6d74bcbac3aadff08205392d325737c0898c1ec4a8165e3a3ff0b9e51223f5"} Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.705983 4781 scope.go:117] "RemoveContainer" containerID="33484f154354873117c207446d695a79f3b9dac0e9772306330a8dec40414768" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.709001 4781 scope.go:117] "RemoveContainer" containerID="eb6d74bcbac3aadff08205392d325737c0898c1ec4a8165e3a3ff0b9e51223f5" Dec 02 09:21:08 crc kubenswrapper[4781]: E1202 09:21:08.709247 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-x5x7g_openshift-ovn-kubernetes(20ba2af9-1f67-4b6d-884a-666ef4f55bf3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.725158 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:08Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.737428 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:08Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.751076 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8157f9-9d69-41d0-8158-f5d166743724\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a36c467812da990b1678a8465bf25d97c4d8226e88ed27eea452b35a75c5dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c09074e0829703890129b7052956f56e1a93cad8d1bcf9e480b3a7277ae930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f99f962bf9ffb2475be15e4fe8d41c3fe3c870d092a50b96a19fe9c759462ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc9e488e47baf2860cd0133d661714b47495618ce410ee8ad0fa9292da3cfdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:08Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.756547 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.756593 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.756602 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.756618 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.756628 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:08Z","lastTransitionTime":"2025-12-02T09:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.762314 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:08Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:08 crc kubenswrapper[4781]: E1202 09:21:08.767899 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a8aab02d-934e-4a61-8d03-a223ac62150b\\\",\\\"systemUUID\\\":\\\"c8f9d245-0a2e-447a-a09e-ff80f79ba02f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:08Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.770788 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.770833 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.770842 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.770856 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.770865 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:08Z","lastTransitionTime":"2025-12-02T09:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.774564 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822d2ca8c4fb83c643f704e9acc8e4034235d2d1e4730a9495aa0a6a8df9979e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:08Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:08 crc kubenswrapper[4781]: E1202 09:21:08.782632 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a8aab02d-934e-4a61-8d03-a223ac62150b\\\",\\\"systemUUID\\\":\\\"c8f9d245-0a2e-447a-a09e-ff80f79ba02f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:08Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.788288 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.788319 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.788329 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.788346 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.788360 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:08Z","lastTransitionTime":"2025-12-02T09:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.789234 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dnkgc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aa44429-873b-48f6-bc13-55745827d8fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1586e0f941724ff8afcd4030d43ebb517e0782ece160fe634ed828af68b1965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dnkgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:08Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:08 crc kubenswrapper[4781]: E1202 09:21:08.798838 4781 kubelet_node_status.go:585] "Error updating node status, will retry" 
err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329b
a568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\
\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a8aab02d-934e-4a61-8d03-a223ac62150b\\\",\\\"systemUUID\\\":\\\"c8f9d245-0a2e-447a-a09e-ff80f79ba02f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:08Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.801780 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.801820 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.801828 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.801842 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.801851 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:08Z","lastTransitionTime":"2025-12-02T09:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.804487 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8b6p8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bc3d340494b10e89f11eb2dfc63584a9c5660dfe367792a3d422a28dbf5a2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxbk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8b6p8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:08Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:08 crc kubenswrapper[4781]: E1202 09:21:08.813874 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a8aab02d-934e-4a61-8d03-a223ac62150b\\\",\\\"systemUUID\\\":\\\"c8f9d245-0a2e-447a-a09e-ff80f79ba02f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:08Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.816484 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.816508 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.816516 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.816528 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.816538 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:08Z","lastTransitionTime":"2025-12-02T09:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.822709 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa24d58e-64e7-4469-9611-3dab9c142594\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7af0caf9bd11c42c48bbf4dcd4e719840d2e2dbce55ec205dd2cc103b71772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7035e3038f33a299c00380bc79d4315ec612bd6f4009d55258f02b5c3ec70e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resourc
es\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://274dd3f5c42e9217f2a35c99709567f617ef1def1cbd5110cfdc8b5e88bce7ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8beb17df1dddccb32b1d93f03fd20551048bc1d150615e18bf73c1d364d7bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1936331547e3941bdaae5c90f12874b6fda39f3b6b67f678156915473cc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:08Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:08 crc kubenswrapper[4781]: E1202 09:21:08.826237 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a8aab02d-934e-4a61-8d03-a223ac62150b\\\",\\\"systemUUID\\\":\\\"c8f9d245-0a2e-447a-a09e-ff80f79ba02f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:08Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:08 crc kubenswrapper[4781]: E1202 09:21:08.826350 4781 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.827597 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.827628 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.827638 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.827652 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.827662 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:08Z","lastTransitionTime":"2025-12-02T09:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.834907 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c464aa1e840c3a97b63b6c760fca6c21e7a182269a602a688c70525b4bbb39a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:08Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.845673 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a3a44e4812364947f0dd8eed3d3f7252d6e37e5057b9f180eca5b8afc472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9866ea1416094cbfe3f1e850261118f82b1d977589862e7f59e13b33fc24295a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:08Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.856700 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e10258da-dad3-4df8-82c2-9d9438493a3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb3264562e09d949d9dcd64e52ee1fc0825681adc0d91f25505fdefd0a807ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c19e02f947d0d75971d14d988b7c3119365b6ba5348290ee0e23e115e14780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pzntm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:08Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.857475 4781 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k2h5"] Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.857904 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k2h5" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.858889 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.859598 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.866131 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ffk9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04fe11ee-efe6-4b10-a638-021e53367e2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6a45fdfe3335bc38eb30705eca6aaf887b5cd9fac235b5350832bff6df8cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46c6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ffk9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:08Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.878738 4781 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba607959-82d4-440b-b830-95eb9584db6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08c7d87bc390c31adc6d7412985f5bbfe535ea6dccb586f9be2823a17bce3e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0090e6856a6d66e284f4f4eecb2d193720318f33a0260d470a50af72c84f6719\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0179b61af209d72b1e1a79835f2819b3af773ab99b0c431733beba8a4dcf2eeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 09:20:56.232387 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 09:20:56.232507 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 09:20:56.233123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3642890392/tls.crt::/tmp/serving-cert-3642890392/tls.key\\\\\\\"\\\\nI1202 09:20:56.583994 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 09:20:56.586003 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 09:20:56.586021 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 09:20:56.586041 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 09:20:56.586046 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 09:20:56.590021 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 09:20:56.590050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590054 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 09:20:56.590065 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 09:20:56.590068 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 09:20:56.590073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 09:20:56.590248 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 09:20:56.592236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a99d096eb7cf9386a857100ae0faff8a3b17140eaada581e5511a88aadc0f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:08Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.887066 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kgbn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9440f972-ed59-4852-a180-3d5a2111f966\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eda6361552bc105ca2a6cc3444d96fbf79b10eff612e87aa22c1559fa8c90fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hxs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kgbn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:08Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.902586 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://506fd3b326fdaebe64874c208bb15c9cbf5aff36602f221f27f87fe0214ce6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c8c9c60e0354d49255e6b14da6808152f6528f0a51ce992709a7014b9f64d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5c85414a97917ad2db6270144c476b16b50ad3256bf6c694c228b8756c379dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6086fa7a75ccf19fcc7b89e83a2dbe12ed15c729bb8a4209c18acd54c2070a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865e78637e3692046c9c8a24f5238eb9636bc7e388cea6e0fec3c5c68d21cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://458448bab8d0e21295e859a857d3a3273cc4f430f2fc0fdfb3e10b9a9ab813ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb6d74bcbac3aadff08205392d325737c0898c1ec4a8165e3a3ff0b9e51223f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33484f154354873117c207446d695a79f3b9dac0e9772306330a8dec40414768\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T09:21:06Z\\\",\\\"message\\\":\\\"1202 09:21:05.979638 6072 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 09:21:05.979892 6072 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 09:21:05.979972 6072 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 09:21:05.980061 6072 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 09:21:05.980401 6072 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 09:21:05.980450 6072 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1202 09:21:05.980457 6072 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1202 09:21:05.980477 6072 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1202 09:21:05.980489 6072 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 09:21:05.980505 6072 factory.go:656] Stopping watch factory\\\\nI1202 09:21:05.980512 6072 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1202 09:21:05.980510 6072 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1202 09:21:05.980518 6072 ovnkube.go:599] Stopped ovnkube\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb6d74bcbac3aadff08205392d325737c0898c1ec4a8165e3a3ff0b9e51223f5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"message\\\":\\\"9:21:07.519021 6215 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1202 09:21:07.519076 6215 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1202 09:21:07.519053 6215 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 09:21:07.519095 6215 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1202 09:21:07.519185 6215 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 09:21:07.519206 6215 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 09:21:07.519268 6215 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 09:21:07.519291 6215 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 09:21:07.519319 6215 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 09:21:07.519340 6215 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 09:21:07.519369 6215 handler.go:208] Removed 
*v1.Namespace event handler 1\\\\nI1202 09:21:07.519393 6215 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 09:21:07.519396 6215 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 09:21:07.519408 6215 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 09:21:07.519457 6215 factory.go:656] Stopping watch factory\\\\nI1202 09:21:07.519478 6215 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cbd533b4320d178a3ee1bb3d666d5a1ad0a1d7e9f38e58735acac86c590726\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5x7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:08Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.914683 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:08Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.920296 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/be967da7-3e7f-47e6-9d54-408ae99531a6-env-overrides\") pod \"ovnkube-control-plane-749d76644c-7k2h5\" (UID: \"be967da7-3e7f-47e6-9d54-408ae99531a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k2h5" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.920333 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/be967da7-3e7f-47e6-9d54-408ae99531a6-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-7k2h5\" (UID: \"be967da7-3e7f-47e6-9d54-408ae99531a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k2h5" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.920355 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/be967da7-3e7f-47e6-9d54-408ae99531a6-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-7k2h5\" (UID: \"be967da7-3e7f-47e6-9d54-408ae99531a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k2h5" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.920371 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzmcq\" (UniqueName: \"kubernetes.io/projected/be967da7-3e7f-47e6-9d54-408ae99531a6-kube-api-access-dzmcq\") pod \"ovnkube-control-plane-749d76644c-7k2h5\" (UID: \"be967da7-3e7f-47e6-9d54-408ae99531a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k2h5" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.926243 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:08Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.929426 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.929457 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.929469 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.929484 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.929495 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:08Z","lastTransitionTime":"2025-12-02T09:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.937132 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822d2ca8c4fb83c643f704e9acc8e4034235d2d1e4730a9495aa0a6a8df9979e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:08Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.950680 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dnkgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aa44429-873b-48f6-bc13-55745827d8fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1586e0f941724ff8afcd4030d43ebb517e0782ece160fe634ed828af68b1965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dnkgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:08Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.968757 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8b6p8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bc3d340494b10e89f11eb2dfc63584a9c5660dfe367792a3d422a28dbf5a2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxbk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8b6p8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:08Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.981196 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8157f9-9d69-41d0-8158-f5d166743724\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a36c467812da990b1678a8465bf25d97c4d8226e88ed27eea452b35a75c5dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c09074e0829703890129b7052956f56e1a93cad8d1bcf9e480b3a7277ae930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f99f962bf9ffb2475be15e4fe8d41c3fe3c870d092a50b96a19fe9c759462ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc9e488e47baf2860cd0133d661714b47495618ce410ee8ad0fa9292da3cfdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:08Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:08 crc kubenswrapper[4781]: I1202 09:21:08.993078 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:08Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.005889 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c464aa1e840c3a97b63b6c760fca6c21e7a182269a602a688c70525b4bbb39a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:09Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.020830 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/be967da7-3e7f-47e6-9d54-408ae99531a6-env-overrides\") pod 
\"ovnkube-control-plane-749d76644c-7k2h5\" (UID: \"be967da7-3e7f-47e6-9d54-408ae99531a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k2h5" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.020872 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/be967da7-3e7f-47e6-9d54-408ae99531a6-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-7k2h5\" (UID: \"be967da7-3e7f-47e6-9d54-408ae99531a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k2h5" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.020896 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/be967da7-3e7f-47e6-9d54-408ae99531a6-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-7k2h5\" (UID: \"be967da7-3e7f-47e6-9d54-408ae99531a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k2h5" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.020918 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzmcq\" (UniqueName: \"kubernetes.io/projected/be967da7-3e7f-47e6-9d54-408ae99531a6-kube-api-access-dzmcq\") pod \"ovnkube-control-plane-749d76644c-7k2h5\" (UID: \"be967da7-3e7f-47e6-9d54-408ae99531a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k2h5" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.021712 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/be967da7-3e7f-47e6-9d54-408ae99531a6-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-7k2h5\" (UID: \"be967da7-3e7f-47e6-9d54-408ae99531a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k2h5" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.022167 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/be967da7-3e7f-47e6-9d54-408ae99531a6-env-overrides\") pod \"ovnkube-control-plane-749d76644c-7k2h5\" (UID: \"be967da7-3e7f-47e6-9d54-408ae99531a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k2h5" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.027790 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/be967da7-3e7f-47e6-9d54-408ae99531a6-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-7k2h5\" (UID: \"be967da7-3e7f-47e6-9d54-408ae99531a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k2h5" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.040589 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzmcq\" (UniqueName: \"kubernetes.io/projected/be967da7-3e7f-47e6-9d54-408ae99531a6-kube-api-access-dzmcq\") pod \"ovnkube-control-plane-749d76644c-7k2h5\" (UID: \"be967da7-3e7f-47e6-9d54-408ae99531a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k2h5" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.070527 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.070562 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:09 crc 
kubenswrapper[4781]: I1202 09:21:09.070570 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.070586 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.070598 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:09Z","lastTransitionTime":"2025-12-02T09:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.073781 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a3a44e4812364947f0dd8eed3d3f7252d6e37e5057b9f180eca5b8afc472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9866ea1416094cbfe3f1e850261118f82b1d977589862e7f59e13b33fc24295a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:09Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.087181 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e10258da-dad3-4df8-82c2-9d9438493a3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb3264562e09d949d9dcd64e52ee1fc0825681adc0d91f25505fdefd0a807ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c19e02f947d0d75971d14d988b7c3119365b6ba5348290ee0e23e115e14780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pzntm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:09Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.098944 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ffk9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04fe11ee-efe6-4b10-a638-021e53367e2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6a45fdfe3335bc38eb30705eca6aaf887b5cd9fac235b5350832bff6df8cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46c6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ffk9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:09Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.120570 4781 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa24d58e-64e7-4469-9611-3dab9c142594\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7af0caf9bd11c42c48bbf4dcd4e719840d2e2dbce55ec205dd2cc103b71772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7035e3038f33a299c00380bc79d4315ec612bd6f4009d55258f02b5c3ec70e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://274dd3f5c42e9217f2a35c99709567f617ef1def1cbd5110cfdc8b5e88bce7ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"container
ID\\\":\\\"cri-o://b8beb17df1dddccb32b1d93f03fd20551048bc1d150615e18bf73c1d364d7bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1936331547e3941bdaae5c90f12874b6fda39f3b6b67f678156915473cc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ab441e13ca
dbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:09Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.146662 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://506fd3b326fdaebe64874c208bb15c9cbf5aff36602f221f27f87fe0214ce6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c8c9c60e0354d49255e6b14da6808152f6528f0a51ce992709a7014b9f64d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5c85414a97917ad2db6270144c476b16b50ad3256bf6c694c228b8756c379dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6086fa7a75ccf19fcc7b89e83a2dbe12ed15c729bb8a4209c18acd54c2070a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865e78637e3692046c9c8a24f5238eb9636bc7e388cea6e0fec3c5c68d21cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://458448bab8d0e21295e859a857d3a3273cc4f430f2fc0fdfb3e10b9a9ab813ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb6d74bcbac3aadff08205392d325737c0898c1e
c4a8165e3a3ff0b9e51223f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33484f154354873117c207446d695a79f3b9dac0e9772306330a8dec40414768\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T09:21:06Z\\\",\\\"message\\\":\\\"1202 09:21:05.979638 6072 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 09:21:05.979892 6072 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 09:21:05.979972 6072 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 09:21:05.980061 6072 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 09:21:05.980401 6072 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 09:21:05.980450 6072 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1202 09:21:05.980457 6072 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1202 09:21:05.980477 6072 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1202 09:21:05.980489 6072 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 09:21:05.980505 6072 factory.go:656] Stopping watch factory\\\\nI1202 09:21:05.980512 6072 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1202 09:21:05.980510 6072 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1202 09:21:05.980518 6072 ovnkube.go:599] Stopped ovnkube\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb6d74bcbac3aadff08205392d325737c0898c1ec4a8165e3a3ff0b9e51223f5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"message\\\":\\\"9:21:07.519021 6215 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1202 09:21:07.519076 6215 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1202 09:21:07.519053 6215 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 09:21:07.519095 6215 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1202 09:21:07.519185 6215 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 09:21:07.519206 6215 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 09:21:07.519268 6215 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 09:21:07.519291 6215 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 09:21:07.519319 6215 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 09:21:07.519340 6215 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 09:21:07.519369 6215 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1202 09:21:07.519393 6215 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 09:21:07.519396 6215 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 09:21:07.519408 6215 handler.go:208] Removed *v1.Node 
event handler 2\\\\nI1202 09:21:07.519457 6215 factory.go:656] Stopping watch factory\\\\nI1202 09:21:07.519478 6215 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cbd533b4320d178a3ee1bb3d666d5a1ad0a1d7e9f38e58735acac86c590726\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7d67d739a
501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5x7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:09Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.161621 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k2h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be967da7-3e7f-47e6-9d54-408ae99531a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzmcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzmcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:21:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7k2h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:09Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.170805 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k2h5" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.173245 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.173278 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.173292 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.173309 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.173325 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:09Z","lastTransitionTime":"2025-12-02T09:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.178619 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba607959-82d4-440b-b830-95eb9584db6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08c7d87bc390c31adc6d7412985f5bbfe535ea6dccb586f9be2823a17bce3e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0090e6856a6d66e284f4f4eecb2d193720318f33a0260d470a50af72c84f6719\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0179b61af209d72b1e1a79835f2819b3af773ab99b0c431733beba8a4dcf2eeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 09:20:56.232387 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 09:20:56.232507 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 09:20:56.233123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3642890392/tls.crt::/tmp/serving-cert-3642890392/tls.key\\\\\\\"\\\\nI1202 09:20:56.583994 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 09:20:56.586003 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 09:20:56.586021 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 09:20:56.586041 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 09:20:56.586046 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 09:20:56.590021 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 09:20:56.590050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590054 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 09:20:56.590065 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 09:20:56.590068 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 09:20:56.590073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 09:20:56.590248 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 09:20:56.592236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a99d096eb7cf9386a857100ae0faff8a3b17140eaada581e5511a88aadc0f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:09Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.193203 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kgbn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9440f972-ed59-4852-a180-3d5a2111f966\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eda6361552bc105ca2a6cc3444d96fbf79b10eff612e87aa22c1559fa8c90fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hxs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kgbn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:09Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.275764 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.275948 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.275958 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.275971 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.276000 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:09Z","lastTransitionTime":"2025-12-02T09:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.377450 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.377475 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.377483 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.377494 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.377503 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:09Z","lastTransitionTime":"2025-12-02T09:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.481110 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.481166 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.481180 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.481198 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.481211 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:09Z","lastTransitionTime":"2025-12-02T09:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.499410 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:21:09 crc kubenswrapper[4781]: E1202 09:21:09.499573 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.583462 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.583498 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.583506 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.583520 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.583529 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:09Z","lastTransitionTime":"2025-12-02T09:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.686253 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.686284 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.686292 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.686306 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.686315 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:09Z","lastTransitionTime":"2025-12-02T09:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.710533 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k2h5" event={"ID":"be967da7-3e7f-47e6-9d54-408ae99531a6","Type":"ContainerStarted","Data":"735c6d3ed169d7e11f8f6cc0d99fc415020928f852a554288c6c05f3dc99bb3d"} Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.710575 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k2h5" event={"ID":"be967da7-3e7f-47e6-9d54-408ae99531a6","Type":"ContainerStarted","Data":"b30e09e5a7f3200cca709bf3437af0ce93cf039451c894bdfd813ae35d3c70c0"} Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.710589 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k2h5" event={"ID":"be967da7-3e7f-47e6-9d54-408ae99531a6","Type":"ContainerStarted","Data":"9dd2672cc69b9bc18a1187dd0d93cb1da149516edd1626bd676cfeb4b0fe6c53"} Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.712173 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5x7g_20ba2af9-1f67-4b6d-884a-666ef4f55bf3/ovnkube-controller/1.log" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.715406 4781 scope.go:117] "RemoveContainer" containerID="eb6d74bcbac3aadff08205392d325737c0898c1ec4a8165e3a3ff0b9e51223f5" Dec 02 09:21:09 crc kubenswrapper[4781]: E1202 09:21:09.715532 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-x5x7g_openshift-ovn-kubernetes(20ba2af9-1f67-4b6d-884a-666ef4f55bf3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.725020 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e10258da-dad3-4df8-82c2-9d9438493a3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb3264562e09d949d9dcd64e52ee1fc0825681adc0d91f25505fdefd0a807ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c19e02f947d0d75971d14d988b7c3119365b6ba5348290ee0e23e115e14780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pzntm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:09Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.734787 4781 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-ffk9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04fe11ee-efe6-4b10-a638-021e53367e2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6a45fdfe3335bc38eb30705eca6aaf887b5cd9fac235b5350832bff6df8cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46c6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ffk9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:09Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.758562 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa24d58e-64e7-4469-9611-3dab9c142594\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7af0caf9bd11c42c48bbf4dcd4e719840d2e2dbce55ec205dd2cc103b71772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7035e3038f33a299c00380bc79d4315ec612bd6f4009d55258f02b5c3ec70e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://274dd3f5c42e9217f2a35c99709567f617ef1def1cbd5110cfdc8b5e88bce7ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8beb17df1dddccb32b1d93f03fd20551048bc1
d150615e18bf73c1d364d7bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1936331547e3941bdaae5c90f12874b6fda39f3b6b67f678156915473cc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:09Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.771498 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c464aa1e840c3a97b63b6c760fca6c21e7a182269a602a688c70525b4bbb39a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:09Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.786104 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a3a44e4812364947f0dd8eed3d3f7252d6e37e5057b9f180eca5b8afc472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9866ea1416094cbfe3f1e850261118f82b1d977589862e7f59e13b33fc24295a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-02T09:21:09Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.790526 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.790719 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.790813 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.790906 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.791025 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:09Z","lastTransitionTime":"2025-12-02T09:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.805373 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba607959-82d4-440b-b830-95eb9584db6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08c7d87bc390c31adc6d7412985f5bbfe535ea6dccb586f9be2823a17bce3e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0090e6856a6d66e284f4f4eecb2d193720318f33a0260d470a50af72c84f6719\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0179b61af209d72b1e1a79835f2819b3af773ab99b0c431733beba8a4dcf2eeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 09:20:56.232387 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 09:20:56.232507 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 09:20:56.233123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3642890392/tls.crt::/tmp/serving-cert-3642890392/tls.key\\\\\\\"\\\\nI1202 09:20:56.583994 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 09:20:56.586003 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 09:20:56.586021 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 09:20:56.586041 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 09:20:56.586046 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 09:20:56.590021 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 09:20:56.590050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590054 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 09:20:56.590065 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 09:20:56.590068 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 09:20:56.590073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 09:20:56.590248 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 09:20:56.592236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a99d096eb7cf9386a857100ae0faff8a3b17140eaada581e5511a88aadc0f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:09Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.815053 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kgbn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9440f972-ed59-4852-a180-3d5a2111f966\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eda6361552bc105ca2a6cc3444d96fbf79b10eff612e87aa22c1559fa8c90fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hxs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kgbn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:09Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.834676 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://506fd3b326fdaebe64874c208bb15c9cbf5aff36602f221f27f87fe0214ce6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c8c9c60e0354d49255e6b14da6808152f6528f0a51ce992709a7014b9f64d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5c85414a97917ad2db6270144c476b16b50ad3256bf6c694c228b8756c379dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6086fa7a75ccf19fcc7b89e83a2dbe12ed15c729bb8a4209c18acd54c2070a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865e78637e3692046c9c8a24f5238eb9636bc7e388cea6e0fec3c5c68d21cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://458448bab8d0e21295e859a857d3a3273cc4f430f2fc0fdfb3e10b9a9ab813ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb6d74bcbac3aadff08205392d325737c0898c1ec4a8165e3a3ff0b9e51223f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33484f154354873117c207446d695a79f3b9dac0e9772306330a8dec40414768\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T09:21:06Z\\\",\\\"message\\\":\\\"1202 09:21:05.979638 6072 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 09:21:05.979892 6072 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 09:21:05.979972 6072 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 09:21:05.980061 6072 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 09:21:05.980401 6072 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 09:21:05.980450 6072 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1202 09:21:05.980457 6072 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1202 09:21:05.980477 6072 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1202 09:21:05.980489 6072 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 09:21:05.980505 6072 factory.go:656] Stopping watch factory\\\\nI1202 09:21:05.980512 6072 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1202 09:21:05.980510 6072 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1202 09:21:05.980518 6072 ovnkube.go:599] Stopped ovnkube\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb6d74bcbac3aadff08205392d325737c0898c1ec4a8165e3a3ff0b9e51223f5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"message\\\":\\\"9:21:07.519021 6215 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1202 09:21:07.519076 6215 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1202 09:21:07.519053 6215 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 09:21:07.519095 6215 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1202 09:21:07.519185 6215 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 09:21:07.519206 6215 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 09:21:07.519268 6215 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 09:21:07.519291 6215 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 09:21:07.519319 6215 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 09:21:07.519340 6215 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 09:21:07.519369 6215 handler.go:208] Removed 
*v1.Namespace event handler 1\\\\nI1202 09:21:07.519393 6215 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 09:21:07.519396 6215 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 09:21:07.519408 6215 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 09:21:07.519457 6215 factory.go:656] Stopping watch factory\\\\nI1202 09:21:07.519478 6215 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cbd533b4320d178a3ee1bb3d666d5a1ad0a1d7e9f38e58735acac86c590726\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5x7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:09Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.845413 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k2h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be967da7-3e7f-47e6-9d54-408ae99531a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b30e09e5a7f3200cca709bf3437af0ce93cf039451c894bdfd813ae35d3c70c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzmcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://735c6d3ed169d7e11f8f6cc0d99fc415020928f852a554288c6c05f3dc99bb3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzmcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:21:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7k2h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:09Z is after 2025-08-24T17:21:41Z" Dec 02 
09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.856604 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:09Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.867994 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:09Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.880530 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8157f9-9d69-41d0-8158-f5d166743724\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a36c467812da990b1678a8465bf25d97c4d8226e88ed27eea452b35a75c5dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c09074e0829703890129b7052956f56e1a93cad8d1bcf9e480b3a7277ae930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f99f962bf9ffb2475be15e4fe8d41c3fe3c870d092a50b96a19fe9c759462ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc9e488e47baf2860cd0133d661714b47495618ce410ee8ad0fa9292da3cfdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:09Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.891130 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:09Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.892725 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.892774 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.892789 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.892809 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.892827 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:09Z","lastTransitionTime":"2025-12-02T09:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.901499 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822d2ca8c4fb83c643f704e9acc8e4034235d2d1e4730a9495aa0a6a8df9979e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:09Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.917507 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dnkgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aa44429-873b-48f6-bc13-55745827d8fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1586e0f941724ff8afcd4030d43ebb517e0782ece160fe634ed828af68b1965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dnkgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:09Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.932782 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8b6p8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bc3d340494b10e89f11eb2dfc63584a9c5660dfe367792a3d422a28dbf5a2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxbk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8b6p8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:09Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.946291 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:09Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.959967 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822d2ca8c4fb83c643f704e9acc8e4034235d2d1e4730a9495aa0a6a8df9979e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:09Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.978071 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dnkgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aa44429-873b-48f6-bc13-55745827d8fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1586e0f941724ff8afcd4030d43ebb517e0782ece160fe634ed828af68b1965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dnkgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:09Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.990332 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8b6p8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bc3d340494b10e89f11eb2dfc63584a9c5660dfe367792a3d422a28dbf5a2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxbk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8b6p8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:09Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.995490 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.995527 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.995539 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.995556 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:09 crc kubenswrapper[4781]: I1202 09:21:09.995567 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:09Z","lastTransitionTime":"2025-12-02T09:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.002841 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8157f9-9d69-41d0-8158-f5d166743724\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a36c467812da990b1678a8465bf25d97c4d8226e88ed27eea452b35a75c5dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c09074e0829703890129b7052956f56e1a93cad8d1bcf9e480b3a7277ae930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc358257
71aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f99f962bf9ffb2475be15e4fe8d41c3fe3c870d092a50b96a19fe9c759462ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc9e488e47baf2860cd0133d661714b47495618ce410ee8ad0fa9292da3cfdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:10Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.017245 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c464aa1e840c3a97b63b6c760fca6c21e7a182269a602a688c70525b4bbb39a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:10Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.029491 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a3a44e4812364947f0dd8eed3d3f7252d6e37e5057b9f180eca5b8afc472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9866ea1416094cbfe3f1e850261118f82b1d977589862e7f59e13b33fc24295a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:10Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.040483 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e10258da-dad3-4df8-82c2-9d9438493a3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb3264562e09d949d9dcd64e52ee1fc0825681adc0d91f25505fdefd0a807ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c19e02f947d0d75971d14d988b7c3119365b6ba5348290ee0e23e115e14780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pzntm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:10Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.049093 4781 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-ffk9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04fe11ee-efe6-4b10-a638-021e53367e2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6a45fdfe3335bc38eb30705eca6aaf887b5cd9fac235b5350832bff6df8cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46c6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ffk9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:10Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.067068 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa24d58e-64e7-4469-9611-3dab9c142594\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7af0caf9bd11c42c48bbf4dcd4e719840d2e2dbce55ec205dd2cc103b71772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7035e3038f33a299c00380bc79d4315ec612bd6f4009d55258f02b5c3ec70e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://274dd3f5c42e9217f2a35c99709567f617ef1def1cbd5110cfdc8b5e88bce7ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8beb17df1dddccb32b1d93f03fd20551048bc1
d150615e18bf73c1d364d7bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1936331547e3941bdaae5c90f12874b6fda39f3b6b67f678156915473cc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:10Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.077646 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kgbn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9440f972-ed59-4852-a180-3d5a2111f966\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eda6361552bc105ca2a6cc3444d96fbf79b10eff612e87aa22c1559fa8c90fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hxs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\
\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kgbn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:10Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.097864 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.097896 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.097907 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.097937 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.097948 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:10Z","lastTransitionTime":"2025-12-02T09:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.098192 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://506fd3b326fdaebe64874c208bb15c9cbf5aff36602f221f27f87fe0214ce6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c8c9c60e0354d49255e6b14da6808152f6528f0a51ce992709a7014b9f64d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5c85414a97917ad2db6270144c476b16b50ad3256bf6c694c228b8756c379dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6086fa7a75ccf19fcc7b89e83a2dbe12ed15c729bb8a4209c18acd54c2070a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865e78637e3692046c9c8a24f5238eb9636bc7e388cea6e0fec3c5c68d21cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://458448bab8d0e21295e859a857d3a3273cc4f430f2fc0fdfb3e10b9a9ab813ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb6d74bcbac3aadff08205392d325737c0898c1e
c4a8165e3a3ff0b9e51223f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb6d74bcbac3aadff08205392d325737c0898c1ec4a8165e3a3ff0b9e51223f5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"message\\\":\\\"9:21:07.519021 6215 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1202 09:21:07.519076 6215 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1202 09:21:07.519053 6215 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 09:21:07.519095 6215 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1202 09:21:07.519185 6215 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 09:21:07.519206 6215 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 09:21:07.519268 6215 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 09:21:07.519291 6215 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 09:21:07.519319 6215 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 09:21:07.519340 6215 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 09:21:07.519369 6215 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1202 09:21:07.519393 6215 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 09:21:07.519396 6215 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 09:21:07.519408 6215 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 09:21:07.519457 6215 factory.go:656] Stopping watch factory\\\\nI1202 09:21:07.519478 6215 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-x5x7g_openshift-ovn-kubernetes(20ba2af9-1f67-4b6d-884a-666ef4f55bf3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cbd533b4320d178a3ee1bb3d666d5a1ad0a1d7e9f38e58735acac86c590726\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5x7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:10Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.109315 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k2h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be967da7-3e7f-47e6-9d54-408ae99531a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b30e09e5a7f3200cca709bf3437af0ce93cf039451c894bdfd813ae35d3c70c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzmcq
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://735c6d3ed169d7e11f8f6cc0d99fc415020928f852a554288c6c05f3dc99bb3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzmcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:21:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7k2h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:10Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.122320 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba607959-82d4-440b-b830-95eb9584db6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08c7d87bc390c31adc6d7412985f5bbfe535ea6dccb586f9be2823a17bce3e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0090e6856a6d66e284f4f4eecb2d193720318f33a0260d470a50af72c84f6719\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0179b61af209d72b1e1a79835f2819b3af773ab99b0c431733beba8a4dcf2eeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 09:20:56.232387 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 09:20:56.232507 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 09:20:56.233123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3642890392/tls.crt::/tmp/serving-cert-3642890392/tls.key\\\\\\\"\\\\nI1202 09:20:56.583994 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 09:20:56.586003 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 09:20:56.586021 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 09:20:56.586041 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 09:20:56.586046 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 09:20:56.590021 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 09:20:56.590050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590054 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 09:20:56.590065 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 09:20:56.590068 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 09:20:56.590073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 09:20:56.590248 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 09:20:56.592236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a99d096eb7cf9386a857100ae0faff8a3b17140eaada581e5511a88aadc0f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:10Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.132111 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:10Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.144023 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:10Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.200887 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.200946 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.200970 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.200991 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.201004 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:10Z","lastTransitionTime":"2025-12-02T09:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.304418 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.304470 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.304488 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.304511 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.304529 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:10Z","lastTransitionTime":"2025-12-02T09:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.336856 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-q792g"] Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.337337 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:21:10 crc kubenswrapper[4781]: E1202 09:21:10.337404 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q792g" podUID="bcdae8ff-3e82-4785-b958-a98717a14787" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.352132 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:10Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.371540 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:10Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.391871 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:10Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.406890 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.407024 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.407035 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.407048 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.407057 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:10Z","lastTransitionTime":"2025-12-02T09:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.411848 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822d2ca8c4fb83c643f704e9acc8e4034235d2d1e4730a9495aa0a6a8df9979e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:10Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.432163 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dnkgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aa44429-873b-48f6-bc13-55745827d8fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1586e0f941724ff8afcd4030d43ebb517e0782ece160fe634ed828af68b1965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dnkgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:10Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.433391 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl4xt\" (UniqueName: \"kubernetes.io/projected/bcdae8ff-3e82-4785-b958-a98717a14787-kube-api-access-vl4xt\") pod \"network-metrics-daemon-q792g\" (UID: \"bcdae8ff-3e82-4785-b958-a98717a14787\") " 
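
Every status patch above fails the same way: the kubelet cannot call the "pod.network-node-identity.openshift.io" webhook at https://127.0.0.1:9743 because the serving certificate expired on 2025-08-24T17:21:41Z while the node clock reads 2025-12-02T09:21:10Z. A minimal sketch for confirming this from the affected node is below; the address and port are taken from the log lines above, everything else (running it locally on the node, skipping chain verification so the expired certificate can still be read) is an assumption for illustration, not part of the log.

```go
// Minimal sketch: dial the webhook endpoint named in the log and print the
// validity window of the certificate it presents. Assumes it runs on the
// affected node; 127.0.0.1:9743 comes from the log entries above.
package main

import (
	"crypto/tls"
	"fmt"
	"time"
)

func main() {
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{
		// Skip chain verification so an expired certificate can still be inspected.
		InsecureSkipVerify: true,
	})
	if err != nil {
		fmt.Println("dial failed:", err)
		return
	}
	defer conn.Close()

	cert := conn.ConnectionState().PeerCertificates[0]
	now := time.Now().UTC()
	fmt.Printf("subject:   %s\n", cert.Subject)
	fmt.Printf("notBefore: %s\n", cert.NotBefore.Format(time.RFC3339))
	fmt.Printf("notAfter:  %s\n", cert.NotAfter.Format(time.RFC3339))
	fmt.Printf("expired:   %v\n", now.After(cert.NotAfter))
}
```

Against the state shown in the log, this would report notAfter 2025-08-24T17:21:41Z and expired=true, matching the x509 error string repeated in each failed patch.
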
pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.433530 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bcdae8ff-3e82-4785-b958-a98717a14787-metrics-certs\") pod \"network-metrics-daemon-q792g\" (UID: \"bcdae8ff-3e82-4785-b958-a98717a14787\") " pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.453521 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8b6p8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bc3d340494b10e89f11eb2dfc63584a9c5660dfe367792a3d422a28dbf5a2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-ce
rts\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxbk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8b6p8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:10Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.468606 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q792g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcdae8ff-3e82-4785-b958-a98717a14787\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vl4xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vl4xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:21:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q792g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:10Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.481845 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8157f9-9d69-41d0-8158-f5d166743724\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a36c467812da990b1678a8465bf25d97c4d8226e88ed27eea452b35a75c5dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c09074e0829703890129b7052956f56e1a93cad8d1bcf9e480b3a7277ae930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f99f962bf9ffb2475be15e4fe8d41c3fe3c870d092a50b96a19fe9c759462ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc9e488e47baf2860cd0133d661714b47495618ce410ee8ad0fa9292da3cfdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:10Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.498610 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c464aa1e840c3a97b63b6c760fca6c21e7a182269a602a688c70525b4bbb39a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:10Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.498766 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.498786 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:21:10 crc kubenswrapper[4781]: E1202 09:21:10.498955 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 09:21:10 crc kubenswrapper[4781]: E1202 09:21:10.499144 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.510206 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.510360 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.510450 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.510583 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.510665 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:10Z","lastTransitionTime":"2025-12-02T09:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
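
The "Node became not ready" entry here carries the machine-readable reason for the NotReady transition: the container runtime network is not ready because no CNI configuration file exists in /etc/kubernetes/cni/net.d/ (consistent with the ovnkube-controller crash-looping later in this log). A small sketch decoding that condition object is below; the JSON literal is copied verbatim from the setters.go entry above, and the struct mirrors only the fields shown there.

```go
// Minimal sketch: decode the node Ready condition exactly as it appears in
// the "Node became not ready" entry and print why the kubelet marked the
// node NotReady.
package main

import (
	"encoding/json"
	"fmt"
)

type nodeCondition struct {
	Type               string `json:"type"`
	Status             string `json:"status"`
	LastHeartbeatTime  string `json:"lastHeartbeatTime"`
	LastTransitionTime string `json:"lastTransitionTime"`
	Reason             string `json:"reason"`
	Message            string `json:"message"`
}

func main() {
	// Verbatim condition payload from the log entry above.
	raw := `{"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:10Z","lastTransitionTime":"2025-12-02T09:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}`

	var c nodeCondition
	if err := json.Unmarshal([]byte(raw), &c); err != nil {
		fmt.Println("unmarshal failed:", err)
		return
	}
	fmt.Printf("%s=%s since %s\nreason:  %s\nmessage: %s\n",
		c.Type, c.Status, c.LastTransitionTime, c.Reason, c.Message)
}
```
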
Has your network provider started?"} Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.519475 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a3a44e4812364947f0dd8eed3d3f7252d6e37e5057b9f180eca5b8afc472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9866ea1416094cbfe3f1e850261118f82b1d977589862e7f59e13b33fc24295a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:10Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.534282 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/bcdae8ff-3e82-4785-b958-a98717a14787-metrics-certs\") pod \"network-metrics-daemon-q792g\" (UID: \"bcdae8ff-3e82-4785-b958-a98717a14787\") " pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:21:10 crc kubenswrapper[4781]: E1202 09:21:10.534414 4781 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.534422 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e10258da-dad3-4df8-82c2-9d9438493a3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb3264562e09d949d9dcd64e52ee1fc0825681adc0d91f25505fdefd0a807ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c19e02f947d0d75971d14d988b7c3119365b6ba5348290ee0e23e115e14780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\
\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pzntm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:10Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:10 crc kubenswrapper[4781]: E1202 09:21:10.534477 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcdae8ff-3e82-4785-b958-a98717a14787-metrics-certs podName:bcdae8ff-3e82-4785-b958-a98717a14787 nodeName:}" failed. No retries permitted until 2025-12-02 09:21:11.034459041 +0000 UTC m=+33.858332930 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bcdae8ff-3e82-4785-b958-a98717a14787-metrics-certs") pod "network-metrics-daemon-q792g" (UID: "bcdae8ff-3e82-4785-b958-a98717a14787") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.534612 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vl4xt\" (UniqueName: \"kubernetes.io/projected/bcdae8ff-3e82-4785-b958-a98717a14787-kube-api-access-vl4xt\") pod \"network-metrics-daemon-q792g\" (UID: \"bcdae8ff-3e82-4785-b958-a98717a14787\") " pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.549644 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ffk9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04fe11ee-efe6-4b10-a638-021e53367e2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6a45fdfe3335bc38eb30705eca6aaf887b5cd9fac235b5350832bff6df8cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46c6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ffk9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:10Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.554498 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl4xt\" (UniqueName: \"kubernetes.io/projected/bcdae8ff-3e82-4785-b958-a98717a14787-kube-api-access-vl4xt\") pod \"network-metrics-daemon-q792g\" (UID: \"bcdae8ff-3e82-4785-b958-a98717a14787\") " pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.571657 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa24d58e-64e7-4469-9611-3dab9c142594\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7af0caf9bd11c42c48bbf4dcd4e719840d2e2dbce55ec205dd2cc103b71772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7035e3038f33a299c00380bc79d4315ec612bd6f4009d55258f02b5c3ec70e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://274dd3f5c42e9217f2a35c99709567f617ef1def1cbd5110cfdc8b5e88bce7ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8beb17df1dddccb32b1d93f03fd20551048bc1
d150615e18bf73c1d364d7bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1936331547e3941bdaae5c90f12874b6fda39f3b6b67f678156915473cc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:10Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.586002 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kgbn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9440f972-ed59-4852-a180-3d5a2111f966\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eda6361552bc105ca2a6cc3444d96fbf79b10eff612e87aa22c1559fa8c90fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hxs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\
\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kgbn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:10Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.613979 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.614038 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.614056 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.614079 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.614096 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:10Z","lastTransitionTime":"2025-12-02T09:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.644196 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://506fd3b326fdaebe64874c208bb15c9cbf5aff36602f221f27f87fe0214ce6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c8c9c60e0354d49255e6b14da6808152f6528f0a51ce992709a7014b9f64d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5c85414a97917ad2db6270144c476b16b50ad3256bf6c694c228b8756c379dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6086fa7a75ccf19fcc7b89e83a2dbe12ed15c729bb8a4209c18acd54c2070a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865e78637e3692046c9c8a24f5238eb9636bc7e388cea6e0fec3c5c68d21cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://458448bab8d0e21295e859a857d3a3273cc4f430f2fc0fdfb3e10b9a9ab813ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb6d74bcbac3aadff08205392d325737c0898c1e
c4a8165e3a3ff0b9e51223f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb6d74bcbac3aadff08205392d325737c0898c1ec4a8165e3a3ff0b9e51223f5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"message\\\":\\\"9:21:07.519021 6215 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1202 09:21:07.519076 6215 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1202 09:21:07.519053 6215 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 09:21:07.519095 6215 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1202 09:21:07.519185 6215 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 09:21:07.519206 6215 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 09:21:07.519268 6215 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 09:21:07.519291 6215 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 09:21:07.519319 6215 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 09:21:07.519340 6215 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 09:21:07.519369 6215 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1202 09:21:07.519393 6215 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 09:21:07.519396 6215 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 09:21:07.519408 6215 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 09:21:07.519457 6215 factory.go:656] Stopping watch factory\\\\nI1202 09:21:07.519478 6215 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-x5x7g_openshift-ovn-kubernetes(20ba2af9-1f67-4b6d-884a-666ef4f55bf3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cbd533b4320d178a3ee1bb3d666d5a1ad0a1d7e9f38e58735acac86c590726\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5x7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:10Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.658392 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k2h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be967da7-3e7f-47e6-9d54-408ae99531a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b30e09e5a7f3200cca709bf3437af0ce93cf039451c894bdfd813ae35d3c70c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzmcq
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://735c6d3ed169d7e11f8f6cc0d99fc415020928f852a554288c6c05f3dc99bb3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzmcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:21:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7k2h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:10Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.671679 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba607959-82d4-440b-b830-95eb9584db6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08c7d87bc390c31adc6d7412985f5bbfe535ea6dccb586f9be2823a17bce3e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0090e6856a6d66e284f4f4eecb2d193720318f33a0260d470a50af72c84f6719\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0179b61af209d72b1e1a79835f2819b3af773ab99b0c431733beba8a4dcf2eeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 09:20:56.232387 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 09:20:56.232507 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 09:20:56.233123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3642890392/tls.crt::/tmp/serving-cert-3642890392/tls.key\\\\\\\"\\\\nI1202 09:20:56.583994 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 09:20:56.586003 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 09:20:56.586021 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 09:20:56.586041 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 09:20:56.586046 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 09:20:56.590021 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 09:20:56.590050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590054 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 09:20:56.590065 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 09:20:56.590068 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 09:20:56.590073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 09:20:56.590248 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 09:20:56.592236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a99d096eb7cf9386a857100ae0faff8a3b17140eaada581e5511a88aadc0f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:10Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.716137 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.716174 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.716186 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.716201 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.716213 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:10Z","lastTransitionTime":"2025-12-02T09:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.819217 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.819270 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.819282 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.819300 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.819313 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:10Z","lastTransitionTime":"2025-12-02T09:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.921622 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.921673 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.921692 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.921720 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:10 crc kubenswrapper[4781]: I1202 09:21:10.921738 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:10Z","lastTransitionTime":"2025-12-02T09:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.024127 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.024171 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.024182 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.024198 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.024209 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:11Z","lastTransitionTime":"2025-12-02T09:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.039013 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bcdae8ff-3e82-4785-b958-a98717a14787-metrics-certs\") pod \"network-metrics-daemon-q792g\" (UID: \"bcdae8ff-3e82-4785-b958-a98717a14787\") " pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:21:11 crc kubenswrapper[4781]: E1202 09:21:11.039190 4781 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 09:21:11 crc kubenswrapper[4781]: E1202 09:21:11.039250 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcdae8ff-3e82-4785-b958-a98717a14787-metrics-certs podName:bcdae8ff-3e82-4785-b958-a98717a14787 nodeName:}" failed. No retries permitted until 2025-12-02 09:21:12.039236658 +0000 UTC m=+34.863110537 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bcdae8ff-3e82-4785-b958-a98717a14787-metrics-certs") pod "network-metrics-daemon-q792g" (UID: "bcdae8ff-3e82-4785-b958-a98717a14787") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.126242 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.126279 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.126288 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.126303 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.126312 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:11Z","lastTransitionTime":"2025-12-02T09:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.229034 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.229102 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.229128 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.229155 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.229174 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:11Z","lastTransitionTime":"2025-12-02T09:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.331847 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.331891 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.331903 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.331919 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.331953 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:11Z","lastTransitionTime":"2025-12-02T09:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.341402 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.341464 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.341495 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:21:11 crc kubenswrapper[4781]: E1202 09:21:11.341518 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:21:27.341492825 +0000 UTC m=+50.165366724 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.341556 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 09:21:11 crc kubenswrapper[4781]: E1202 09:21:11.341587 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 09:21:11 crc kubenswrapper[4781]: E1202 09:21:11.341604 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 09:21:11 crc kubenswrapper[4781]: E1202 09:21:11.341615 4781 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 09:21:11 crc kubenswrapper[4781]: E1202 09:21:11.341652 4781 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.341599 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:21:11 crc kubenswrapper[4781]: E1202 09:21:11.341656 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 09:21:27.3416433 +0000 UTC m=+50.165517179 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 09:21:11 crc kubenswrapper[4781]: E1202 09:21:11.341702 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-02 09:21:27.341691862 +0000 UTC m=+50.165565751 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 09:21:11 crc kubenswrapper[4781]: E1202 09:21:11.341709 4781 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 09:21:11 crc kubenswrapper[4781]: E1202 09:21:11.341807 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 09:21:27.341782005 +0000 UTC m=+50.165655984 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 09:21:11 crc kubenswrapper[4781]: E1202 09:21:11.341808 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 09:21:11 crc kubenswrapper[4781]: E1202 09:21:11.341860 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 09:21:11 crc kubenswrapper[4781]: E1202 09:21:11.341884 4781 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 09:21:11 crc kubenswrapper[4781]: E1202 09:21:11.341989 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 09:21:27.341970951 +0000 UTC m=+50.165844860 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.434639 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.434693 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.434711 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.434733 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.434753 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:11Z","lastTransitionTime":"2025-12-02T09:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.499857 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:21:11 crc kubenswrapper[4781]: E1202 09:21:11.500086 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.536761 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.536823 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.536848 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.536878 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.536902 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:11Z","lastTransitionTime":"2025-12-02T09:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.640104 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.640163 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.640183 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.640205 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.640224 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:11Z","lastTransitionTime":"2025-12-02T09:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.743037 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.743094 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.743110 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.743130 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.743146 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:11Z","lastTransitionTime":"2025-12-02T09:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.845852 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.845897 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.845909 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.845945 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.845961 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:11Z","lastTransitionTime":"2025-12-02T09:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.948034 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.948087 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.948101 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.948121 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:11 crc kubenswrapper[4781]: I1202 09:21:11.948136 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:11Z","lastTransitionTime":"2025-12-02T09:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.047136 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bcdae8ff-3e82-4785-b958-a98717a14787-metrics-certs\") pod \"network-metrics-daemon-q792g\" (UID: \"bcdae8ff-3e82-4785-b958-a98717a14787\") " pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:21:12 crc kubenswrapper[4781]: E1202 09:21:12.047327 4781 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 09:21:12 crc kubenswrapper[4781]: E1202 09:21:12.047437 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcdae8ff-3e82-4785-b958-a98717a14787-metrics-certs podName:bcdae8ff-3e82-4785-b958-a98717a14787 nodeName:}" failed. No retries permitted until 2025-12-02 09:21:14.047406984 +0000 UTC m=+36.871280893 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bcdae8ff-3e82-4785-b958-a98717a14787-metrics-certs") pod "network-metrics-daemon-q792g" (UID: "bcdae8ff-3e82-4785-b958-a98717a14787") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.051106 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.051166 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.051186 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.051212 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.051229 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:12Z","lastTransitionTime":"2025-12-02T09:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.155272 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.155339 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.155363 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.155408 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.155431 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:12Z","lastTransitionTime":"2025-12-02T09:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.258167 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.258203 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.258216 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.258233 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.258245 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:12Z","lastTransitionTime":"2025-12-02T09:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.360948 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.360978 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.360985 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.360998 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.361007 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:12Z","lastTransitionTime":"2025-12-02T09:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.463242 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.463318 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.463330 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.463344 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.463355 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:12Z","lastTransitionTime":"2025-12-02T09:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.499224 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.499288 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 09:21:12 crc kubenswrapper[4781]: E1202 09:21:12.499367 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 09:21:12 crc kubenswrapper[4781]: E1202 09:21:12.499501 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.499528 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:21:12 crc kubenswrapper[4781]: E1202 09:21:12.499787 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q792g" podUID="bcdae8ff-3e82-4785-b958-a98717a14787" Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.565202 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.565255 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.565272 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.565295 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.565317 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:12Z","lastTransitionTime":"2025-12-02T09:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.668394 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.668459 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.668481 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.668511 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.668534 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:12Z","lastTransitionTime":"2025-12-02T09:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.771440 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.771530 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.771548 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.771570 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.771589 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:12Z","lastTransitionTime":"2025-12-02T09:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.874268 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.874345 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.874368 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.874396 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.874417 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:12Z","lastTransitionTime":"2025-12-02T09:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.976324 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.976388 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.976405 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.976430 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:12 crc kubenswrapper[4781]: I1202 09:21:12.976447 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:12Z","lastTransitionTime":"2025-12-02T09:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:13 crc kubenswrapper[4781]: I1202 09:21:13.079641 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:13 crc kubenswrapper[4781]: I1202 09:21:13.079701 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:13 crc kubenswrapper[4781]: I1202 09:21:13.079718 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:13 crc kubenswrapper[4781]: I1202 09:21:13.079753 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:13 crc kubenswrapper[4781]: I1202 09:21:13.079772 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:13Z","lastTransitionTime":"2025-12-02T09:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:13 crc kubenswrapper[4781]: I1202 09:21:13.182628 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:13 crc kubenswrapper[4781]: I1202 09:21:13.182656 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:13 crc kubenswrapper[4781]: I1202 09:21:13.182666 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:13 crc kubenswrapper[4781]: I1202 09:21:13.182678 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:13 crc kubenswrapper[4781]: I1202 09:21:13.182688 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:13Z","lastTransitionTime":"2025-12-02T09:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:13 crc kubenswrapper[4781]: I1202 09:21:13.285614 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:13 crc kubenswrapper[4781]: I1202 09:21:13.285644 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:13 crc kubenswrapper[4781]: I1202 09:21:13.285653 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:13 crc kubenswrapper[4781]: I1202 09:21:13.285665 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:13 crc kubenswrapper[4781]: I1202 09:21:13.285673 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:13Z","lastTransitionTime":"2025-12-02T09:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:13 crc kubenswrapper[4781]: I1202 09:21:13.388721 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:13 crc kubenswrapper[4781]: I1202 09:21:13.388762 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:13 crc kubenswrapper[4781]: I1202 09:21:13.388771 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:13 crc kubenswrapper[4781]: I1202 09:21:13.388807 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:13 crc kubenswrapper[4781]: I1202 09:21:13.388820 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:13Z","lastTransitionTime":"2025-12-02T09:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:13 crc kubenswrapper[4781]: I1202 09:21:13.491507 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:13 crc kubenswrapper[4781]: I1202 09:21:13.491665 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:13 crc kubenswrapper[4781]: I1202 09:21:13.491692 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:13 crc kubenswrapper[4781]: I1202 09:21:13.491724 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:13 crc kubenswrapper[4781]: I1202 09:21:13.491746 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:13Z","lastTransitionTime":"2025-12-02T09:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:13 crc kubenswrapper[4781]: I1202 09:21:13.498790 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:21:13 crc kubenswrapper[4781]: E1202 09:21:13.499021 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 09:21:13 crc kubenswrapper[4781]: I1202 09:21:13.594104 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:13 crc kubenswrapper[4781]: I1202 09:21:13.594173 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:13 crc kubenswrapper[4781]: I1202 09:21:13.594191 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:13 crc kubenswrapper[4781]: I1202 09:21:13.594220 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:13 crc kubenswrapper[4781]: I1202 09:21:13.594238 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:13Z","lastTransitionTime":"2025-12-02T09:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:13 crc kubenswrapper[4781]: I1202 09:21:13.697734 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:13 crc kubenswrapper[4781]: I1202 09:21:13.697805 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:13 crc kubenswrapper[4781]: I1202 09:21:13.697833 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:13 crc kubenswrapper[4781]: I1202 09:21:13.697869 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:13 crc kubenswrapper[4781]: I1202 09:21:13.697894 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:13Z","lastTransitionTime":"2025-12-02T09:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:13 crc kubenswrapper[4781]: I1202 09:21:13.800492 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:13 crc kubenswrapper[4781]: I1202 09:21:13.801029 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:13 crc kubenswrapper[4781]: I1202 09:21:13.801051 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:13 crc kubenswrapper[4781]: I1202 09:21:13.801074 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:13 crc kubenswrapper[4781]: I1202 09:21:13.801091 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:13Z","lastTransitionTime":"2025-12-02T09:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:13 crc kubenswrapper[4781]: I1202 09:21:13.904075 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:13 crc kubenswrapper[4781]: I1202 09:21:13.904134 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:13 crc kubenswrapper[4781]: I1202 09:21:13.904152 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:13 crc kubenswrapper[4781]: I1202 09:21:13.904176 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:13 crc kubenswrapper[4781]: I1202 09:21:13.904193 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:13Z","lastTransitionTime":"2025-12-02T09:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.006695 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.006758 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.006781 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.006807 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.006832 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:14Z","lastTransitionTime":"2025-12-02T09:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.064381 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bcdae8ff-3e82-4785-b958-a98717a14787-metrics-certs\") pod \"network-metrics-daemon-q792g\" (UID: \"bcdae8ff-3e82-4785-b958-a98717a14787\") " pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:21:14 crc kubenswrapper[4781]: E1202 09:21:14.064591 4781 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 09:21:14 crc kubenswrapper[4781]: E1202 09:21:14.064701 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcdae8ff-3e82-4785-b958-a98717a14787-metrics-certs podName:bcdae8ff-3e82-4785-b958-a98717a14787 nodeName:}" failed. No retries permitted until 2025-12-02 09:21:18.064636948 +0000 UTC m=+40.888510857 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bcdae8ff-3e82-4785-b958-a98717a14787-metrics-certs") pod "network-metrics-daemon-q792g" (UID: "bcdae8ff-3e82-4785-b958-a98717a14787") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.109383 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.109423 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.109432 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.109447 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.109456 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:14Z","lastTransitionTime":"2025-12-02T09:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.211449 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.211489 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.211517 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.211532 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.211542 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:14Z","lastTransitionTime":"2025-12-02T09:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.314146 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.314216 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.314235 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.314259 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.314289 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:14Z","lastTransitionTime":"2025-12-02T09:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.416870 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.416947 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.416960 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.416977 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.416990 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:14Z","lastTransitionTime":"2025-12-02T09:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.499595 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.499667 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 09:21:14 crc kubenswrapper[4781]: E1202 09:21:14.499769 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.499631 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:21:14 crc kubenswrapper[4781]: E1202 09:21:14.499991 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 09:21:14 crc kubenswrapper[4781]: E1202 09:21:14.500440 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q792g" podUID="bcdae8ff-3e82-4785-b958-a98717a14787" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.500574 4781 scope.go:117] "RemoveContainer" containerID="e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.519490 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.519555 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.519571 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.519593 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.519616 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:14Z","lastTransitionTime":"2025-12-02T09:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.621499 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.621534 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.621544 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.621557 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.621585 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:14Z","lastTransitionTime":"2025-12-02T09:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.724315 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.724363 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.724373 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.724385 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.724395 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:14Z","lastTransitionTime":"2025-12-02T09:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.733241 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.735096 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5356a78651fc6cb4214e10b8b25fd20d7f6acb739abdcf2e1de6dd666a31edaa"} Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.735548 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.759715 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa24d58e-64e7-4469-9611-3dab9c142594\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7af0caf9bd11c42c48bbf4dcd4e719840d2e2dbce55ec205dd2cc103b71772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7035e3038f33a299c00380bc79d4315ec612bd6f4009d55258f02b5c3ec70e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://274dd3f5c42e9217f2a35c99709567f617ef1def1cbd5110cfdc8b5e88bce7ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8beb17df1dddccb32b1d93f03fd20551048bc1d150615e18bf73c1d364d7bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1936331547e3941bdaae5c90f12874b6fda39f3b6b67f678156915473cc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:14Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.774733 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c464aa1e840c3a97b63b6c760fca6c21e7a182269a602a688c70525b4bbb39a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:14Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.793839 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a3a44e4812364947f0dd8eed3d3f7252d6e37e5057b9f180eca5b8afc472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9866ea1416094cbfe3f1e850261118f82b1d977589862e7f59e13b33fc24295a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:14Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.805607 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e10258da-dad3-4df8-82c2-9d9438493a3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb3264562e09d949d9dcd64e52ee1fc0825681adc0d91f25505fdefd0a807ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c19e02f947d0d75971d14d988b7c3119365b6ba5348290ee0e23e115e14780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pzntm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:14Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.815520 4781 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-ffk9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04fe11ee-efe6-4b10-a638-021e53367e2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6a45fdfe3335bc38eb30705eca6aaf887b5cd9fac235b5350832bff6df8cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46c6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ffk9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:14Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.826734 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.826770 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.826779 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.826793 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.826803 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:14Z","lastTransitionTime":"2025-12-02T09:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.834860 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba607959-82d4-440b-b830-95eb9584db6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08c7d87bc390c31adc6d7412985f5bbfe535ea6dccb586f9be2823a17bce3e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0090e6856a6d66e284f4f4eecb2d193720318f33a0260d470a50af72c84f6719\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0179b61af209d72b1e1a79835f2819b3af773ab99b0c431733beba8a4dcf2eeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apis
erver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5356a78651fc6cb4214e10b8b25fd20d7f6acb739abdcf2e1de6dd666a31edaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 09:20:56.232387 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 09:20:56.232507 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 09:20:56.233123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3642890392/tls.crt::/tmp/serving-cert-3642890392/tls.key\\\\\\\"\\\\nI1202 09:20:56.583994 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 09:20:56.586003 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 09:20:56.586021 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 09:20:56.586041 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 09:20:56.586046 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 09:20:56.590021 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 09:20:56.590050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590054 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 09:20:56.590065 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 09:20:56.590068 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 09:20:56.590073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 09:20:56.590248 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 09:20:56.592236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a99d096eb7cf9386a857100ae0faff8a3b17140eaada581e5511a88aadc0f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:14Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.847425 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kgbn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9440f972-ed59-4852-a180-3d5a2111f966\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eda6361552bc105ca2a6cc3444d96fbf79b10eff612e87aa22c1559fa8c90fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hxs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kgbn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:14Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.872960 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://506fd3b326fdaebe64874c208bb15c9cbf5aff36602f221f27f87fe0214ce6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c8c9c60e0354d49255e6b14da6808152f6528f0a51ce992709a7014b9f64d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5c85414a97917ad2db6270144c476b16b50ad3256bf6c694c228b8756c379dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6086fa7a75ccf19fcc7b89e83a2dbe12ed15c729bb8a4209c18acd54c2070a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865e78637e3692046c9c8a24f5238eb9636bc7e388cea6e0fec3c5c68d21cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://458448bab8d0e21295e859a857d3a3273cc4f430f2fc0fdfb3e10b9a9ab813ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb6d74bcbac3aadff08205392d325737c0898c1ec4a8165e3a3ff0b9e51223f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb6d74bcbac3aadff08205392d325737c0898c1ec4a8165e3a3ff0b9e51223f5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"message\\\":\\\"9:21:07.519021 6215 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1202 09:21:07.519076 6215 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1202 09:21:07.519053 6215 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 09:21:07.519095 6215 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1202 09:21:07.519185 6215 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 09:21:07.519206 6215 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 09:21:07.519268 6215 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 09:21:07.519291 6215 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 09:21:07.519319 6215 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 09:21:07.519340 6215 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 09:21:07.519369 6215 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1202 09:21:07.519393 6215 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 09:21:07.519396 6215 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 09:21:07.519408 6215 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 09:21:07.519457 6215 factory.go:656] Stopping watch factory\\\\nI1202 09:21:07.519478 6215 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-x5x7g_openshift-ovn-kubernetes(20ba2af9-1f67-4b6d-884a-666ef4f55bf3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cbd533b4320d178a3ee1bb3d666d5a1ad0a1d7e9f38e58735acac86c590726\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5x7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:14Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.887105 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k2h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be967da7-3e7f-47e6-9d54-408ae99531a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b30e09e5a7f3200cca709bf3437af0ce93cf039451c894bdfd813ae35d3c70c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzmcq
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://735c6d3ed169d7e11f8f6cc0d99fc415020928f852a554288c6c05f3dc99bb3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzmcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:21:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7k2h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:14Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.903536 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:14Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.921295 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:14Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.929125 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.929162 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.929172 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.929185 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.929195 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:14Z","lastTransitionTime":"2025-12-02T09:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.936210 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8157f9-9d69-41d0-8158-f5d166743724\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a36c467812da990b1678a8465bf25d97c4d8226e88ed27eea452b35a75c5dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c09074e0829703890129b7052956f56e1a93cad8d1bcf9e480b3a7277ae930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f99f962bf9ffb2475be15e4fe8d41c3fe3c870d092a50b96a19fe9c759462ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc9e488e47baf2860cd0133d661714b47495618ce410ee8ad0fa9292da3cfdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:14Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.950065 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:14Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.963199 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822d2ca8c4fb83c643f704e9acc8e4034235d2d1e4730a9495aa0a6a8df9979e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:14Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.976871 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dnkgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aa44429-873b-48f6-bc13-55745827d8fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1586e0f941724ff8afcd4030d43ebb517e0782ece160fe634ed828af68b1965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dnkgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:14Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:14 crc kubenswrapper[4781]: I1202 09:21:14.989314 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8b6p8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bc3d340494b10e89f11eb2dfc63584a9c5660dfe367792a3d422a28dbf5a2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxbk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8b6p8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:14Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:14.999979 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q792g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcdae8ff-3e82-4785-b958-a98717a14787\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vl4xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vl4xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:21:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q792g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:14Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:15 crc 
kubenswrapper[4781]: I1202 09:21:15.031899 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:15.031968 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:15.031980 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:15.031996 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:15.032008 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:15Z","lastTransitionTime":"2025-12-02T09:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:15.134763 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:15.134796 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:15.134804 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:15.134818 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:15.134827 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:15Z","lastTransitionTime":"2025-12-02T09:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:15.236665 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:15.236731 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:15.236740 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:15.236755 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:15.236764 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:15Z","lastTransitionTime":"2025-12-02T09:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:15.339552 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:15.339609 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:15.339621 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:15.339638 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:15.339650 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:15Z","lastTransitionTime":"2025-12-02T09:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:15.443077 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:15.443130 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:15.443146 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:15.443168 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:15.443183 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:15Z","lastTransitionTime":"2025-12-02T09:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:15.499311 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 02 09:21:15 crc kubenswrapper[4781]: E1202 09:21:15.499453 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
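The block above is the kubelet's readiness loop while no CNI configuration exists yet: each cycle records the three healthy resource events plus NodeNotReady, then re-asserts the Ready=False condition pointing at /etc/kubernetes/cni/net.d/. A minimal Go sketch of the kind of directory probe that message implies follows; it is an illustration, not the kubelet's actual implementation, and the accepted extension list is an assumption borrowed from common libcni convention.

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// hasCNIConfig reports whether dir holds at least one CNI network config.
// Extensions (.conf, .conflist, .json) are assumed, not taken from kubelet source.
func hasCNIConfig(dir string) (bool, error) {
	entries, err := os.ReadDir(dir)
	if err != nil {
		return false, err
	}
	for _, e := range entries {
		if e.IsDir() {
			continue
		}
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			return true, nil
		}
	}
	return false, nil
}

func main() {
	ok, err := hasCNIConfig("/etc/kubernetes/cni/net.d") // directory named in the condition message
	if err != nil {
		fmt.Println("cannot read CNI conf dir:", err)
		return
	}
	fmt.Println("CNI configuration present:", ok)
}

Until the network plugin (presumably ovnkube-node here, per the ovnkube-node pod later in this journal) writes its config into that directory, a probe like this keeps returning false and the condition keeps repeating.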
Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:15.545830 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:15.545866 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:15.545878 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:15.545894 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:15.545907 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:15Z","lastTransitionTime":"2025-12-02T09:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:15.648038 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:15.648078 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:15.648089 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:15.648103 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:15.648115 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:15Z","lastTransitionTime":"2025-12-02T09:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:15.750800 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:15.750851 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:15.750865 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:15.750884 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:15.750898 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:15Z","lastTransitionTime":"2025-12-02T09:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:15.854184 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:15.854243 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:15.854261 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:15.854287 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:15.854304 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:15Z","lastTransitionTime":"2025-12-02T09:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:15.957329 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:15.957388 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:15.957411 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:15.957441 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:15 crc kubenswrapper[4781]: I1202 09:21:15.957462 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:15Z","lastTransitionTime":"2025-12-02T09:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
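The same five-entry pattern repeats at roughly 100 ms intervals for as long as the condition persists. To quantify that churn from a saved journal, a rough sketch along these lines counts event recordings per type; kubelet.log is a placeholder for captured journalctl output.

package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

func main() {
	// kubelet.log is a placeholder for saved `journalctl -u kubelet` output.
	f, err := os.Open("kubelet.log")
	if err != nil {
		panic(err)
	}
	defer f.Close()

	re := regexp.MustCompile(`"Recording event message for node" node="crc" event="([A-Za-z]+)"`)
	counts := map[string]int{}

	sc := bufio.NewScanner(f)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // status-patch entries exceed the default 64 KiB token limit
	for sc.Scan() {
		// FindAllStringSubmatch tolerates multiple entries packed onto one physical line.
		for _, m := range re.FindAllStringSubmatch(sc.Text(), -1) {
			counts[m[1]]++
		}
	}
	if err := sc.Err(); err != nil {
		panic(err)
	}
	for event, n := range counts {
		fmt.Printf("%-26s %d\n", event, n)
	}
}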
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.059737 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.060175 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.060362 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.060517 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.060701 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:16Z","lastTransitionTime":"2025-12-02T09:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.164398 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.164450 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.164468 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.164491 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.164508 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:16Z","lastTransitionTime":"2025-12-02T09:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.268388 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.268452 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.268470 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.268494 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.268511 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:16Z","lastTransitionTime":"2025-12-02T09:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.371259 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.371302 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.371319 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.371342 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.371359 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:16Z","lastTransitionTime":"2025-12-02T09:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.473836 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.473890 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.473907 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.473959 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.473976 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:16Z","lastTransitionTime":"2025-12-02T09:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.498615 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.498659 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g"
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.498681 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 02 09:21:16 crc kubenswrapper[4781]: E1202 09:21:16.498791 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
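The patch bodies quoted in the status_manager.go:875 "Failed to update status for pod" entries earlier and below are escaped more than once (by the err=%q formatting and again by the journal), which makes them nearly unreadable. A small sketch for recovering the JSON, stripping one level of Go quoting per pass and then indenting; the embedded sample is a one-level fragment built from the uid of the network-metrics-daemon patch above, and deeper \\\" nesting needs repeated passes.

package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"strconv"
)

func main() {
	// One level of escaping, as it appears after the journal's own quoting
	// has been peeled off. Sample fragment only; paste a full patch body here.
	escaped := `"{\"metadata\":{\"uid\":\"bcdae8ff-3e82-4785-b958-a98717a14787\"}}"`

	unquoted, err := strconv.Unquote(escaped) // repeat for each remaining escape level
	if err != nil {
		panic(err)
	}

	var pretty bytes.Buffer
	if err := json.Indent(&pretty, []byte(unquoted), "", "  "); err != nil {
		panic(err)
	}
	fmt.Println(pretty.String())
}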
Dec 02 09:21:16 crc kubenswrapper[4781]: E1202 09:21:16.498895 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q792g" podUID="bcdae8ff-3e82-4785-b958-a98717a14787"
Dec 02 09:21:16 crc kubenswrapper[4781]: E1202 09:21:16.499053 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.577234 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.577271 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.577279 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.577292 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.577300 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:16Z","lastTransitionTime":"2025-12-02T09:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.679658 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.679712 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.679734 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.679759 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.679779 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:16Z","lastTransitionTime":"2025-12-02T09:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.782444 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.782517 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.782535 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.782562 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.782580 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:16Z","lastTransitionTime":"2025-12-02T09:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.884706 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.884850 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.884862 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.884879 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.884891 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:16Z","lastTransitionTime":"2025-12-02T09:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.986857 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.986891 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.986902 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.986915 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:16 crc kubenswrapper[4781]: I1202 09:21:16.986951 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:16Z","lastTransitionTime":"2025-12-02T09:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
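Every status-patch attempt in this journal is rejected for the same root cause: the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 presents a serving certificate that expired on 2025-08-24T17:21:41Z, while the node clock reads 2025-12-02. A diagnostic sketch that reads the certificate's validity window directly follows; InsecureSkipVerify is deliberate, since a verifying handshake against an expired certificate would fail before the certificate could be inspected.

package main

import (
	"crypto/tls"
	"fmt"
	"time"
)

func main() {
	// Endpoint taken from the webhook errors in this journal.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		fmt.Println("dial failed:", err)
		return
	}
	defer conn.Close()

	cert := conn.ConnectionState().PeerCertificates[0]
	fmt.Println("subject:  ", cert.Subject)
	fmt.Println("notBefore:", cert.NotBefore.Format(time.RFC3339))
	fmt.Println("notAfter: ", cert.NotAfter.Format(time.RFC3339))
	fmt.Println("expired:  ", time.Now().After(cert.NotAfter)) // the "current time ... is after ..." comparison from the log
}

The "has expired or is not yet valid" wording is the generic x509 verification message; the timestamps quoted in the errors show plain expiry.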
Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.089263 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.089295 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.089351 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.089366 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.089375 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:17Z","lastTransitionTime":"2025-12-02T09:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.191853 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.191892 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.191902 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.191917 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.191945 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:17Z","lastTransitionTime":"2025-12-02T09:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.293912 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.293993 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.294011 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.294031 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.294044 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:17Z","lastTransitionTime":"2025-12-02T09:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.395976 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.396020 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.396033 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.396050 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.396065 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:17Z","lastTransitionTime":"2025-12-02T09:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.498870 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 02 09:21:17 crc kubenswrapper[4781]: E1202 09:21:17.499042 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.499144 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.499210 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.499225 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.499245 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.499257 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:17Z","lastTransitionTime":"2025-12-02T09:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.515563 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e10258da-dad3-4df8-82c2-9d9438493a3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb3264562e09d949d9dcd64e52ee1fc0825681adc0d91f25505fdefd0a807ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c19e02f947d0d75971d14d988b7c3119365b6ba5348290ee0e23e115e14780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pzntm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:17Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.528948 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ffk9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04fe11ee-efe6-4b10-a638-021e53367e2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6a45fdfe3335bc38eb30705eca6aaf887b5cd9fac235b5350832bff6df8cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46c6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ffk9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:17Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.561946 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa24d58e-64e7-4469-9611-3dab9c142594\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7af0caf9bd11c42c48bbf4dcd4e719840d2e2dbce55ec205dd2cc103b71772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7035e3038f33a299c00380bc79d4315ec612bd6f4009d55258f02b5c3ec70e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://274dd3f5c42e9217f2a35c99709567f617ef1def1cbd5110cfdc8b5e88bce7ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8beb17df1dddccb32b1d93f03fd20551048bc1
d150615e18bf73c1d364d7bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1936331547e3941bdaae5c90f12874b6fda39f3b6b67f678156915473cc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:17Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.578215 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c464aa1e840c3a97b63b6c760fca6c21e7a182269a602a688c70525b4bbb39a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:17Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.592979 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a3a44e4812364947f0dd8eed3d3f7252d6e37e5057b9f180eca5b8afc472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9866ea1416094cbfe3f1e850261118f82b1d977589862e7f59e13b33fc24295a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-02T09:21:17Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.601733 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.601795 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.601812 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.601838 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.601855 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:17Z","lastTransitionTime":"2025-12-02T09:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.612184 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba607959-82d4-440b-b830-95eb9584db6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08c7d87bc390c31adc6d7412985f5bbfe535ea6dccb586f9be2823a17bce3e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0090e6856a6d66e284f4f4eecb2d193720318f33a0260d470a50af72c84f6719\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0179b61af209d72b1e1a79835f2819b3af773ab99b0c431733beba8a4dcf2eeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5356a78651fc6cb4214e10b8b25fd20d7f6acb739abdcf2e1de6dd666a31edaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 09:20:56.232387 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 09:20:56.232507 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 09:20:56.233123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3642890392/tls.crt::/tmp/serving-cert-3642890392/tls.key\\\\\\\"\\\\nI1202 09:20:56.583994 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 09:20:56.586003 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 09:20:56.586021 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 09:20:56.586041 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 09:20:56.586046 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 09:20:56.590021 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 09:20:56.590050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590054 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 09:20:56.590065 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 09:20:56.590068 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 09:20:56.590073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 09:20:56.590248 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 09:20:56.592236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a99d096eb7cf9386a857100ae0faff8a3b17140eaada581e5511a88aadc0f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:17Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.624420 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kgbn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9440f972-ed59-4852-a180-3d5a2111f966\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eda6361552bc105ca2a6cc3444d96fbf79b10eff612e87aa22c1559fa8c90fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hxs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kgbn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:17Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.648657 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://506fd3b326fdaebe64874c208bb15c9cbf5aff36602f221f27f87fe0214ce6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c8c9c60e0354d49255e6b14da6808152f6528f0a51ce992709a7014b9f64d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5c85414a97917ad2db6270144c476b16b50ad3256bf6c694c228b8756c379dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6086fa7a75ccf19fcc7b89e83a2dbe12ed15c729bb8a4209c18acd54c2070a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865e78637e3692046c9c8a24f5238eb9636bc7e388cea6e0fec3c5c68d21cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://458448bab8d0e21295e859a857d3a3273cc4f430f2fc0fdfb3e10b9a9ab813ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb6d74bcbac3aadff08205392d325737c0898c1ec4a8165e3a3ff0b9e51223f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb6d74bcbac3aadff08205392d325737c0898c1ec4a8165e3a3ff0b9e51223f5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"message\\\":\\\"9:21:07.519021 6215 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1202 09:21:07.519076 6215 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1202 09:21:07.519053 6215 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 09:21:07.519095 6215 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1202 09:21:07.519185 6215 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 09:21:07.519206 6215 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 09:21:07.519268 6215 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 09:21:07.519291 6215 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 09:21:07.519319 6215 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 09:21:07.519340 6215 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 09:21:07.519369 6215 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1202 09:21:07.519393 6215 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 09:21:07.519396 6215 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 09:21:07.519408 6215 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 09:21:07.519457 6215 factory.go:656] Stopping watch factory\\\\nI1202 09:21:07.519478 6215 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-x5x7g_openshift-ovn-kubernetes(20ba2af9-1f67-4b6d-884a-666ef4f55bf3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cbd533b4320d178a3ee1bb3d666d5a1ad0a1d7e9f38e58735acac86c590726\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5x7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:17Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.663106 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k2h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be967da7-3e7f-47e6-9d54-408ae99531a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b30e09e5a7f3200cca709bf3437af0ce93cf039451c894bdfd813ae35d3c70c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzmcq
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://735c6d3ed169d7e11f8f6cc0d99fc415020928f852a554288c6c05f3dc99bb3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzmcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:21:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7k2h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:17Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.675707 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:17Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.691507 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:17Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.703506 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.703514 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q792g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcdae8ff-3e82-4785-b958-a98717a14787\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vl4xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vl4xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:21:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q792g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:17Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.703584 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.703600 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.703621 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.703638 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:17Z","lastTransitionTime":"2025-12-02T09:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.720631 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8157f9-9d69-41d0-8158-f5d166743724\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a36c467812da990b1678a8465bf25d97c4d8226e88ed27eea452b35a75c5dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c09074e0829703890129b7052956f56e1a93cad8d1bcf9e480b3a7277ae930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f99f962bf9ffb2475be15e4fe8d41c3fe3c870d092a50b96a19fe9c759462ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc9e488e47baf2860cd0133d661714b47495618ce410ee8ad0fa9292da3cfdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:17Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.732103 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:17Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.742957 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822d2ca8c4fb83c643f704e9acc8e4034235d2d1e4730a9495aa0a6a8df9979e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:17Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.802146 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dnkgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aa44429-873b-48f6-bc13-55745827d8fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1586e0f941724ff8afcd4030d43ebb517e0782ece160fe634ed828af68b1965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dnkgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:17Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.805352 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.805389 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:17 crc 
kubenswrapper[4781]: I1202 09:21:17.805396 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.805412 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.805422 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:17Z","lastTransitionTime":"2025-12-02T09:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.814536 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8b6p8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bc3d340494b10e89f11eb2dfc63584a9c5660dfe367792a3d422a28dbf5a2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxbk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8b6p8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:17Z is after 2025-08-24T17:21:41Z"
Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.908291 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.908317 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.908325 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.908338 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:17 crc kubenswrapper[4781]: I1202 09:21:17.908350 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:17Z","lastTransitionTime":"2025-12-02T09:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.011282 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.011317 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.011325 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.011339 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.011347 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:18Z","lastTransitionTime":"2025-12-02T09:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.105407 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bcdae8ff-3e82-4785-b958-a98717a14787-metrics-certs\") pod \"network-metrics-daemon-q792g\" (UID: \"bcdae8ff-3e82-4785-b958-a98717a14787\") " pod="openshift-multus/network-metrics-daemon-q792g"
Dec 02 09:21:18 crc kubenswrapper[4781]: E1202 09:21:18.105624 4781 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 02 09:21:18 crc kubenswrapper[4781]: E1202 09:21:18.105733 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcdae8ff-3e82-4785-b958-a98717a14787-metrics-certs podName:bcdae8ff-3e82-4785-b958-a98717a14787 nodeName:}" failed. No retries permitted until 2025-12-02 09:21:26.105703818 +0000 UTC m=+48.929577737 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bcdae8ff-3e82-4785-b958-a98717a14787-metrics-certs") pod "network-metrics-daemon-q792g" (UID: "bcdae8ff-3e82-4785-b958-a98717a14787") : object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.114054 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.114156 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.114185 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.114214 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.114236 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:18Z","lastTransitionTime":"2025-12-02T09:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.217293 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.217364 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.217386 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.217419 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.217440 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:18Z","lastTransitionTime":"2025-12-02T09:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.320113 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.320155 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.320164 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.320181 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.320190 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:18Z","lastTransitionTime":"2025-12-02T09:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.422794 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.422846 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.422864 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.422886 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.422902 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:18Z","lastTransitionTime":"2025-12-02T09:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.499113 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.499146 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.499194 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g"
Dec 02 09:21:18 crc kubenswrapper[4781]: E1202 09:21:18.499262 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 02 09:21:18 crc kubenswrapper[4781]: E1202 09:21:18.499405 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q792g" podUID="bcdae8ff-3e82-4785-b958-a98717a14787"
Dec 02 09:21:18 crc kubenswrapper[4781]: E1202 09:21:18.499484 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.525353 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.525409 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.525427 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.525453 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.525471 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:18Z","lastTransitionTime":"2025-12-02T09:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.628764 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.628826 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.628843 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.628867 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.628884 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:18Z","lastTransitionTime":"2025-12-02T09:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.731216 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.731367 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.731408 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.731434 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.731454 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:18Z","lastTransitionTime":"2025-12-02T09:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.834135 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.834482 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.834627 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.834799 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.834964 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:18Z","lastTransitionTime":"2025-12-02T09:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.938474 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.938518 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.938529 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.938549 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:18 crc kubenswrapper[4781]: I1202 09:21:18.938561 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:18Z","lastTransitionTime":"2025-12-02T09:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.040453 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.040498 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.040516 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.040532 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.040545 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:19Z","lastTransitionTime":"2025-12-02T09:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.123318 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.123354 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.123364 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.123377 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.123389 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:19Z","lastTransitionTime":"2025-12-02T09:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Has your network provider started?"} Dec 02 09:21:19 crc kubenswrapper[4781]: E1202 09:21:19.137204 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a8aab02d-934e-4a61-8d03-a223ac62150b\\\",\\\"systemUUID\\\":\\\"c8f9d245-0a2e-447a-a09e-ff80f79ba02f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:19Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.142639 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.142674 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.142690 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.142706 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.142718 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:19Z","lastTransitionTime":"2025-12-02T09:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:19 crc kubenswrapper[4781]: E1202 09:21:19.161773 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a8aab02d-934e-4a61-8d03-a223ac62150b\\\",\\\"systemUUID\\\":\\\"c8f9d245-0a2e-447a-a09e-ff80f79ba02f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:19Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.165909 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.165962 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.165973 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.165987 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.165996 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:19Z","lastTransitionTime":"2025-12-02T09:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:19 crc kubenswrapper[4781]: E1202 09:21:19.177100 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a8aab02d-934e-4a61-8d03-a223ac62150b\\\",\\\"systemUUID\\\":\\\"c8f9d245-0a2e-447a-a09e-ff80f79ba02f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:19Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.180871 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.180941 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.180952 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.180965 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.180973 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:19Z","lastTransitionTime":"2025-12-02T09:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:19 crc kubenswrapper[4781]: E1202 09:21:19.193235 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a8aab02d-934e-4a61-8d03-a223ac62150b\\\",\\\"systemUUID\\\":\\\"c8f9d245-0a2e-447a-a09e-ff80f79ba02f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:19Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.196425 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.196458 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.196470 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.196484 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.196495 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:19Z","lastTransitionTime":"2025-12-02T09:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:19 crc kubenswrapper[4781]: E1202 09:21:19.208622 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a8aab02d-934e-4a61-8d03-a223ac62150b\\\",\\\"systemUUID\\\":\\\"c8f9d245-0a2e-447a-a09e-ff80f79ba02f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:19Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:19 crc kubenswrapper[4781]: E1202 09:21:19.208763 4781 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.210364 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
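
The retry burst above bundles two independent failures. The status patches are rejected upstream of the API object: the node.network-node-identity.openshift.io webhook at 127.0.0.1:9743 presents a serving certificate whose notAfter (2025-08-24T17:21:41Z) lies a little over three months before the node's clock (2025-12-02T09:21:19Z). Separately, the Ready condition stays False only because no CNI configuration exists yet in /etc/kubernetes/cni/net.d/. Both timestamps are embedded verbatim in the x509 error string, so the expiry can be confirmed straight from the journal. A minimal triage sketch, assuming Python 3 and only the standard library (the helper names, regex, and sample line are illustrative, not kubelet code):

    import re
    from datetime import datetime, timezone

    # Go's crypto/x509 verifier embeds "current time X is after Y" in the
    # error string the kubelet logs; pull out both timestamps.
    PAIR = re.compile(
        r"current time (\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}Z)"
        r" is after (\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}Z)"
    )

    def parse_utc(stamp):
        # The log uses second-resolution UTC timestamps with a trailing Z.
        return datetime.strptime(stamp, "%Y-%m-%dT%H:%M:%SZ").replace(tzinfo=timezone.utc)

    def cert_overshoot(line):
        """Return (node_clock, not_after, node_clock - not_after), or None
        if the line carries no x509 expiry pair."""
        m = PAIR.search(line)
        if m is None:
            return None
        node_clock, not_after = (parse_utc(s) for s in m.groups())
        return node_clock, not_after, node_clock - not_after

    # Sample taken verbatim from the x509 tail of the errors above.
    sample = ("tls: failed to verify certificate: x509: certificate has expired "
              "or is not yet valid: current time 2025-12-02T09:21:19Z is after "
              "2025-08-24T17:21:41Z")
    node_clock, not_after, overshoot = cert_overshoot(sample)
    print(f"webhook certificate expired {overshoot} before the node clock "
          f"(notAfter={not_after.isoformat()}, now={node_clock.isoformat()})")

Against the sample line this reports an overshoot of 99 days, 15:59:38, which is why every patch attempt in the burst fails identically until the kubelet gives up; the NotReady condition, by contrast, will persist until a CNI config appears under /etc/kubernetes/cni/net.d/, regardless of the certificate.
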
event="NodeHasSufficientMemory" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.210389 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.210398 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.210409 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.210418 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:19Z","lastTransitionTime":"2025-12-02T09:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.313318 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.313392 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.313416 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.313445 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.313469 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:19Z","lastTransitionTime":"2025-12-02T09:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.416098 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.416143 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.416154 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.416169 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.416180 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:19Z","lastTransitionTime":"2025-12-02T09:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.499169 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:21:19 crc kubenswrapper[4781]: E1202 09:21:19.499414 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.518814 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.518897 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.518952 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.518985 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.519006 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:19Z","lastTransitionTime":"2025-12-02T09:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.621661 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.621691 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.621701 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.621717 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.621727 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:19Z","lastTransitionTime":"2025-12-02T09:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.724243 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.724347 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.724376 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.724411 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.724435 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:19Z","lastTransitionTime":"2025-12-02T09:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.826945 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.826985 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.826996 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.827011 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.827023 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:19Z","lastTransitionTime":"2025-12-02T09:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.929235 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.929276 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.929311 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.929330 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:19 crc kubenswrapper[4781]: I1202 09:21:19.929340 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:19Z","lastTransitionTime":"2025-12-02T09:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.033112 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.033188 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.033209 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.033238 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.033256 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:20Z","lastTransitionTime":"2025-12-02T09:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.137290 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.137359 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.137378 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.137411 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.137433 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:20Z","lastTransitionTime":"2025-12-02T09:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.239519 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.239584 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.239601 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.239625 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.239642 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:20Z","lastTransitionTime":"2025-12-02T09:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.342502 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.342567 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.342585 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.342606 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.342623 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:20Z","lastTransitionTime":"2025-12-02T09:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.445637 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.445695 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.445712 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.445738 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.445756 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:20Z","lastTransitionTime":"2025-12-02T09:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.499014 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.499014 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.499129 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 09:21:20 crc kubenswrapper[4781]: E1202 09:21:20.499176 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q792g" podUID="bcdae8ff-3e82-4785-b958-a98717a14787" Dec 02 09:21:20 crc kubenswrapper[4781]: E1202 09:21:20.499464 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 09:21:20 crc kubenswrapper[4781]: E1202 09:21:20.499349 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.548089 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.548134 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.548147 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.548163 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.548176 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:20Z","lastTransitionTime":"2025-12-02T09:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.651415 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.651476 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.651498 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.651526 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.651548 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:20Z","lastTransitionTime":"2025-12-02T09:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.753973 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.754032 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.754048 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.754071 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.754087 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:20Z","lastTransitionTime":"2025-12-02T09:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.861993 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.862127 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.862153 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.862185 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.862206 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:20Z","lastTransitionTime":"2025-12-02T09:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.965017 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.965057 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.965066 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.965082 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:20 crc kubenswrapper[4781]: I1202 09:21:20.965091 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:20Z","lastTransitionTime":"2025-12-02T09:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.067799 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.067835 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.067846 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.067877 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.067890 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:21Z","lastTransitionTime":"2025-12-02T09:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.171202 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.171519 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.171609 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.171724 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.171946 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:21Z","lastTransitionTime":"2025-12-02T09:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.275431 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.275487 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.275503 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.275526 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.275543 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:21Z","lastTransitionTime":"2025-12-02T09:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.378351 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.378413 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.378425 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.378438 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.378447 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:21Z","lastTransitionTime":"2025-12-02T09:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.481401 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.481466 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.481484 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.481509 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.481527 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:21Z","lastTransitionTime":"2025-12-02T09:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.498702 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:21:21 crc kubenswrapper[4781]: E1202 09:21:21.499320 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.499799 4781 scope.go:117] "RemoveContainer" containerID="eb6d74bcbac3aadff08205392d325737c0898c1ec4a8165e3a3ff0b9e51223f5" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.583513 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.583558 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.583569 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.583593 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.583606 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:21Z","lastTransitionTime":"2025-12-02T09:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.686051 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.686093 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.686104 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.686121 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.686133 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:21Z","lastTransitionTime":"2025-12-02T09:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.762272 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5x7g_20ba2af9-1f67-4b6d-884a-666ef4f55bf3/ovnkube-controller/1.log" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.764797 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" event={"ID":"20ba2af9-1f67-4b6d-884a-666ef4f55bf3","Type":"ContainerStarted","Data":"c16ede9db49d80344fd2b60c39bf00bcf9dbeabd8918656a55230925c6a6cb60"} Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.765344 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.781167 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:21Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.790150 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.790219 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.790235 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.790254 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.790268 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:21Z","lastTransitionTime":"2025-12-02T09:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.793238 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:21Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.804020 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8b6p8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bc3d340494b10e89f11eb2dfc63584a9c5660dfe367792a3d422a28dbf5a2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxbk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8b6p8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:21Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.812448 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q792g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcdae8ff-3e82-4785-b958-a98717a14787\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vl4xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vl4xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:21:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q792g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:21Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:21 crc 
kubenswrapper[4781]: I1202 09:21:21.823965 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8157f9-9d69-41d0-8158-f5d166743724\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a36c467812da990b1678a8465bf25d97c4d8226e88ed27eea452b35a75c5dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c09074e0829703890129b7052956f56e1a93cad8d1bcf9e480b3a7277ae930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f99f962bf9ffb2475be15e4fe8d41c3fe3c870d092a50b96a19fe9c759462ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\
":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc9e488e47baf2860cd0133d661714b47495618ce410ee8ad0fa9292da3cfdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:21Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.835170 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:21Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.847377 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822d2ca8c4fb83c643f704e9acc8e4034235d2d1e4730a9495aa0a6a8df9979e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:21Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.863787 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dnkgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aa44429-873b-48f6-bc13-55745827d8fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1586e0f941724ff8afcd4030d43ebb517e0782ece160fe634ed828af68b1965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dnkgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:21Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.880217 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a3a44e4812364947f0dd8eed3d3f7252d6e37e5057b9f180eca5b8afc472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9866ea1416094cbfe3f1e850261118f82b1d977589862e7f59e13b33fc24295a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:21Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.892818 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.892870 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.892881 4781 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.892915 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.892943 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:21Z","lastTransitionTime":"2025-12-02T09:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.893852 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e10258da-dad3-4df8-82c2-9d9438493a3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb3264562e09d949d9dcd64e52ee1fc0825681adc0d91f25505fdefd0a807ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c19e02f947d0d75971d14d988b7c3119365b6ba5348290ee0e23e115e14780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pzntm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:21Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.909533 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ffk9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04fe11ee-efe6-4b10-a638-021e53367e2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6a45fdfe3335bc38eb30705eca6aaf887b5cd9fac235b5350832bff6df8cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46c6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ffk9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:21Z is after 
2025-08-24T17:21:41Z" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.939104 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa24d58e-64e7-4469-9611-3dab9c142594\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7af0caf9bd11c42c48bbf4dcd4e719840d2e2dbce55ec205dd2cc103b71772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7035e3038f33a299c00380bc79d4315ec612bd6f4009d55258f02b5c3ec70e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://274dd3f5c42e9217f2a35c99709567f617ef1def1cbd5110cfdc8b5e88bce7ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/
etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8beb17df1dddccb32b1d93f03fd20551048bc1d150615e18bf73c1d364d7bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1936331547e3941bdaae5c90f12874b6fda39f3b6b67f678156915473cc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025
-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:21Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.957314 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c464aa1e840c3a97b63b6c760fca6c21e7a182269a602a688c70525b4bbb39a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:21Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.973791 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba607959-82d4-440b-b830-95eb9584db6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08c7d87bc390c31adc6d7412985f5bbfe535ea6dccb586f9be2823a17bce3e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0090e6856a6d66e284f4f4eecb2d193720318f33a0260d470a50af72c84f6719\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0179b61af209d72b1e1a79835f2819b3af773ab99b0c431733beba8a4dcf2eeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5356a78651fc6cb4214e10b8b25fd20d7f6acb739abdcf2e1de6dd666a31edaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 09:20:56.232387 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 09:20:56.232507 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 09:20:56.233123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3642890392/tls.crt::/tmp/serving-cert-3642890392/tls.key\\\\\\\"\\\\nI1202 09:20:56.583994 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 09:20:56.586003 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 09:20:56.586021 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 09:20:56.586041 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 09:20:56.586046 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 09:20:56.590021 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 09:20:56.590050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590054 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 09:20:56.590065 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 09:20:56.590068 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 09:20:56.590073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 09:20:56.590248 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 09:20:56.592236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a99d096eb7cf9386a857100ae0faff8a3b17140eaada581e5511a88aadc0f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:21Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.987023 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kgbn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9440f972-ed59-4852-a180-3d5a2111f966\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eda6361552bc105ca2a6cc3444d96fbf79b10eff612e87aa22c1559fa8c90fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hxs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kgbn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:21Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.995400 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.995434 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.995442 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.995458 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:21 crc kubenswrapper[4781]: I1202 09:21:21.995474 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:21Z","lastTransitionTime":"2025-12-02T09:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.003323 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://506fd3b326fdaebe64874c208bb15c9cbf5aff36602f221f27f87fe0214ce6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c8c9c60e0354d49255e6b14da6808152f6528f0a51ce992709a7014b9f64d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5c85414a97917ad2db6270144c476b16b50ad3256bf6c694c228b8756c379dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6086fa7a75ccf19fcc7b89e83a2dbe12ed15c729bb8a4209c18acd54c2070a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865e78637e3692046c9c8a24f5238eb9636bc7e388cea6e0fec3c5c68d21cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://458448bab8d0e21295e859a857d3a3273cc4f430f2fc0fdfb3e10b9a9ab813ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574532
65a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16ede9db49d80344fd2b60c39bf00bcf9dbeabd8918656a55230925c6a6cb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb6d74bcbac3aadff08205392d325737c0898c1ec4a8165e3a3ff0b9e51223f5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"message\\\":\\\"9:21:07.519021 6215 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1202 09:21:07.519076 6215 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1202 09:21:07.519053 6215 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 09:21:07.519095 6215 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1202 09:21:07.519185 6215 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 09:21:07.519206 6215 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 09:21:07.519268 6215 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 09:21:07.519291 6215 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 09:21:07.519319 6215 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 09:21:07.519340 6215 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 09:21:07.519369 6215 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1202 09:21:07.519393 6215 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 09:21:07.519396 6215 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 09:21:07.519408 6215 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 09:21:07.519457 6215 factory.go:656] Stopping watch factory\\\\nI1202 09:21:07.519478 6215 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cbd533b4320d178a3ee1bb3d666d5a1ad0a1d7e9f38e58735acac86c590726\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5x7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:22Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.013617 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k2h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be967da7-3e7f-47e6-9d54-408ae99531a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b30e09e5a7f3200cca709bf3437af0ce93cf039451c894bdfd813ae35d3c70c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzmcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://735c6d3ed169d7e11f8f6cc0d99fc415020928f852a554288c6c05f3dc99bb3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzmcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:21:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7k2h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:22Z is after 2025-08-24T17:21:41Z" Dec 02 
09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.097959 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.098019 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.098040 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.098062 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.098078 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:22Z","lastTransitionTime":"2025-12-02T09:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.200452 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.200760 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.200774 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.200791 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.200822 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:22Z","lastTransitionTime":"2025-12-02T09:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.306502 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.306561 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.306573 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.306589 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.306601 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:22Z","lastTransitionTime":"2025-12-02T09:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.408636 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.408667 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.408678 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.408691 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.408700 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:22Z","lastTransitionTime":"2025-12-02T09:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.498898 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.498966 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.499022 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:21:22 crc kubenswrapper[4781]: E1202 09:21:22.499104 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 09:21:22 crc kubenswrapper[4781]: E1202 09:21:22.499217 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 09:21:22 crc kubenswrapper[4781]: E1202 09:21:22.499303 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q792g" podUID="bcdae8ff-3e82-4785-b958-a98717a14787" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.511524 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.511564 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.511572 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.511589 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.511599 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:22Z","lastTransitionTime":"2025-12-02T09:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.614227 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.614308 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.614330 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.614360 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.614382 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:22Z","lastTransitionTime":"2025-12-02T09:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.717763 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.717816 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.717830 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.717850 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.717866 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:22Z","lastTransitionTime":"2025-12-02T09:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.771489 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5x7g_20ba2af9-1f67-4b6d-884a-666ef4f55bf3/ovnkube-controller/2.log" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.772389 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5x7g_20ba2af9-1f67-4b6d-884a-666ef4f55bf3/ovnkube-controller/1.log" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.776202 4781 generic.go:334] "Generic (PLEG): container finished" podID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerID="c16ede9db49d80344fd2b60c39bf00bcf9dbeabd8918656a55230925c6a6cb60" exitCode=1 Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.776248 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" event={"ID":"20ba2af9-1f67-4b6d-884a-666ef4f55bf3","Type":"ContainerDied","Data":"c16ede9db49d80344fd2b60c39bf00bcf9dbeabd8918656a55230925c6a6cb60"} Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.776284 4781 scope.go:117] "RemoveContainer" containerID="eb6d74bcbac3aadff08205392d325737c0898c1ec4a8165e3a3ff0b9e51223f5" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.777553 4781 scope.go:117] "RemoveContainer" containerID="c16ede9db49d80344fd2b60c39bf00bcf9dbeabd8918656a55230925c6a6cb60" Dec 02 09:21:22 crc kubenswrapper[4781]: E1202 09:21:22.777796 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-x5x7g_openshift-ovn-kubernetes(20ba2af9-1f67-4b6d-884a-666ef4f55bf3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.800105 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822d2ca8c4fb83c643f704e9acc8e4034235d2d1e4730a9495aa0a6a8df9979e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:22Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.820321 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dnkgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aa44429-873b-48f6-bc13-55745827d8fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1586e0f941724ff8afcd4030d43ebb517e0782ece160fe634ed828af68b1965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dnkgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:22Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.822611 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.823141 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:22 crc 
kubenswrapper[4781]: I1202 09:21:22.823168 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.823197 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.823222 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:22Z","lastTransitionTime":"2025-12-02T09:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.836022 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8b6p8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bc3d340494b10e89f11eb2dfc63584a9c5660dfe367792a3d422a28dbf5a2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"n
ame\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxbk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8b6p8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:22Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.850264 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q792g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcdae8ff-3e82-4785-b958-a98717a14787\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vl4xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vl4xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:21:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q792g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:22Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.864822 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8157f9-9d69-41d0-8158-f5d166743724\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a36c467812da990b1678a8465bf25d97c4d8226e88ed27eea452b35a75c5dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c09074e0829703890129b7052956f56e1a93cad8d1bcf9e480b3a7277ae930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f99f962bf9ffb2475be15e4fe8d41c3fe3c870d092a50b96a19fe9c759462ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc9e488e47baf2860cd0133d661714b47495618ce410ee8ad0fa9292da3cfdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:22Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.886431 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:22Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.911769 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c464aa1e840c3a97b63b6c760fca6c21e7a182269a602a688c70525b4bbb39a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:22Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.926146 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.926185 4781 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.926195 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.926209 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.926217 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:22Z","lastTransitionTime":"2025-12-02T09:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.929173 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a3a44e4812364947f0dd8eed3d3f7252d6e37e5057b9f180eca5b8afc472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9866ea1416094cbfe3f1e850261118f82b1d977589862e7f59e13b33fc24295a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:22Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.938872 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e10258da-dad3-4df8-82c2-9d9438493a3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb3264562e09d949d9dcd64e52ee1fc0825681adc0d91f25505fdefd0a807ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c19e02f947d0d75971d14d988b7c3119365b6ba5348290ee0e23e115e14780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pzntm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:22Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.948477 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ffk9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04fe11ee-efe6-4b10-a638-021e53367e2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6a45fdfe3335bc38eb30705eca6aaf887b5cd9fac235b5350832bff6df8cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46c6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ffk9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:22Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.964241 
4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa24d58e-64e7-4469-9611-3dab9c142594\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7af0caf9bd11c42c48bbf4dcd4e719840d2e2dbce55ec205dd2cc103b71772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7035e3038f33a299c00380bc79d4315ec612bd6f4009d55258f02b5c3ec70e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://274dd3f5c42e9217f2a35c99709567f617ef1def1cbd5110cfdc8b5e88bce7ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8beb17df1dddccb32b1d93f03fd20551048bc1d150615e18bf73c1d364d7bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1936331547e3941bdaae5c90f12874b6fda39f3b6b67f678156915473cc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-0
2T09:20:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:22Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.979081 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://506fd3b326fdaebe64874c208bb15c9cbf5aff36602f221f27f87fe0214ce6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c8c9c60e0354d49255e6b14da6808152f6528f0a51ce992709a7014b9f64d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5c85414a97917ad2db6270144c476b16b50ad3256bf6c694c228b8756c379dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6086fa7a75ccf19fcc7b89e83a2dbe12ed15c729bb8a4209c18acd54c2070a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865e78637e3692046c9c8a24f5238eb9636bc7e388cea6e0fec3c5c68d21cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://458448bab8d0e21295e859a857d3a3273cc4f430f2fc0fdfb3e10b9a9ab813ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16ede9db49d80344fd2b60c39bf00bcf9dbeabd
8918656a55230925c6a6cb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb6d74bcbac3aadff08205392d325737c0898c1ec4a8165e3a3ff0b9e51223f5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"message\\\":\\\"9:21:07.519021 6215 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1202 09:21:07.519076 6215 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1202 09:21:07.519053 6215 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 09:21:07.519095 6215 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1202 09:21:07.519185 6215 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 09:21:07.519206 6215 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 09:21:07.519268 6215 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 09:21:07.519291 6215 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 09:21:07.519319 6215 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 09:21:07.519340 6215 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 09:21:07.519369 6215 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1202 09:21:07.519393 6215 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 09:21:07.519396 6215 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 09:21:07.519408 6215 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 09:21:07.519457 6215 factory.go:656] Stopping watch factory\\\\nI1202 09:21:07.519478 6215 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16ede9db49d80344fd2b60c39bf00bcf9dbeabd8918656a55230925c6a6cb60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T09:21:22Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 09:21:22.385723 6456 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 09:21:22.385745 6456 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 09:21:22.385782 6456 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 09:21:22.385800 6456 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1202 09:21:22.385814 6456 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 09:21:22.385845 6456 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 09:21:22.385855 6456 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 09:21:22.385863 6456 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 09:21:22.385868 6456 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1202 09:21:22.385869 6456 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 09:21:22.385878 6456 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 09:21:22.385879 6456 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 09:21:22.385889 6456 handler.go:208] Removed *v1.Namespace 
event handler 5\\\\nI1202 09:21:22.385894 6456 factory.go:656] Stopping watch factory\\\\nI1202 09:21:22.385908 6456 ovnkube.go:599] Stopped ovnkube\\\\nI1202 09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cbd533b4320d178a3ee1bb3d666d5a1ad0a1d7e9f38e58735acac86c590726\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b
7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5x7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:22Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:22 crc kubenswrapper[4781]: I1202 09:21:22.988498 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k2h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be967da7-3e7f-47e6-9d54-408ae99531a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b30e09e5a7f3200cca709bf3437af0ce93cf039451c894bdfd813ae35d3c70c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzmcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://735c6d3ed169d7e11f8f6cc0d99fc415020928f852a554288c6c05f3dc99bb3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzmcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:21:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7k2h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:22Z is after 2025-08-24T17:21:41Z" Dec 02 
09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.005944 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba607959-82d4-440b-b830-95eb9584db6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08c7d87bc390c31adc6d7412985f5bbfe535ea6dccb586f9be2823a17bce3e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0090e6856a6d66e284f4f4eecb2d193720318f33a0260d470a50af72c84f6719\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0179b61af209d72b1e1a79835f2819b3af773ab99b0c431733beba8a4dcf2eeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5356a78651fc6cb4214e10b8b25fd20d7f6acb739abdcf2e1de6dd666a31edaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 09:20:56.232387 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 09:20:56.232507 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 09:20:56.233123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3642890392/tls.crt::/tmp/serving-cert-3642890392/tls.key\\\\\\\"\\\\nI1202 09:20:56.583994 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 09:20:56.586003 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 09:20:56.586021 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 09:20:56.586041 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 09:20:56.586046 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 09:20:56.590021 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 09:20:56.590050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590054 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 09:20:56.590065 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 09:20:56.590068 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 09:20:56.590073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 09:20:56.590248 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 09:20:56.592236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a99d096eb7cf9386a857100ae0faff8a3b17140eaada581e5511a88aadc0f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:23Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.015877 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kgbn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9440f972-ed59-4852-a180-3d5a2111f966\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eda6361552bc105ca2a6cc3444d96fbf79b10eff612e87aa22c1559fa8c90fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hxs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kgbn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:23Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.028405 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.028687 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.028909 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.029149 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.029377 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:23Z","lastTransitionTime":"2025-12-02T09:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.032166 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:23Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.045199 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:23Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.132757 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.132883 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.132979 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.133006 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.133023 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:23Z","lastTransitionTime":"2025-12-02T09:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.235836 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.235906 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.235958 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.235984 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.236002 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:23Z","lastTransitionTime":"2025-12-02T09:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.339030 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.339098 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.339115 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.339139 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.339157 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:23Z","lastTransitionTime":"2025-12-02T09:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.441237 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.441291 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.441308 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.441331 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.441348 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:23Z","lastTransitionTime":"2025-12-02T09:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.499298 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:21:23 crc kubenswrapper[4781]: E1202 09:21:23.499489 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.544886 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.544977 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.544995 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.545022 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.545039 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:23Z","lastTransitionTime":"2025-12-02T09:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.648384 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.648439 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.648456 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.648479 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.648496 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:23Z","lastTransitionTime":"2025-12-02T09:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.751215 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.751267 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.751284 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.751309 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.751328 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:23Z","lastTransitionTime":"2025-12-02T09:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.782245 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5x7g_20ba2af9-1f67-4b6d-884a-666ef4f55bf3/ovnkube-controller/2.log" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.789087 4781 scope.go:117] "RemoveContainer" containerID="c16ede9db49d80344fd2b60c39bf00bcf9dbeabd8918656a55230925c6a6cb60" Dec 02 09:21:23 crc kubenswrapper[4781]: E1202 09:21:23.789353 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-x5x7g_openshift-ovn-kubernetes(20ba2af9-1f67-4b6d-884a-666ef4f55bf3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.813425 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8157f9-9d69-41d0-8158-f5d166743724\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a36c467812da990b1678a8465bf25d97c4d8226e88ed27eea452b35a75c5dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c09074e0829703890129b7052956f56e1a93cad8d1bcf9e480b3a7277ae930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f99f962bf9ffb2475be15e4fe8d41c3fe3c870d092a50b96a19fe9c759462ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc9e488e47baf2860cd0133d661714b47495618ce410ee8ad0fa9292da3cfdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:23Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.832454 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:23Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.849225 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822d2ca8c4fb83c643f704e9acc8e4034235d2d1e4730a9495aa0a6a8df9979e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:23Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.854532 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.854570 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.854579 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.854599 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.854613 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:23Z","lastTransitionTime":"2025-12-02T09:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.874184 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dnkgc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aa44429-873b-48f6-bc13-55745827d8fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1586e0f941724ff8afcd4030d43ebb517e0782ece160fe634ed828af68b1965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":
\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e98
75055a0ccdd4a00355c27ab2e91a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dnkgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:23Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.891777 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8b6p8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bc3d340494b10e89f11eb2dfc63584a9c5660dfe367792a3d422a28dbf5a2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxbk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8b6p8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:23Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.904055 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q792g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcdae8ff-3e82-4785-b958-a98717a14787\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vl4xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vl4xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:21:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q792g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:23Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.925283 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa24d58e-64e7-4469-9611-3dab9c142594\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7af0caf9bd11c42c48bbf4dcd4e719840d2e2dbce55ec205dd2cc103b71772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7035e3038f33a299c00380bc79d4315ec612bd6f4009d55258f02b5c3ec70e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://274dd3f5c42e9217f2a35c99709567f617ef1def1cbd5110cfdc8b5e88bce7ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8beb17df1dddccb32b1d93f03fd20551048bc1
d150615e18bf73c1d364d7bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1936331547e3941bdaae5c90f12874b6fda39f3b6b67f678156915473cc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:23Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.940530 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c464aa1e840c3a97b63b6c760fca6c21e7a182269a602a688c70525b4bbb39a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:23Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.957251 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.957325 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.957338 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.957359 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.957372 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:23Z","lastTransitionTime":"2025-12-02T09:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.958295 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a3a44e4812364947f0dd8eed3d3f7252d6e37e5057b9f180eca5b8afc472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9866ea1416094cbfe3f1e850261118f82b1d977589862e7f59e13b33fc24295a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574532
65a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:23Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.971799 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e10258da-dad3-4df8-82c2-9d9438493a3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb3264562e09d949d9dcd64e52ee1fc0825681adc0d91f25505fdefd0a807ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c19e02f947d0d75971d14d988b7c3119365b6ba5348290ee0e23e115e14780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba9
6e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pzntm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:23Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:23 crc kubenswrapper[4781]: I1202 09:21:23.984419 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ffk9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04fe11ee-efe6-4b10-a638-021e53367e2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6a45fdfe3335bc38eb30705eca6aaf887b5cd9fac235b5350832bff6df8cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46c6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\
\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ffk9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:23Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.006813 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba607959-82d4-440b-b830-95eb9584db6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08c7d87bc390c31adc6d7412985f5bbfe535ea6dccb586f9be2823a17bce3e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0090e6856a6d66e284f4f4eecb2d193720318f33a0260d470a50af72c84f6719\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0
179b61af209d72b1e1a79835f2819b3af773ab99b0c431733beba8a4dcf2eeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5356a78651fc6cb4214e10b8b25fd20d7f6acb739abdcf2e1de6dd666a31edaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 09:20:56.232387 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 09:20:56.232507 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 09:20:56.233123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3642890392/tls.crt::/tmp/serving-cert-3642890392/tls.key\\\\\\\"\\\\nI1202 09:20:56.583994 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 09:20:56.586003 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 09:20:56.586021 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 09:20:56.586041 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 09:20:56.586046 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 09:20:56.590021 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 09:20:56.590050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590054 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 09:20:56.590065 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 09:20:56.590068 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 09:20:56.590073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 09:20:56.590248 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 09:20:56.592236 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a99d096eb7cf9386a857100ae0faff8a3b17140eaada581e5511a88aadc0f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:24Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.019594 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kgbn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9440f972-ed59-4852-a180-3d5a2111f966\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eda6361552bc105ca2a6cc3444d96fbf79b10eff612e87aa22c1559fa8c90fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hxs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kgbn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:24Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.048812 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://506fd3b326fdaebe64874c208bb15c9cbf5aff36602f221f27f87fe0214ce6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c8c9c60e0354d49255e6b14da6808152f6528f0a51ce992709a7014b9f64d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5c85414a97917ad2db6270144c476b16b50ad3256bf6c694c228b8756c379dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6086fa7a75ccf19fcc7b89e83a2dbe12ed15c729bb8a4209c18acd54c2070a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865e78637e3692046c9c8a24f5238eb9636bc7e388cea6e0fec3c5c68d21cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://458448bab8d0e21295e859a857d3a3273cc4f430f2fc0fdfb3e10b9a9ab813ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16ede9db49d80344fd2b60c39bf00bcf9dbeabd8918656a55230925c6a6cb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16ede9db49d80344fd2b60c39bf00bcf9dbeabd8918656a55230925c6a6cb60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T09:21:22Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 09:21:22.385723 6456 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 09:21:22.385745 6456 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 09:21:22.385782 6456 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 09:21:22.385800 6456 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1202 09:21:22.385814 6456 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 09:21:22.385845 6456 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 09:21:22.385855 6456 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 09:21:22.385863 6456 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 09:21:22.385868 6456 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1202 09:21:22.385869 6456 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 09:21:22.385878 6456 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 09:21:22.385879 6456 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 09:21:22.385889 6456 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 09:21:22.385894 6456 factory.go:656] Stopping watch factory\\\\nI1202 09:21:22.385908 6456 ovnkube.go:599] Stopped ovnkube\\\\nI1202 09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-x5x7g_openshift-ovn-kubernetes(20ba2af9-1f67-4b6d-884a-666ef4f55bf3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cbd533b4320d178a3ee1bb3d666d5a1ad0a1d7e9f38e58735acac86c590726\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5x7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:24Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.060801 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.060851 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.060870 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.060895 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.060915 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:24Z","lastTransitionTime":"2025-12-02T09:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.067896 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k2h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be967da7-3e7f-47e6-9d54-408ae99531a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b30e09e5a7f3200cca709bf3437af0ce93cf039451c894bdfd813ae35d3c70c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzmcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://735c6d3ed169d7e11f8f6cc0d99fc415020928f852a554288c6c05f3dc99bb3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzmcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:21:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7k2h5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:24Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.087499 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:24Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.102565 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:24Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.163389 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.163432 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.163440 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.163452 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.163462 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:24Z","lastTransitionTime":"2025-12-02T09:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.266182 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.266227 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.266238 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.266261 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.266284 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:24Z","lastTransitionTime":"2025-12-02T09:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.369086 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.369156 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.369167 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.369184 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.369195 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:24Z","lastTransitionTime":"2025-12-02T09:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.471272 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.471336 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.471350 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.471367 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.471380 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:24Z","lastTransitionTime":"2025-12-02T09:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.499530 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.499546 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.499585 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:21:24 crc kubenswrapper[4781]: E1202 09:21:24.499665 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 09:21:24 crc kubenswrapper[4781]: E1202 09:21:24.499793 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 09:21:24 crc kubenswrapper[4781]: E1202 09:21:24.500072 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q792g" podUID="bcdae8ff-3e82-4785-b958-a98717a14787" Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.574032 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.574082 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.574100 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.574121 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.574138 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:24Z","lastTransitionTime":"2025-12-02T09:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.676566 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.676643 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.676666 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.676692 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.676714 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:24Z","lastTransitionTime":"2025-12-02T09:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.778439 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.778503 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.778527 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.778554 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.778574 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:24Z","lastTransitionTime":"2025-12-02T09:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.881724 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.881771 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.881782 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.881799 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.881811 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:24Z","lastTransitionTime":"2025-12-02T09:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.983804 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.983875 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.983892 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.983916 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:24 crc kubenswrapper[4781]: I1202 09:21:24.983969 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:24Z","lastTransitionTime":"2025-12-02T09:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:25 crc kubenswrapper[4781]: I1202 09:21:25.086873 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:25 crc kubenswrapper[4781]: I1202 09:21:25.086978 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:25 crc kubenswrapper[4781]: I1202 09:21:25.087003 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:25 crc kubenswrapper[4781]: I1202 09:21:25.087031 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:25 crc kubenswrapper[4781]: I1202 09:21:25.087052 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:25Z","lastTransitionTime":"2025-12-02T09:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:25 crc kubenswrapper[4781]: I1202 09:21:25.189707 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:25 crc kubenswrapper[4781]: I1202 09:21:25.189782 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:25 crc kubenswrapper[4781]: I1202 09:21:25.189800 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:25 crc kubenswrapper[4781]: I1202 09:21:25.189828 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:25 crc kubenswrapper[4781]: I1202 09:21:25.189847 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:25Z","lastTransitionTime":"2025-12-02T09:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:25 crc kubenswrapper[4781]: I1202 09:21:25.292904 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:25 crc kubenswrapper[4781]: I1202 09:21:25.293008 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:25 crc kubenswrapper[4781]: I1202 09:21:25.293031 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:25 crc kubenswrapper[4781]: I1202 09:21:25.293061 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:25 crc kubenswrapper[4781]: I1202 09:21:25.293089 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:25Z","lastTransitionTime":"2025-12-02T09:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:25 crc kubenswrapper[4781]: I1202 09:21:25.395595 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:25 crc kubenswrapper[4781]: I1202 09:21:25.395634 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:25 crc kubenswrapper[4781]: I1202 09:21:25.395648 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:25 crc kubenswrapper[4781]: I1202 09:21:25.395662 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:25 crc kubenswrapper[4781]: I1202 09:21:25.395671 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:25Z","lastTransitionTime":"2025-12-02T09:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:25 crc kubenswrapper[4781]: I1202 09:21:25.498186 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:25 crc kubenswrapper[4781]: I1202 09:21:25.498404 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:25 crc kubenswrapper[4781]: I1202 09:21:25.498430 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:25 crc kubenswrapper[4781]: I1202 09:21:25.498460 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:25 crc kubenswrapper[4781]: I1202 09:21:25.498482 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:25Z","lastTransitionTime":"2025-12-02T09:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:25 crc kubenswrapper[4781]: I1202 09:21:25.498783 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:21:25 crc kubenswrapper[4781]: E1202 09:21:25.498903 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 09:21:25 crc kubenswrapper[4781]: I1202 09:21:25.601310 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:25 crc kubenswrapper[4781]: I1202 09:21:25.601406 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:25 crc kubenswrapper[4781]: I1202 09:21:25.601424 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:25 crc kubenswrapper[4781]: I1202 09:21:25.601451 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:25 crc kubenswrapper[4781]: I1202 09:21:25.601474 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:25Z","lastTransitionTime":"2025-12-02T09:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:25 crc kubenswrapper[4781]: I1202 09:21:25.704115 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:25 crc kubenswrapper[4781]: I1202 09:21:25.704204 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:25 crc kubenswrapper[4781]: I1202 09:21:25.704223 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:25 crc kubenswrapper[4781]: I1202 09:21:25.704280 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:25 crc kubenswrapper[4781]: I1202 09:21:25.704301 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:25Z","lastTransitionTime":"2025-12-02T09:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:25 crc kubenswrapper[4781]: I1202 09:21:25.807344 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:25 crc kubenswrapper[4781]: I1202 09:21:25.807395 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:25 crc kubenswrapper[4781]: I1202 09:21:25.807408 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:25 crc kubenswrapper[4781]: I1202 09:21:25.807424 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:25 crc kubenswrapper[4781]: I1202 09:21:25.807437 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:25Z","lastTransitionTime":"2025-12-02T09:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:25 crc kubenswrapper[4781]: I1202 09:21:25.909769 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:25 crc kubenswrapper[4781]: I1202 09:21:25.909818 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:25 crc kubenswrapper[4781]: I1202 09:21:25.909830 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:25 crc kubenswrapper[4781]: I1202 09:21:25.909847 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:25 crc kubenswrapper[4781]: I1202 09:21:25.909858 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:25Z","lastTransitionTime":"2025-12-02T09:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.012681 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.012728 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.012740 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.012758 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.012771 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:26Z","lastTransitionTime":"2025-12-02T09:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.115263 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.115331 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.115345 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.115363 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.115374 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:26Z","lastTransitionTime":"2025-12-02T09:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.184796 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bcdae8ff-3e82-4785-b958-a98717a14787-metrics-certs\") pod \"network-metrics-daemon-q792g\" (UID: \"bcdae8ff-3e82-4785-b958-a98717a14787\") " pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:21:26 crc kubenswrapper[4781]: E1202 09:21:26.184959 4781 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 09:21:26 crc kubenswrapper[4781]: E1202 09:21:26.185027 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcdae8ff-3e82-4785-b958-a98717a14787-metrics-certs podName:bcdae8ff-3e82-4785-b958-a98717a14787 nodeName:}" failed. No retries permitted until 2025-12-02 09:21:42.185008821 +0000 UTC m=+65.008882710 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bcdae8ff-3e82-4785-b958-a98717a14787-metrics-certs") pod "network-metrics-daemon-q792g" (UID: "bcdae8ff-3e82-4785-b958-a98717a14787") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.218887 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.218967 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.219149 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.219353 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.219375 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:26Z","lastTransitionTime":"2025-12-02T09:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.322285 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.322344 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.322359 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.322374 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.322385 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:26Z","lastTransitionTime":"2025-12-02T09:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.424231 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.424345 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.424370 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.424402 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.424427 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:26Z","lastTransitionTime":"2025-12-02T09:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.499340 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.499369 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.499451 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 09:21:26 crc kubenswrapper[4781]: E1202 09:21:26.499472 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q792g" podUID="bcdae8ff-3e82-4785-b958-a98717a14787" Dec 02 09:21:26 crc kubenswrapper[4781]: E1202 09:21:26.499628 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 09:21:26 crc kubenswrapper[4781]: E1202 09:21:26.499740 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.526377 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.526427 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.526435 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.526449 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.526457 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:26Z","lastTransitionTime":"2025-12-02T09:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.628991 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.629053 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.629074 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.629101 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.629124 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:26Z","lastTransitionTime":"2025-12-02T09:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.731014 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.731106 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.731123 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.731181 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.731199 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:26Z","lastTransitionTime":"2025-12-02T09:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.834343 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.834400 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.834418 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.834443 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.834459 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:26Z","lastTransitionTime":"2025-12-02T09:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.936523 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.937500 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.937712 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.937978 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:26 crc kubenswrapper[4781]: I1202 09:21:26.938164 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:26Z","lastTransitionTime":"2025-12-02T09:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.040638 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.041056 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.041272 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.041466 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.041670 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:27Z","lastTransitionTime":"2025-12-02T09:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.144222 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.144298 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.144313 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.144333 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.144348 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:27Z","lastTransitionTime":"2025-12-02T09:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.245978 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.246015 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.246024 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.246039 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.246048 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:27Z","lastTransitionTime":"2025-12-02T09:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.347821 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.347855 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.347864 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.347878 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.347887 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:27Z","lastTransitionTime":"2025-12-02T09:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.398365 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:21:27 crc kubenswrapper[4781]: E1202 09:21:27.398522 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:21:59.398499221 +0000 UTC m=+82.222373140 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.398588 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.398651 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.398693 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.398731 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:21:27 crc kubenswrapper[4781]: E1202 09:21:27.398827 4781 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 09:21:27 crc kubenswrapper[4781]: E1202 09:21:27.398871 4781 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 09:21:27 crc kubenswrapper[4781]: E1202 09:21:27.398888 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 09:21:27 crc kubenswrapper[4781]: E1202 09:21:27.398939 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 09:21:59.398900355 +0000 UTC m=+82.222774244 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 09:21:27 crc kubenswrapper[4781]: E1202 09:21:27.398946 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 09:21:27 crc kubenswrapper[4781]: E1202 09:21:27.398972 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 09:21:59.398962437 +0000 UTC m=+82.222836326 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 09:21:27 crc kubenswrapper[4781]: E1202 09:21:27.398982 4781 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 09:21:27 crc kubenswrapper[4781]: E1202 09:21:27.398976 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 09:21:27 crc kubenswrapper[4781]: E1202 09:21:27.399021 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 09:21:27 crc kubenswrapper[4781]: E1202 09:21:27.399041 4781 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 09:21:27 crc kubenswrapper[4781]: E1202 09:21:27.399046 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 09:21:59.399026139 +0000 UTC m=+82.222900118 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 09:21:27 crc kubenswrapper[4781]: E1202 09:21:27.399099 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 09:21:59.399080811 +0000 UTC m=+82.222954720 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.450904 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.450983 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.451000 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.451026 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.451043 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:27Z","lastTransitionTime":"2025-12-02T09:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.499549 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:21:27 crc kubenswrapper[4781]: E1202 09:21:27.499746 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.513697 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q792g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcdae8ff-3e82-4785-b958-a98717a14787\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vl4xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vl4xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:21:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q792g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:27Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.532320 4781 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8157f9-9d69-41d0-8158-f5d166743724\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a36c467812da990b1678a8465bf25d97c4d8226e88ed27eea452b35a75c5dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c09074e0829703890129b7052956f56e1a93cad8d1bcf9e480b3a7277ae930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f99f962bf9ffb2475be15e4fe8d41c3fe3c870d092a50b96a19fe9c759462ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc9e488e47baf2860cd0133d661714b47495618ce410ee8ad
0fa9292da3cfdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:27Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.552289 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:27Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.553165 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.553260 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.553288 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.553316 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.553334 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:27Z","lastTransitionTime":"2025-12-02T09:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.574282 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822d2ca8c4fb83c643f704e9acc8e4034235d2d1e4730a9495aa0a6a8df9979e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:27Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.605238 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dnkgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aa44429-873b-48f6-bc13-55745827d8fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1586e0f941724ff8afcd4030d43ebb517e0782ece160fe634ed828af68b1965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dnkgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:27Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.625157 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8b6p8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bc3d340494b10e89f11eb2dfc63584a9c5660dfe367792a3d422a28dbf5a2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxbk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8b6p8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:27Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.642292 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e10258da-dad3-4df8-82c2-9d9438493a3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb3264562e09d949d9dcd64e52ee1fc0825681adc0d91f25505fdefd0a807ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c19e02f947d0d75971d14d988b7c3119365b6ba5348290ee0e23e115e14780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-pzntm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:27Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.656573 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.656634 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.656654 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.656684 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.656704 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:27Z","lastTransitionTime":"2025-12-02T09:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.657265 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ffk9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04fe11ee-efe6-4b10-a638-021e53367e2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6a45fdfe3335bc38eb30705eca6aaf887b5cd9fac235b5350832bff6df8cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-46c6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ffk9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:27Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.688979 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa24d58e-64e7-4469-9611-3dab9c142594\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7af0caf9bd11c42c48bbf4dcd4e719840d2e2dbce55ec205dd2cc103b71772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7035e3038f33a299c00380bc79d4315ec612bd6f4009d55258f02b5c3ec70e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://274dd3f5c42e9217f2a35c99709567f617ef1def1cbd5110cfdc8b5e88bce7ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8beb17df1dddccb32b1d93f03fd20551048bc1d150615e18bf73c1d364d7bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1936331547e3941bdaae5c90f12874b6fda39f3b6b67f678156915473cc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",
\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:27Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.705824 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c464aa1e840c3a97b63b6c760fca6c21e7a182269a602a688c70525b4bbb39a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:27Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.726121 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a3a44e4812364947f0dd8eed3d3f7252d6e37e5057b9f180eca5b8afc472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9866ea1416094cbfe3f1e850261118f82b1d977589862e7f59e13b33fc24295a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:27Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.749552 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba607959-82d4-440b-b830-95eb9584db6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08c7d87bc390c31adc6d7412985f5bbfe535ea6dccb586f9be2823a17bce3e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0090e6856a6d66e284f4f4eecb2d193720318f33a0260d470a50af72c84f6719\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0179b61af209d72b1e1a79835f2819b3af773ab99b0c431733beba8a4dcf2eeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5356a78651fc6cb4214e10b8b25fd20d7f6acb739abdcf2e1de6dd666a31edaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 09:20:56.232387 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 09:20:56.232507 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 09:20:56.233123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3642890392/tls.crt::/tmp/serving-cert-3642890392/tls.key\\\\\\\"\\\\nI1202 09:20:56.583994 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 09:20:56.586003 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 09:20:56.586021 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 09:20:56.586041 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 09:20:56.586046 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 09:20:56.590021 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 09:20:56.590050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590054 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 09:20:56.590065 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 09:20:56.590068 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 09:20:56.590073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 09:20:56.590248 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 09:20:56.592236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a99d096eb7cf9386a857100ae0faff8a3b17140eaada581e5511a88aadc0f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:27Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.763038 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.763111 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.763135 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.763169 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.763192 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:27Z","lastTransitionTime":"2025-12-02T09:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.773317 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kgbn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9440f972-ed59-4852-a180-3d5a2111f966\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eda6361552bc105ca2a6cc3444d96fbf79b10eff612e87aa22c1559fa8c90fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hxs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kgbn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:27Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.805149 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://506fd3b326fdaebe64874c208bb15c9cbf5aff36602f221f27f87fe0214ce6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c8c9c60e0354d49255e6b14da6808152f6528f0a51ce992709a7014b9f64d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5c85414a97917ad2db6270144c476b16b50ad3256bf6c694c228b8756c379dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6086fa7a75ccf19fcc7b89e83a2dbe12ed15c729bb8a4209c18acd54c2070a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865e78637e3692046c9c8a24f5238eb9636bc7e388cea6e0fec3c5c68d21cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://458448bab8d0e21295e859a857d3a3273cc4f430f2fc0fdfb3e10b9a9ab813ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16ede9db49d80344fd2b60c39bf00bcf9dbeabd8918656a55230925c6a6cb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16ede9db49d80344fd2b60c39bf00bcf9dbeabd8918656a55230925c6a6cb60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T09:21:22Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 09:21:22.385723 6456 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 09:21:22.385745 6456 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 09:21:22.385782 6456 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 09:21:22.385800 6456 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1202 09:21:22.385814 6456 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 09:21:22.385845 6456 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 09:21:22.385855 6456 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 09:21:22.385863 6456 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 09:21:22.385868 6456 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1202 09:21:22.385869 6456 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 09:21:22.385878 6456 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 09:21:22.385879 6456 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 09:21:22.385889 6456 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 09:21:22.385894 6456 factory.go:656] Stopping watch factory\\\\nI1202 09:21:22.385908 6456 ovnkube.go:599] Stopped ovnkube\\\\nI1202 09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-x5x7g_openshift-ovn-kubernetes(20ba2af9-1f67-4b6d-884a-666ef4f55bf3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cbd533b4320d178a3ee1bb3d666d5a1ad0a1d7e9f38e58735acac86c590726\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5x7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:27Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.825076 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k2h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be967da7-3e7f-47e6-9d54-408ae99531a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b30e09e5a7f3200cca709bf3437af0ce93cf039451c894bdfd813ae35d3c70c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzmcq
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://735c6d3ed169d7e11f8f6cc0d99fc415020928f852a554288c6c05f3dc99bb3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzmcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:21:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7k2h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:27Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.847688 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:27Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.866599 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.866691 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.866714 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.866746 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.866767 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:27Z","lastTransitionTime":"2025-12-02T09:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.867728 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:27Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.970245 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.970321 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.970344 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.970376 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:27 crc kubenswrapper[4781]: I1202 09:21:27.970398 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:27Z","lastTransitionTime":"2025-12-02T09:21:27Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.072778 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.072824 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.072833 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.072848 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.072860 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:28Z","lastTransitionTime":"2025-12-02T09:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.175740 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.175786 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.175798 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.175815 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.175827 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:28Z","lastTransitionTime":"2025-12-02T09:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.278282 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.278354 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.278373 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.278396 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.278412 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:28Z","lastTransitionTime":"2025-12-02T09:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.380524 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.380600 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.380618 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.380641 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.380659 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:28Z","lastTransitionTime":"2025-12-02T09:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.483196 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.483261 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.483285 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.483314 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.483337 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:28Z","lastTransitionTime":"2025-12-02T09:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.498728 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.498754 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.498843 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:21:28 crc kubenswrapper[4781]: E1202 09:21:28.498997 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 09:21:28 crc kubenswrapper[4781]: E1202 09:21:28.499341 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q792g" podUID="bcdae8ff-3e82-4785-b958-a98717a14787" Dec 02 09:21:28 crc kubenswrapper[4781]: E1202 09:21:28.499415 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.586173 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.586214 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.586224 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.586244 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.586256 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:28Z","lastTransitionTime":"2025-12-02T09:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.688986 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.689029 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.689041 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.689060 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.689074 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:28Z","lastTransitionTime":"2025-12-02T09:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.791695 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.791729 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.791737 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.791753 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.791763 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:28Z","lastTransitionTime":"2025-12-02T09:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.895378 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.895452 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.895472 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.895498 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.895519 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:28Z","lastTransitionTime":"2025-12-02T09:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.998789 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.998829 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.998838 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.998851 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:28 crc kubenswrapper[4781]: I1202 09:21:28.998860 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:28Z","lastTransitionTime":"2025-12-02T09:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.101031 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.101071 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.101080 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.101094 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.101105 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:29Z","lastTransitionTime":"2025-12-02T09:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.203420 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.203456 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.203465 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.203478 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.203487 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:29Z","lastTransitionTime":"2025-12-02T09:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.306484 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.306523 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.306533 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.306547 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.306555 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:29Z","lastTransitionTime":"2025-12-02T09:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.409527 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.409600 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.409618 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.409644 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.409661 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:29Z","lastTransitionTime":"2025-12-02T09:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.499620 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:21:29 crc kubenswrapper[4781]: E1202 09:21:29.499849 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.512432 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.512493 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.512513 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.512535 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.512552 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:29Z","lastTransitionTime":"2025-12-02T09:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.515958 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.516045 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.516065 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.516090 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.516110 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:29Z","lastTransitionTime":"2025-12-02T09:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:29 crc kubenswrapper[4781]: E1202 09:21:29.538033 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a8aab02d-934e-4a61-8d03-a223ac62150b\\\",\\\"systemUUID\\\":\\\"c8f9d245-0a2e-447a-a09e-ff80f79ba02f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:29Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.544034 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.544073 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.544087 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.544105 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.544118 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:29Z","lastTransitionTime":"2025-12-02T09:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:29 crc kubenswrapper[4781]: E1202 09:21:29.563727 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a8aab02d-934e-4a61-8d03-a223ac62150b\\\",\\\"systemUUID\\\":\\\"c8f9d245-0a2e-447a-a09e-ff80f79ba02f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:29Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.568461 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.568540 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
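The status-patch payloads in the entries above are hard to read because the JSON is quoted twice: once when the kubelet formats the patch into the error message, and once more when the logger quotes the err value (hence the \\\" runs). A minimal Go sketch for recovering the readable patch from one such line; the shortened sample line and the slice arithmetic are illustrative assumptions, not part of the original log.

// unpatch.go - rough sketch: recover the JSON patch embedded in a
// "failed to patch status" kubelet log line by unquoting it twice.
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"log"
	"strconv"
	"strings"
)

func main() {
	// Shortened stand-in for a journal line; paste a real one here.
	line := `E1202 09:21:29.638007 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\"}}}\" for node \"crc\": webhook failed"`

	// First unquote: strip the logger's err="..." quoting.
	raw := line[strings.Index(line, `err="`)+len(`err=`):]
	errMsg, err := strconv.Unquote(raw)
	if err != nil {
		log.Fatalf("outer unquote: %v", err)
	}

	// Second unquote: the patch itself sits as a %q-quoted string
	// between `failed to patch status ` and ` for node`.
	start := strings.Index(errMsg, `"`)
	end := strings.LastIndex(errMsg, `" for node`)
	patch, err := strconv.Unquote(errMsg[start : end+1])
	if err != nil {
		log.Fatalf("inner unquote: %v", err)
	}

	// Pretty-print the recovered JSON patch.
	var out bytes.Buffer
	if err := json.Indent(&out, []byte(patch), "", "  "); err != nil {
		log.Fatalf("indent: %v", err)
	}
	fmt.Println(out.String())
}

Applied to the full entries above, this yields the plain status patch (conditions, allocatable/capacity, image list, nodeInfo) that the kubelet was trying to apply.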
event="NodeHasNoDiskPressure" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.568566 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.568601 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.568628 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:29Z","lastTransitionTime":"2025-12-02T09:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:29 crc kubenswrapper[4781]: E1202 09:21:29.589345 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a8aab02d-934e-4a61-8d03-a223ac62150b\\\",\\\"systemUUID\\\":\\\"c8f9d245-0a2e-447a-a09e-ff80f79ba02f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:29Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.594563 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.594607 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.594619 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.594636 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.594650 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:29Z","lastTransitionTime":"2025-12-02T09:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:29 crc kubenswrapper[4781]: E1202 09:21:29.613974 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a8aab02d-934e-4a61-8d03-a223ac62150b\\\",\\\"systemUUID\\\":\\\"c8f9d245-0a2e-447a-a09e-ff80f79ba02f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:29Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.618473 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.618564 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.618579 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.618608 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.618623 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:29Z","lastTransitionTime":"2025-12-02T09:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:29 crc kubenswrapper[4781]: E1202 09:21:29.638007 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a8aab02d-934e-4a61-8d03-a223ac62150b\\\",\\\"systemUUID\\\":\\\"c8f9d245-0a2e-447a-a09e-ff80f79ba02f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:29Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:29 crc kubenswrapper[4781]: E1202 09:21:29.638183 4781 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.639962 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
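The repeated patch attempts all fail for the same reason recorded at the end of each entry: the serving certificate for the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 expired on 2025-08-24T17:21:41Z while the node clock reads 2025-12-02T09:21:29Z, so the kubelet gives up once its retry count is exhausted. A minimal Go sketch, run on the node itself, to confirm the certificate window independently of the kubelet; only the address comes from the log, the rest is illustrative.

// cert_check.go - connect to the webhook endpoint from the log and print
// the serving certificate's validity window.
package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	// InsecureSkipVerify lets us fetch the certificate even though it is
	// expired; we only inspect it, we do not trust it.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		log.Fatalf("dial: %v", err)
	}
	defer conn.Close()

	cert := conn.ConnectionState().PeerCertificates[0]
	fmt.Printf("subject:   %s\n", cert.Subject)
	fmt.Printf("notBefore: %s\n", cert.NotBefore.Format(time.RFC3339))
	fmt.Printf("notAfter:  %s\n", cert.NotAfter.Format(time.RFC3339))
	fmt.Printf("expired:   %v\n", time.Now().After(cert.NotAfter))
}

If the webhook is still serving the same certificate, notAfter should print 2025-08-24T17:21:41Z, matching the x509 error in the entries above.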
event="NodeHasSufficientMemory" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.640095 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.640144 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.640174 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.640192 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:29Z","lastTransitionTime":"2025-12-02T09:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.690520 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.702638 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.709621 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ffk9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04fe11ee-efe6-4b10-a638-021e53367e2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6a45fdfe3335bc38eb30705eca6aaf887b5cd9fac235b5350832bff6df8cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46c6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"
hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ffk9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:29Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.735607 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa24d58e-64e7-4469-9611-3dab9c142594\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7af0caf9bd11c42c48bbf4dcd4e719840d2e2dbce55ec205dd2cc103b71772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7035e3038f33a299c00380bc79d4315ec612bd6f4009d55258f02b5c3ec70e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://274dd3f5c42e9217f2a35c99709567f617
ef1def1cbd5110cfdc8b5e88bce7ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8beb17df1dddccb32b1d93f03fd20551048bc1d150615e18bf73c1d364d7bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1936331547e3941bdaae5c90f12874b6fda39f3b6b67f678156915473cc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:29Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.743523 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.743559 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.743571 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.743588 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.743600 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:29Z","lastTransitionTime":"2025-12-02T09:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.756125 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c464aa1e840c3a97b63b6c760fca6c21e7a182269a602a688c70525b4bbb39a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:29Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.776776 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a3a44e4812364947f0dd8eed3d3f7252d6e37e5057b9f180eca5b8afc472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9866ea1416094cbfe3f1e850261118f82b1d977589862e7f59e13b33fc24295a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:29Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.793777 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e10258da-dad3-4df8-82c2-9d9438493a3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb3264562e09d949d9dcd64e52ee1fc0825681adc0d91f25505fdefd0a807ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c19e02f947d0d75971d14d988b7c3119365b6ba5348290ee0e23e115e14780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pzntm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:29Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.816503 4781 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba607959-82d4-440b-b830-95eb9584db6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08c7d87bc390c31adc6d7412985f5bbfe535ea6dccb586f9be2823a17bce3e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0090e6856a6d66e284f4f4eecb2d193720318f33a0260d470a50af72c84f6719\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0179b61af209d72b1e1a79835f2819b3af773ab99b0c431733beba8a4dcf2eeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5356a78651fc6cb4214e10b8b25fd20d7f6acb739abdcf2e1de6dd666a31edaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 09:20:56.232387 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 09:20:56.232507 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 09:20:56.233123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3642890392/tls.crt::/tmp/serving-cert-3642890392/tls.key\\\\\\\"\\\\nI1202 09:20:56.583994 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 09:20:56.586003 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 09:20:56.586021 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 09:20:56.586041 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 09:20:56.586046 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 09:20:56.590021 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 09:20:56.590050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590054 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 09:20:56.590065 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 09:20:56.590068 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 09:20:56.590073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 09:20:56.590248 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 09:20:56.592236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a99d096eb7cf9386a857100ae0faff8a3b17140eaada581e5511a88aadc0f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:29Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.832856 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kgbn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9440f972-ed59-4852-a180-3d5a2111f966\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eda6361552bc105ca2a6cc3444d96fbf79b10eff612e87aa22c1559fa8c90fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hxs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kgbn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:29Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.846026 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.846073 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.846085 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.846102 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.846113 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:29Z","lastTransitionTime":"2025-12-02T09:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.868783 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://506fd3b326fdaebe64874c208bb15c9cbf5aff36602f221f27f87fe0214ce6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c8c9c60e0354d49255e6b14da6808152f6528f0a51ce992709a7014b9f64d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5c85414a97917ad2db6270144c476b16b50ad3256bf6c694c228b8756c379dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6086fa7a75ccf19fcc7b89e83a2dbe12ed15c729bb8a4209c18acd54c2070a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865e78637e3692046c9c8a24f5238eb9636bc7e388cea6e0fec3c5c68d21cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://458448bab8d0e21295e859a857d3a3273cc4f430f2fc0fdfb3e10b9a9ab813ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574532
65a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16ede9db49d80344fd2b60c39bf00bcf9dbeabd8918656a55230925c6a6cb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16ede9db49d80344fd2b60c39bf00bcf9dbeabd8918656a55230925c6a6cb60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T09:21:22Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 09:21:22.385723 6456 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 09:21:22.385745 6456 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 09:21:22.385782 6456 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 09:21:22.385800 6456 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1202 09:21:22.385814 6456 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 09:21:22.385845 6456 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 09:21:22.385855 6456 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 09:21:22.385863 6456 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 09:21:22.385868 6456 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1202 09:21:22.385869 6456 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 09:21:22.385878 6456 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 09:21:22.385879 6456 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 09:21:22.385889 6456 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 09:21:22.385894 6456 factory.go:656] Stopping watch factory\\\\nI1202 09:21:22.385908 6456 ovnkube.go:599] Stopped ovnkube\\\\nI1202 09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed 
container=ovnkube-controller pod=ovnkube-node-x5x7g_openshift-ovn-kubernetes(20ba2af9-1f67-4b6d-884a-666ef4f55bf3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cbd533b4320d178a3ee1bb3d666d5a1ad0a1d7e9f38e58735acac86c590726\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5x7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:29Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.884579 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k2h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be967da7-3e7f-47e6-9d54-408ae99531a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b30e09e5a7f3200cca709bf3437af0ce93cf039451c894bdfd813ae35d3c70c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-dzmcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://735c6d3ed169d7e11f8f6cc0d99fc415020928f852a554288c6c05f3dc99bb3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzmcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:21:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7k2h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:29Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.903986 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:29Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.919693 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:29Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.939820 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8157f9-9d69-41d0-8158-f5d166743724\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a36c467812da990b1678a8465bf25d97c4d8226e88ed27eea452b35a75c5dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c09074e0829703890129b7052956f56e1a93cad8d1bcf9e480b3a7277ae930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f99f962bf9ffb2475be15e4fe8d41c3fe3c870d092a50b96a19fe9c759462ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc9e488e47baf2860cd0133d661714b47495618ce410ee8ad0fa9292da3cfdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:29Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.948744 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.948864 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.948885 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.948910 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.948984 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:29Z","lastTransitionTime":"2025-12-02T09:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.958693 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:29Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.972440 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822d2ca8c4fb83c643f704e9acc8e4034235d2d1e4730a9495aa0a6a8df9979e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:29Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:29 crc kubenswrapper[4781]: I1202 09:21:29.992262 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dnkgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aa44429-873b-48f6-bc13-55745827d8fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1586e0f941724ff8afcd4030d43ebb517e0782ece160fe634ed828af68b1965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dnkgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:29Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.005336 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8b6p8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bc3d340494b10e89f11eb2dfc63584a9c5660dfe367792a3d422a28dbf5a2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxbk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8b6p8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:30Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.016811 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q792g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcdae8ff-3e82-4785-b958-a98717a14787\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vl4xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vl4xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:21:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q792g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:30Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:30 crc 
kubenswrapper[4781]: I1202 09:21:30.051844 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.051974 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.051989 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.052006 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.052017 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:30Z","lastTransitionTime":"2025-12-02T09:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.154326 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.154383 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.154402 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.154429 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.154446 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:30Z","lastTransitionTime":"2025-12-02T09:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.256688 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.256744 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.256759 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.256785 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.256805 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:30Z","lastTransitionTime":"2025-12-02T09:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.367246 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.367312 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.367325 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.367347 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.367360 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:30Z","lastTransitionTime":"2025-12-02T09:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.469578 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.469601 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.469609 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.469620 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.469629 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:30Z","lastTransitionTime":"2025-12-02T09:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.499272 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.499332 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:21:30 crc kubenswrapper[4781]: E1202 09:21:30.499416 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q792g" podUID="bcdae8ff-3e82-4785-b958-a98717a14787" Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.499287 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 09:21:30 crc kubenswrapper[4781]: E1202 09:21:30.499596 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 09:21:30 crc kubenswrapper[4781]: E1202 09:21:30.499741 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.572664 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.572715 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.572728 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.572747 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.572760 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:30Z","lastTransitionTime":"2025-12-02T09:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.675166 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.675210 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.675219 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.675233 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.675242 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:30Z","lastTransitionTime":"2025-12-02T09:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.777895 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.777947 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.777955 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.777970 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.777980 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:30Z","lastTransitionTime":"2025-12-02T09:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.879766 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.879827 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.879846 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.879910 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.879972 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:30Z","lastTransitionTime":"2025-12-02T09:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.981825 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.981885 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.981903 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.981961 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:30 crc kubenswrapper[4781]: I1202 09:21:30.981980 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:30Z","lastTransitionTime":"2025-12-02T09:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:31 crc kubenswrapper[4781]: I1202 09:21:31.084203 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:31 crc kubenswrapper[4781]: I1202 09:21:31.084264 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:31 crc kubenswrapper[4781]: I1202 09:21:31.084285 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:31 crc kubenswrapper[4781]: I1202 09:21:31.084314 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:31 crc kubenswrapper[4781]: I1202 09:21:31.084332 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:31Z","lastTransitionTime":"2025-12-02T09:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:31 crc kubenswrapper[4781]: I1202 09:21:31.186736 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:31 crc kubenswrapper[4781]: I1202 09:21:31.186807 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:31 crc kubenswrapper[4781]: I1202 09:21:31.186817 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:31 crc kubenswrapper[4781]: I1202 09:21:31.186830 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:31 crc kubenswrapper[4781]: I1202 09:21:31.186856 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:31Z","lastTransitionTime":"2025-12-02T09:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:31 crc kubenswrapper[4781]: I1202 09:21:31.290131 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:31 crc kubenswrapper[4781]: I1202 09:21:31.290252 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:31 crc kubenswrapper[4781]: I1202 09:21:31.290280 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:31 crc kubenswrapper[4781]: I1202 09:21:31.290312 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:31 crc kubenswrapper[4781]: I1202 09:21:31.290380 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:31Z","lastTransitionTime":"2025-12-02T09:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:31 crc kubenswrapper[4781]: I1202 09:21:31.394033 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:31 crc kubenswrapper[4781]: I1202 09:21:31.394146 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:31 crc kubenswrapper[4781]: I1202 09:21:31.394174 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:31 crc kubenswrapper[4781]: I1202 09:21:31.394209 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:31 crc kubenswrapper[4781]: I1202 09:21:31.394233 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:31Z","lastTransitionTime":"2025-12-02T09:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:31 crc kubenswrapper[4781]: I1202 09:21:31.497593 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:31 crc kubenswrapper[4781]: I1202 09:21:31.497667 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:31 crc kubenswrapper[4781]: I1202 09:21:31.497688 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:31 crc kubenswrapper[4781]: I1202 09:21:31.497722 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:31 crc kubenswrapper[4781]: I1202 09:21:31.497740 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:31Z","lastTransitionTime":"2025-12-02T09:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:31 crc kubenswrapper[4781]: I1202 09:21:31.498723 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:21:31 crc kubenswrapper[4781]: E1202 09:21:31.498893 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 09:21:31 crc kubenswrapper[4781]: I1202 09:21:31.603258 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:31 crc kubenswrapper[4781]: I1202 09:21:31.603323 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:31 crc kubenswrapper[4781]: I1202 09:21:31.603340 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:31 crc kubenswrapper[4781]: I1202 09:21:31.603366 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:31 crc kubenswrapper[4781]: I1202 09:21:31.603386 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:31Z","lastTransitionTime":"2025-12-02T09:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:31 crc kubenswrapper[4781]: I1202 09:21:31.706669 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:31 crc kubenswrapper[4781]: I1202 09:21:31.706750 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:31 crc kubenswrapper[4781]: I1202 09:21:31.706769 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:31 crc kubenswrapper[4781]: I1202 09:21:31.706801 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:31 crc kubenswrapper[4781]: I1202 09:21:31.706821 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:31Z","lastTransitionTime":"2025-12-02T09:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:31 crc kubenswrapper[4781]: I1202 09:21:31.810621 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:31 crc kubenswrapper[4781]: I1202 09:21:31.810736 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:31 crc kubenswrapper[4781]: I1202 09:21:31.810760 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:31 crc kubenswrapper[4781]: I1202 09:21:31.810787 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:31 crc kubenswrapper[4781]: I1202 09:21:31.810809 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:31Z","lastTransitionTime":"2025-12-02T09:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:31 crc kubenswrapper[4781]: I1202 09:21:31.914425 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:31 crc kubenswrapper[4781]: I1202 09:21:31.914486 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:31 crc kubenswrapper[4781]: I1202 09:21:31.914502 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:31 crc kubenswrapper[4781]: I1202 09:21:31.914523 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:31 crc kubenswrapper[4781]: I1202 09:21:31.914540 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:31Z","lastTransitionTime":"2025-12-02T09:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.017836 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.017908 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.017954 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.017982 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.018001 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:32Z","lastTransitionTime":"2025-12-02T09:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.120613 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.120665 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.120681 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.120706 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.120724 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:32Z","lastTransitionTime":"2025-12-02T09:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.120746 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.166911 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa24d58e-64e7-4469-9611-3dab9c142594\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7af0caf9bd11c42c48bbf4dcd4e719840d2e2dbce55ec205dd2cc103b71772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7035e3038f33a299c00380bc79d4315ec612bd6f4009d55258f02b5c3ec70e2d\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://274dd3f5c42e9217f2a35c99709567f617ef1def1cbd5110cfdc8b5e88bce7ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8beb17df1dddccb32b1d93f03fd20551048bc1d150615e18bf73c1d364d7bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1936331547e3941bdaae5c90f12874b6fda39f3b6b67f678156915473cc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441e
cd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:32Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.187097 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaa19b0e-b6c3-4d56-b32d-a992eb6d3773\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5775ca067ce9fccaf09f228bfcf365fd53919bf4693f5dd7943183d1f14177b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c426b8000b17c761c86ae3152bb08084f63533d41a01fa039f84f0a74b89499e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e25c4d1777f7fbfc56d52dac0ec8b209807c486a294c0e7062a49e58edb520b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ae274f3b640c3661df1109d91d4ea85e85568a686ec292397695e2a7b8703dc\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ae274f3b640c3661df1109d91d4ea85e85568a686ec292397695e2a7b8703dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:32Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.202652 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c464aa1e840c3a97b63b6c760fca6c21e7a182269a602a688c70525b4bbb39a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:32Z is after 
2025-08-24T17:21:41Z" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.214442 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a3a44e4812364947f0dd8eed3d3f7252d6e37e5057b9f180eca5b8afc472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9866ea1416094cbfe3f1e850261118f82b1d977589862e7f59e13b33fc24295a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:32Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.223339 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.223396 
4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.223385 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e10258da-dad3-4df8-82c2-9d9438493a3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb3264562e09d949d9dcd64e52ee1fc0825681adc0d91f25505fdefd0a807ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c19e02f947d0d75971d14d988b7c3119365b6ba5348290ee0e23e115e14780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pzntm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:32Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.223418 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.223552 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.223562 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:32Z","lastTransitionTime":"2025-12-02T09:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.232250 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ffk9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04fe11ee-efe6-4b10-a638-021e53367e2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6a45fdfe3335bc38eb30705eca6aaf887b5cd9fac235b5350832bff6df8cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46c6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ffk9s\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:32Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.246209 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba607959-82d4-440b-b830-95eb9584db6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08c7d87bc390c31adc6d7412985f5bbfe535ea6dccb586f9be2823a17bce3e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0090e6856a6d66e284f4f4eecb2d193720318f33a0260d470a50af72c84f6719\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0179b61af209d72b1e1a79835f2819b3af773ab99b0c431733beba8a4dcf2eeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5356a78651fc6cb4214e10b8b25fd20d7f6acb739abdcf2e1de6dd666a31edaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 09:20:56.232387 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 09:20:56.232507 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 09:20:56.233123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3642890392/tls.crt::/tmp/serving-cert-3642890392/tls.key\\\\\\\"\\\\nI1202 09:20:56.583994 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 09:20:56.586003 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 09:20:56.586021 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 09:20:56.586041 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 09:20:56.586046 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 09:20:56.590021 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 09:20:56.590050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590054 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 09:20:56.590065 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 09:20:56.590068 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 09:20:56.590073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 09:20:56.590248 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 09:20:56.592236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a99d096eb7cf9386a857100ae0faff8a3b17140eaada581e5511a88aadc0f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:32Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.256734 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kgbn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9440f972-ed59-4852-a180-3d5a2111f966\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eda6361552bc105ca2a6cc3444d96fbf79b10eff612e87aa22c1559fa8c90fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hxs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kgbn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:32Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.274297 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://506fd3b326fdaebe64874c208bb15c9cbf5aff36602f221f27f87fe0214ce6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c8c9c60e0354d49255e6b14da6808152f6528f0a51ce992709a7014b9f64d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5c85414a97917ad2db6270144c476b16b50ad3256bf6c694c228b8756c379dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6086fa7a75ccf19fcc7b89e83a2dbe12ed15c729bb8a4209c18acd54c2070a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865e78637e3692046c9c8a24f5238eb9636bc7e388cea6e0fec3c5c68d21cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://458448bab8d0e21295e859a857d3a3273cc4f430f2fc0fdfb3e10b9a9ab813ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16ede9db49d80344fd2b60c39bf00bcf9dbeabd8918656a55230925c6a6cb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16ede9db49d80344fd2b60c39bf00bcf9dbeabd8918656a55230925c6a6cb60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T09:21:22Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 09:21:22.385723 6456 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 09:21:22.385745 6456 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 09:21:22.385782 6456 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 09:21:22.385800 6456 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1202 09:21:22.385814 6456 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 09:21:22.385845 6456 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 09:21:22.385855 6456 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 09:21:22.385863 6456 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 09:21:22.385868 6456 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1202 09:21:22.385869 6456 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 09:21:22.385878 6456 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 09:21:22.385879 6456 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 09:21:22.385889 6456 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 09:21:22.385894 6456 factory.go:656] Stopping watch factory\\\\nI1202 09:21:22.385908 6456 ovnkube.go:599] Stopped ovnkube\\\\nI1202 09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-x5x7g_openshift-ovn-kubernetes(20ba2af9-1f67-4b6d-884a-666ef4f55bf3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cbd533b4320d178a3ee1bb3d666d5a1ad0a1d7e9f38e58735acac86c590726\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5x7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:32Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.288459 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k2h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be967da7-3e7f-47e6-9d54-408ae99531a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b30e09e5a7f3200cca709bf3437af0ce93cf039451c894bdfd813ae35d3c70c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzmcq
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://735c6d3ed169d7e11f8f6cc0d99fc415020928f852a554288c6c05f3dc99bb3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzmcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:21:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7k2h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:32Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.305252 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:32Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.321582 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:32Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.325292 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.325346 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.325363 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.325387 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.325404 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:32Z","lastTransitionTime":"2025-12-02T09:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.335276 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8157f9-9d69-41d0-8158-f5d166743724\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a36c467812da990b1678a8465bf25d97c4d8226e88ed27eea452b35a75c5dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c09074e0829703890129b7052956f56e1a93cad8d1bcf9e480b3a7277ae930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f99f962bf9ffb2475be15e4fe8d41c3fe3c870d092a50b96a19fe9c759462ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc9e488e47baf2860cd0133d661714b47495618ce410ee8ad0fa9292da3cfdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:32Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.347663 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:32Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.394017 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822d2ca8c4fb83c643f704e9acc8e4034235d2d1e4730a9495aa0a6a8df9979e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:32Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.408028 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dnkgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aa44429-873b-48f6-bc13-55745827d8fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1586e0f941724ff8afcd4030d43ebb517e0782ece160fe634ed828af68b1965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dnkgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:32Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.422689 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8b6p8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bc3d340494b10e89f11eb2dfc63584a9c5660dfe367792a3d422a28dbf5a2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxbk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8b6p8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:32Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.427451 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.427491 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.427504 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.427521 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.427533 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:32Z","lastTransitionTime":"2025-12-02T09:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.435422 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q792g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcdae8ff-3e82-4785-b958-a98717a14787\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vl4xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vl4xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:21:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q792g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:32Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.498875 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.498959 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:21:32 crc kubenswrapper[4781]: E1202 09:21:32.499022 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q792g" podUID="bcdae8ff-3e82-4785-b958-a98717a14787" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.498962 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 09:21:32 crc kubenswrapper[4781]: E1202 09:21:32.499116 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 09:21:32 crc kubenswrapper[4781]: E1202 09:21:32.499211 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.530439 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.530471 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.530507 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.530527 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.530537 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:32Z","lastTransitionTime":"2025-12-02T09:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.633261 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.633328 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.633350 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.633378 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.633401 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:32Z","lastTransitionTime":"2025-12-02T09:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.736094 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.736149 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.736167 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.736190 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.736208 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:32Z","lastTransitionTime":"2025-12-02T09:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.838888 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.839015 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.839040 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.839073 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.839096 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:32Z","lastTransitionTime":"2025-12-02T09:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.941815 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.941879 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.941901 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.941973 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:32 crc kubenswrapper[4781]: I1202 09:21:32.941994 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:32Z","lastTransitionTime":"2025-12-02T09:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.044140 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.044204 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.044219 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.044236 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.044249 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:33Z","lastTransitionTime":"2025-12-02T09:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.146958 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.147019 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.147036 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.147066 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.147091 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:33Z","lastTransitionTime":"2025-12-02T09:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.250689 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.250731 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.250741 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.250756 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.250767 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:33Z","lastTransitionTime":"2025-12-02T09:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.354177 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.354218 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.354229 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.354244 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.354257 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:33Z","lastTransitionTime":"2025-12-02T09:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.456565 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.456651 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.456672 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.456695 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.456711 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:33Z","lastTransitionTime":"2025-12-02T09:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.498673 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:21:33 crc kubenswrapper[4781]: E1202 09:21:33.498853 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.559446 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.559499 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.559517 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.559579 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.559598 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:33Z","lastTransitionTime":"2025-12-02T09:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.661958 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.662043 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.662075 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.662104 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.662125 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:33Z","lastTransitionTime":"2025-12-02T09:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.765330 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.765370 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.765382 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.765397 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.765409 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:33Z","lastTransitionTime":"2025-12-02T09:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.867675 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.867741 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.867765 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.867796 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.867818 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:33Z","lastTransitionTime":"2025-12-02T09:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.969966 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.970009 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.970018 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.970035 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:33 crc kubenswrapper[4781]: I1202 09:21:33.970043 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:33Z","lastTransitionTime":"2025-12-02T09:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.072213 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.072251 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.072263 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.072281 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.072293 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:34Z","lastTransitionTime":"2025-12-02T09:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.175308 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.175376 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.175399 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.175423 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.175440 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:34Z","lastTransitionTime":"2025-12-02T09:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.278385 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.278445 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.278463 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.278485 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.278502 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:34Z","lastTransitionTime":"2025-12-02T09:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.382191 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.382242 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.382258 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.382280 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.382299 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:34Z","lastTransitionTime":"2025-12-02T09:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.484535 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.484581 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.484591 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.484605 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.484616 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:34Z","lastTransitionTime":"2025-12-02T09:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.499036 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.499131 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:21:34 crc kubenswrapper[4781]: E1202 09:21:34.499198 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.499265 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:21:34 crc kubenswrapper[4781]: E1202 09:21:34.499356 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q792g" podUID="bcdae8ff-3e82-4785-b958-a98717a14787" Dec 02 09:21:34 crc kubenswrapper[4781]: E1202 09:21:34.499587 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.586802 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.586833 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.586842 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.586855 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.586863 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:34Z","lastTransitionTime":"2025-12-02T09:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.689037 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.689070 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.689080 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.689093 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.689102 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:34Z","lastTransitionTime":"2025-12-02T09:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.791107 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.791133 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.791140 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.791152 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.791160 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:34Z","lastTransitionTime":"2025-12-02T09:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.894234 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.894270 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.894279 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.894293 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.894302 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:34Z","lastTransitionTime":"2025-12-02T09:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.996723 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.996772 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.996783 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.996798 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:34 crc kubenswrapper[4781]: I1202 09:21:34.996810 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:34Z","lastTransitionTime":"2025-12-02T09:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:35 crc kubenswrapper[4781]: I1202 09:21:35.100437 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:35 crc kubenswrapper[4781]: I1202 09:21:35.100507 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:35 crc kubenswrapper[4781]: I1202 09:21:35.100519 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:35 crc kubenswrapper[4781]: I1202 09:21:35.100536 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:35 crc kubenswrapper[4781]: I1202 09:21:35.100575 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:35Z","lastTransitionTime":"2025-12-02T09:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:35 crc kubenswrapper[4781]: I1202 09:21:35.203323 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:35 crc kubenswrapper[4781]: I1202 09:21:35.203496 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:35 crc kubenswrapper[4781]: I1202 09:21:35.203514 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:35 crc kubenswrapper[4781]: I1202 09:21:35.203568 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:35 crc kubenswrapper[4781]: I1202 09:21:35.203585 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:35Z","lastTransitionTime":"2025-12-02T09:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:35 crc kubenswrapper[4781]: I1202 09:21:35.306873 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:35 crc kubenswrapper[4781]: I1202 09:21:35.307003 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:35 crc kubenswrapper[4781]: I1202 09:21:35.307021 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:35 crc kubenswrapper[4781]: I1202 09:21:35.307076 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:35 crc kubenswrapper[4781]: I1202 09:21:35.307094 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:35Z","lastTransitionTime":"2025-12-02T09:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:35 crc kubenswrapper[4781]: I1202 09:21:35.410463 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:35 crc kubenswrapper[4781]: I1202 09:21:35.410565 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:35 crc kubenswrapper[4781]: I1202 09:21:35.410584 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:35 crc kubenswrapper[4781]: I1202 09:21:35.410656 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:35 crc kubenswrapper[4781]: I1202 09:21:35.410675 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:35Z","lastTransitionTime":"2025-12-02T09:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:35 crc kubenswrapper[4781]: I1202 09:21:35.499700 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:21:35 crc kubenswrapper[4781]: E1202 09:21:35.500146 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 09:21:35 crc kubenswrapper[4781]: I1202 09:21:35.513097 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:35 crc kubenswrapper[4781]: I1202 09:21:35.513163 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:35 crc kubenswrapper[4781]: I1202 09:21:35.513186 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:35 crc kubenswrapper[4781]: I1202 09:21:35.513213 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:35 crc kubenswrapper[4781]: I1202 09:21:35.513236 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:35Z","lastTransitionTime":"2025-12-02T09:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:35 crc kubenswrapper[4781]: I1202 09:21:35.615239 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:35 crc kubenswrapper[4781]: I1202 09:21:35.615270 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:35 crc kubenswrapper[4781]: I1202 09:21:35.615296 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:35 crc kubenswrapper[4781]: I1202 09:21:35.615309 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:35 crc kubenswrapper[4781]: I1202 09:21:35.615317 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:35Z","lastTransitionTime":"2025-12-02T09:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:35 crc kubenswrapper[4781]: I1202 09:21:35.717887 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:35 crc kubenswrapper[4781]: I1202 09:21:35.717974 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:35 crc kubenswrapper[4781]: I1202 09:21:35.717994 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:35 crc kubenswrapper[4781]: I1202 09:21:35.718020 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:35 crc kubenswrapper[4781]: I1202 09:21:35.718037 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:35Z","lastTransitionTime":"2025-12-02T09:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:35 crc kubenswrapper[4781]: I1202 09:21:35.821839 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:35 crc kubenswrapper[4781]: I1202 09:21:35.821892 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:35 crc kubenswrapper[4781]: I1202 09:21:35.821909 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:35 crc kubenswrapper[4781]: I1202 09:21:35.821959 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:35 crc kubenswrapper[4781]: I1202 09:21:35.821979 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:35Z","lastTransitionTime":"2025-12-02T09:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:35 crc kubenswrapper[4781]: I1202 09:21:35.924841 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:35 crc kubenswrapper[4781]: I1202 09:21:35.924905 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:35 crc kubenswrapper[4781]: I1202 09:21:35.924958 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:35 crc kubenswrapper[4781]: I1202 09:21:35.924989 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:35 crc kubenswrapper[4781]: I1202 09:21:35.925010 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:35Z","lastTransitionTime":"2025-12-02T09:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.026664 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.026703 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.026713 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.026727 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.026739 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:36Z","lastTransitionTime":"2025-12-02T09:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.129021 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.129059 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.129073 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.129089 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.129099 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:36Z","lastTransitionTime":"2025-12-02T09:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.231400 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.231457 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.231468 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.231485 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.231497 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:36Z","lastTransitionTime":"2025-12-02T09:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.333340 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.333420 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.333442 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.333470 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.333491 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:36Z","lastTransitionTime":"2025-12-02T09:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.436815 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.436880 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.436904 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.436962 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.436987 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:36Z","lastTransitionTime":"2025-12-02T09:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
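Each setters.go:603 record above embeds the node's Ready condition as a JSON object following condition=. A small sketch, assuming one such record has been captured as a string (the sample below is abridged from the log; splitting on "condition=" is an illustrative parsing choice, not kubelet tooling), that extracts the reason and message:

    import json

    # Abridged copy of a "Node became not ready" record from the log above.
    record = ('Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.436987 4781 '
              'setters.go:603] "Node became not ready" node="crc" '
              'condition={"type":"Ready","status":"False","reason":"KubeletNotReady",'
              '"message":"container runtime network not ready"}')

    # Everything after "condition=" is a valid JSON object.
    condition = json.loads(record.split("condition=", 1)[1])
    print(condition["reason"], "-", condition["message"])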
Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.499198 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.499248 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g"
Dec 02 09:21:36 crc kubenswrapper[4781]: E1202 09:21:36.499414 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.499439 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 02 09:21:36 crc kubenswrapper[4781]: E1202 09:21:36.499596 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q792g" podUID="bcdae8ff-3e82-4785-b958-a98717a14787"
Dec 02 09:21:36 crc kubenswrapper[4781]: E1202 09:21:36.499788 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.502227 4781 scope.go:117] "RemoveContainer" containerID="c16ede9db49d80344fd2b60c39bf00bcf9dbeabd8918656a55230925c6a6cb60"
Dec 02 09:21:36 crc kubenswrapper[4781]: E1202 09:21:36.502817 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-x5x7g_openshift-ovn-kubernetes(20ba2af9-1f67-4b6d-884a-666ef4f55bf3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3"
Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.540109 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.540195 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.540217 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.540245 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.540263 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:36Z","lastTransitionTime":"2025-12-02T09:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.642860 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.642910 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.642954 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.642982 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.643004 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:36Z","lastTransitionTime":"2025-12-02T09:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.746366 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.746433 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.746450 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.746474 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.746493 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:36Z","lastTransitionTime":"2025-12-02T09:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.849324 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.849374 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.849386 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.849406 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.849419 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:36Z","lastTransitionTime":"2025-12-02T09:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.951846 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.951893 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.951905 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.951948 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:36 crc kubenswrapper[4781]: I1202 09:21:36.951960 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:36Z","lastTransitionTime":"2025-12-02T09:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.053907 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.053954 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.053962 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.053994 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.054005 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:37Z","lastTransitionTime":"2025-12-02T09:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.156814 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.156854 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.156865 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.156880 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.156889 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:37Z","lastTransitionTime":"2025-12-02T09:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.259644 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.259703 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.259721 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.259744 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.259762 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:37Z","lastTransitionTime":"2025-12-02T09:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.363150 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.363195 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.363208 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.363230 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.363246 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:37Z","lastTransitionTime":"2025-12-02T09:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.465284 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.465606 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.465618 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.465636 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.465649 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:37Z","lastTransitionTime":"2025-12-02T09:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.498786 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 02 09:21:37 crc kubenswrapper[4781]: E1202 09:21:37.498957 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.515561 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822d2ca8c4fb83c643f704e9acc8e4034235d2d1e4730a9495aa0a6a8df9979e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:37Z is after 2025-08-24T17:21:41Z"
Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.536202 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dnkgc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aa44429-873b-48f6-bc13-55745827d8fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1586e0f941724ff8afcd4030d43ebb517e0782ece160fe634ed828af68b1965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dnkgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:37Z is after 2025-08-24T17:21:41Z"
Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.552419 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8b6p8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bc3d340494b10e89f11eb2dfc63584a9c5660dfe367792a3d422a28dbf5a2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxbk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8b6p8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:37Z is after 2025-08-24T17:21:41Z"
Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.566760 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q792g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcdae8ff-3e82-4785-b958-a98717a14787\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vl4xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vl4xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:21:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q792g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:37Z is after 2025-08-24T17:21:41Z"
Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.573009 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.573057 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.573068 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.573084 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.573094 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:37Z","lastTransitionTime":"2025-12-02T09:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.585564 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8157f9-9d69-41d0-8158-f5d166743724\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a36c467812da990b1678a8465bf25d97c4d8226e88ed27eea452b35a75c5dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c09074e0829703890129b7052956f56e1a93cad8d1bcf9e480b3a7277ae930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f99f962bf9ffb2475be15e4fe8d41c3fe3c870d092a50b96a19fe9c759462ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc9e488e47baf2860cd0133d661714b47495618ce410ee8ad0fa9292da3cfdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:37Z is after 2025-08-24T17:21:41Z"
Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.603565 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:37Z is after 2025-08-24T17:21:41Z"
Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.617498 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c464aa1e840c3a97b63b6c760fca6c21e7a182269a602a688c70525b4bbb39a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:37Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.630584 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a3a44e4812364947f0dd8eed3d3f7252d6e37e5057b9f180eca5b8afc472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9866ea1416094cbfe3f1e850261118f82b1d977589862e7f59e13b33fc24295a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:37Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.641798 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e10258da-dad3-4df8-82c2-9d9438493a3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb3264562e09d949d9dcd64e52ee1fc0825681adc0d91f25505fdefd0a807ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c19e02f947d0d75971d14d988b7c3119365b6ba5348290ee0e23e115e14780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pzntm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:37Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.651584 4781 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-ffk9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04fe11ee-efe6-4b10-a638-021e53367e2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6a45fdfe3335bc38eb30705eca6aaf887b5cd9fac235b5350832bff6df8cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46c6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ffk9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:37Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.668032 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa24d58e-64e7-4469-9611-3dab9c142594\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7af0caf9bd11c42c48bbf4dcd4e719840d2e2dbce55ec205dd2cc103b71772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7035e3038f33a299c00380bc79d4315ec612bd6f4009d55258f02b5c3ec70e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://274dd3f5c42e9217f2a35c99709567f617ef1def1cbd5110cfdc8b5e88bce7ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8beb17df1dddccb32b1d93f03fd20551048bc1
d150615e18bf73c1d364d7bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1936331547e3941bdaae5c90f12874b6fda39f3b6b67f678156915473cc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:37Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.676156 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.676203 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.676218 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.676238 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.676252 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:37Z","lastTransitionTime":"2025-12-02T09:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.681212 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaa19b0e-b6c3-4d56-b32d-a992eb6d3773\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5775ca067ce9fccaf09f228bfcf365fd53919bf4693f5dd7943183d1f14177b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c426b8000b17c761c86ae3152bb08084f63533d41a01fa039f84f0a74b89499e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e25c4d1777f7fbfc56d52dac0ec8b209807c486a294c0e7062a49e58edb520b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ae274f3b640c3661df1109d91d4ea85e85568a686ec292397695e2a7b8703dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ae274f3b640c3661df1109d91d4ea85e85568a686ec292397695e2a7b8703dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:37Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.702207 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://506fd3b326fdaebe64874c208bb15c9cbf5aff36602f221f27f87fe0214ce6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c8c9c60e0354d49255e6b14da6808152f6528f0a51ce992709a7014b9f64d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5c85414a97917ad2db6270144c476b16b50ad3256bf6c694c228b8756c379dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6086fa7a75ccf19fcc7b89e83a2dbe12ed15c729bb8a4209c18acd54c2070a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865e78637e3692046c9c8a24f5238eb9636bc7e388cea6e0fec3c5c68d21cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://458448bab8d0e21295e859a857d3a3273cc4f430f2fc0fdfb3e10b9a9ab813ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16ede9db49d80344fd2b60c39bf00bcf9dbeabd
8918656a55230925c6a6cb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16ede9db49d80344fd2b60c39bf00bcf9dbeabd8918656a55230925c6a6cb60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T09:21:22Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 09:21:22.385723 6456 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 09:21:22.385745 6456 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 09:21:22.385782 6456 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 09:21:22.385800 6456 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1202 09:21:22.385814 6456 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 09:21:22.385845 6456 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 09:21:22.385855 6456 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 09:21:22.385863 6456 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 09:21:22.385868 6456 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1202 09:21:22.385869 6456 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 09:21:22.385878 6456 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 09:21:22.385879 6456 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 09:21:22.385889 6456 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 09:21:22.385894 6456 factory.go:656] Stopping watch factory\\\\nI1202 09:21:22.385908 6456 ovnkube.go:599] Stopped ovnkube\\\\nI1202 09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-x5x7g_openshift-ovn-kubernetes(20ba2af9-1f67-4b6d-884a-666ef4f55bf3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cbd533b4320d178a3ee1bb3d666d5a1ad0a1d7e9f38e58735acac86c590726\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5x7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:37Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.714025 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k2h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be967da7-3e7f-47e6-9d54-408ae99531a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b30e09e5a7f3200cca709bf3437af0ce93cf039451c894bdfd813ae35d3c70c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzmcq
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://735c6d3ed169d7e11f8f6cc0d99fc415020928f852a554288c6c05f3dc99bb3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzmcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:21:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7k2h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:37Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.730169 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba607959-82d4-440b-b830-95eb9584db6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08c7d87bc390c31adc6d7412985f5bbfe535ea6dccb586f9be2823a17bce3e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0090e6856a6d66e284f4f4eecb2d193720318f33a0260d470a50af72c84f6719\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0179b61af209d72b1e1a79835f2819b3af773ab99b0c431733beba8a4dcf2eeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5356a78651fc6cb4214e10b8b25fd20d7f6acb739abdcf2e1de6dd666a31edaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 09:20:56.232387 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 09:20:56.232507 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 09:20:56.233123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3642890392/tls.crt::/tmp/serving-cert-3642890392/tls.key\\\\\\\"\\\\nI1202 09:20:56.583994 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 09:20:56.586003 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 09:20:56.586021 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 09:20:56.586041 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 09:20:56.586046 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 09:20:56.590021 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 09:20:56.590050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590054 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 09:20:56.590065 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 09:20:56.590068 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 09:20:56.590073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 09:20:56.590248 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 09:20:56.592236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a99d096eb7cf9386a857100ae0faff8a3b17140eaada581e5511a88aadc0f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:37Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.740617 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kgbn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9440f972-ed59-4852-a180-3d5a2111f966\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eda6361552bc105ca2a6cc3444d96fbf79b10eff612e87aa22c1559fa8c90fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hxs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kgbn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:37Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.778396 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.778425 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.778435 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.778446 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.778454 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:37Z","lastTransitionTime":"2025-12-02T09:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.789055 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:37Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.803914 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:37Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.880907 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.880961 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.880970 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.880984 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.880995 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:37Z","lastTransitionTime":"2025-12-02T09:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.983578 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.983630 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.983645 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.983667 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:37 crc kubenswrapper[4781]: I1202 09:21:37.983685 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:37Z","lastTransitionTime":"2025-12-02T09:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:38 crc kubenswrapper[4781]: I1202 09:21:38.085615 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:38 crc kubenswrapper[4781]: I1202 09:21:38.085644 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:38 crc kubenswrapper[4781]: I1202 09:21:38.085651 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:38 crc kubenswrapper[4781]: I1202 09:21:38.085664 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:38 crc kubenswrapper[4781]: I1202 09:21:38.085673 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:38Z","lastTransitionTime":"2025-12-02T09:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:38 crc kubenswrapper[4781]: I1202 09:21:38.187388 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:38 crc kubenswrapper[4781]: I1202 09:21:38.187416 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:38 crc kubenswrapper[4781]: I1202 09:21:38.187425 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:38 crc kubenswrapper[4781]: I1202 09:21:38.187437 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:38 crc kubenswrapper[4781]: I1202 09:21:38.187444 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:38Z","lastTransitionTime":"2025-12-02T09:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:38 crc kubenswrapper[4781]: I1202 09:21:38.290377 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:38 crc kubenswrapper[4781]: I1202 09:21:38.290422 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:38 crc kubenswrapper[4781]: I1202 09:21:38.290438 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:38 crc kubenswrapper[4781]: I1202 09:21:38.290459 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:38 crc kubenswrapper[4781]: I1202 09:21:38.290475 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:38Z","lastTransitionTime":"2025-12-02T09:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:38 crc kubenswrapper[4781]: I1202 09:21:38.394322 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:38 crc kubenswrapper[4781]: I1202 09:21:38.394391 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:38 crc kubenswrapper[4781]: I1202 09:21:38.394409 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:38 crc kubenswrapper[4781]: I1202 09:21:38.394435 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:38 crc kubenswrapper[4781]: I1202 09:21:38.394452 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:38Z","lastTransitionTime":"2025-12-02T09:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:38 crc kubenswrapper[4781]: I1202 09:21:38.497559 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:38 crc kubenswrapper[4781]: I1202 09:21:38.497597 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:38 crc kubenswrapper[4781]: I1202 09:21:38.497605 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:38 crc kubenswrapper[4781]: I1202 09:21:38.497619 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:38 crc kubenswrapper[4781]: I1202 09:21:38.497632 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:38Z","lastTransitionTime":"2025-12-02T09:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:38 crc kubenswrapper[4781]: I1202 09:21:38.499362 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:21:38 crc kubenswrapper[4781]: E1202 09:21:38.499467 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 09:21:38 crc kubenswrapper[4781]: I1202 09:21:38.499655 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 09:21:38 crc kubenswrapper[4781]: E1202 09:21:38.499726 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 09:21:38 crc kubenswrapper[4781]: I1202 09:21:38.499911 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:21:38 crc kubenswrapper[4781]: E1202 09:21:38.499988 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q792g" podUID="bcdae8ff-3e82-4785-b958-a98717a14787" Dec 02 09:21:38 crc kubenswrapper[4781]: I1202 09:21:38.599839 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:38 crc kubenswrapper[4781]: I1202 09:21:38.599890 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:38 crc kubenswrapper[4781]: I1202 09:21:38.599913 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:38 crc kubenswrapper[4781]: I1202 09:21:38.599983 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:38 crc kubenswrapper[4781]: I1202 09:21:38.600009 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:38Z","lastTransitionTime":"2025-12-02T09:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:38 crc kubenswrapper[4781]: I1202 09:21:38.703465 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:38 crc kubenswrapper[4781]: I1202 09:21:38.703525 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:38 crc kubenswrapper[4781]: I1202 09:21:38.703541 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:38 crc kubenswrapper[4781]: I1202 09:21:38.703566 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:38 crc kubenswrapper[4781]: I1202 09:21:38.703584 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:38Z","lastTransitionTime":"2025-12-02T09:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:38 crc kubenswrapper[4781]: I1202 09:21:38.813689 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:38 crc kubenswrapper[4781]: I1202 09:21:38.813783 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:38 crc kubenswrapper[4781]: I1202 09:21:38.813805 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:38 crc kubenswrapper[4781]: I1202 09:21:38.813836 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:38 crc kubenswrapper[4781]: I1202 09:21:38.813858 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:38Z","lastTransitionTime":"2025-12-02T09:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:38 crc kubenswrapper[4781]: I1202 09:21:38.916275 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:38 crc kubenswrapper[4781]: I1202 09:21:38.916337 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:38 crc kubenswrapper[4781]: I1202 09:21:38.916354 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:38 crc kubenswrapper[4781]: I1202 09:21:38.916378 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:38 crc kubenswrapper[4781]: I1202 09:21:38.916398 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:38Z","lastTransitionTime":"2025-12-02T09:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.020368 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.020431 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.020449 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.020473 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.020491 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:39Z","lastTransitionTime":"2025-12-02T09:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.124302 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.124351 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.124359 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.124378 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.124388 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:39Z","lastTransitionTime":"2025-12-02T09:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.226896 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.226952 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.226962 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.226979 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.226989 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:39Z","lastTransitionTime":"2025-12-02T09:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.330574 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.330816 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.330843 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.330866 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.330883 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:39Z","lastTransitionTime":"2025-12-02T09:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.432699 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.432778 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.432805 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.432825 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.432839 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:39Z","lastTransitionTime":"2025-12-02T09:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.499116 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:21:39 crc kubenswrapper[4781]: E1202 09:21:39.499339 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.536069 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.536133 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.536150 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.536176 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.536193 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:39Z","lastTransitionTime":"2025-12-02T09:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.639466 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.639511 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.639523 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.639542 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.639554 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:39Z","lastTransitionTime":"2025-12-02T09:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.742296 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.742355 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.742371 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.742393 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.742410 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:39Z","lastTransitionTime":"2025-12-02T09:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.845004 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.845038 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.845046 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.845060 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.845070 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:39Z","lastTransitionTime":"2025-12-02T09:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.897371 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.897408 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.897417 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.897429 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.897438 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:39Z","lastTransitionTime":"2025-12-02T09:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:39 crc kubenswrapper[4781]: E1202 09:21:39.910215 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a8aab02d-934e-4a61-8d03-a223ac62150b\\\",\\\"systemUUID\\\":\\\"c8f9d245-0a2e-447a-a09e-ff80f79ba02f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:39Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.914597 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.914656 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.914668 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.914734 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.914747 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:39Z","lastTransitionTime":"2025-12-02T09:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:39 crc kubenswrapper[4781]: E1202 09:21:39.930051 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a8aab02d-934e-4a61-8d03-a223ac62150b\\\",\\\"systemUUID\\\":\\\"c8f9d245-0a2e-447a-a09e-ff80f79ba02f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:39Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.933749 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.933779 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.933787 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.933800 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.933809 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:39Z","lastTransitionTime":"2025-12-02T09:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:39 crc kubenswrapper[4781]: E1202 09:21:39.948187 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a8aab02d-934e-4a61-8d03-a223ac62150b\\\",\\\"systemUUID\\\":\\\"c8f9d245-0a2e-447a-a09e-ff80f79ba02f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:39Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.952184 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.952262 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.952308 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.952340 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.952360 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:39Z","lastTransitionTime":"2025-12-02T09:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:39 crc kubenswrapper[4781]: E1202 09:21:39.973327 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a8aab02d-934e-4a61-8d03-a223ac62150b\\\",\\\"systemUUID\\\":\\\"c8f9d245-0a2e-447a-a09e-ff80f79ba02f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:39Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.977595 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.977626 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.977634 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.977646 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.977656 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:39Z","lastTransitionTime":"2025-12-02T09:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:39 crc kubenswrapper[4781]: E1202 09:21:39.991254 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a8aab02d-934e-4a61-8d03-a223ac62150b\\\",\\\"systemUUID\\\":\\\"c8f9d245-0a2e-447a-a09e-ff80f79ba02f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:39Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:39 crc kubenswrapper[4781]: E1202 09:21:39.991400 4781 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.993170 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.993204 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.993216 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.993232 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:39 crc kubenswrapper[4781]: I1202 09:21:39.993245 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:39Z","lastTransitionTime":"2025-12-02T09:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:40 crc kubenswrapper[4781]: I1202 09:21:40.095865 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:40 crc kubenswrapper[4781]: I1202 09:21:40.095906 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:40 crc kubenswrapper[4781]: I1202 09:21:40.095914 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:40 crc kubenswrapper[4781]: I1202 09:21:40.095962 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:40 crc kubenswrapper[4781]: I1202 09:21:40.095972 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:40Z","lastTransitionTime":"2025-12-02T09:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:40 crc kubenswrapper[4781]: I1202 09:21:40.198998 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:40 crc kubenswrapper[4781]: I1202 09:21:40.199045 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:40 crc kubenswrapper[4781]: I1202 09:21:40.199057 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:40 crc kubenswrapper[4781]: I1202 09:21:40.199072 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:40 crc kubenswrapper[4781]: I1202 09:21:40.199083 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:40Z","lastTransitionTime":"2025-12-02T09:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:40 crc kubenswrapper[4781]: I1202 09:21:40.301109 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:40 crc kubenswrapper[4781]: I1202 09:21:40.301140 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:40 crc kubenswrapper[4781]: I1202 09:21:40.301148 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:40 crc kubenswrapper[4781]: I1202 09:21:40.301161 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:40 crc kubenswrapper[4781]: I1202 09:21:40.301171 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:40Z","lastTransitionTime":"2025-12-02T09:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:40 crc kubenswrapper[4781]: I1202 09:21:40.405946 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:40 crc kubenswrapper[4781]: I1202 09:21:40.405990 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:40 crc kubenswrapper[4781]: I1202 09:21:40.406006 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:40 crc kubenswrapper[4781]: I1202 09:21:40.406021 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:40 crc kubenswrapper[4781]: I1202 09:21:40.406031 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:40Z","lastTransitionTime":"2025-12-02T09:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:40 crc kubenswrapper[4781]: I1202 09:21:40.498826 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:21:40 crc kubenswrapper[4781]: I1202 09:21:40.498855 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:21:40 crc kubenswrapper[4781]: I1202 09:21:40.498838 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 09:21:40 crc kubenswrapper[4781]: E1202 09:21:40.499022 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q792g" podUID="bcdae8ff-3e82-4785-b958-a98717a14787" Dec 02 09:21:40 crc kubenswrapper[4781]: E1202 09:21:40.499115 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 09:21:40 crc kubenswrapper[4781]: E1202 09:21:40.499231 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 09:21:40 crc kubenswrapper[4781]: I1202 09:21:40.508385 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:40 crc kubenswrapper[4781]: I1202 09:21:40.508409 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:40 crc kubenswrapper[4781]: I1202 09:21:40.508417 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:40 crc kubenswrapper[4781]: I1202 09:21:40.508430 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:40 crc kubenswrapper[4781]: I1202 09:21:40.508438 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:40Z","lastTransitionTime":"2025-12-02T09:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:40 crc kubenswrapper[4781]: I1202 09:21:40.610591 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:40 crc kubenswrapper[4781]: I1202 09:21:40.610630 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:40 crc kubenswrapper[4781]: I1202 09:21:40.610641 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:40 crc kubenswrapper[4781]: I1202 09:21:40.610655 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:40 crc kubenswrapper[4781]: I1202 09:21:40.610665 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:40Z","lastTransitionTime":"2025-12-02T09:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:40 crc kubenswrapper[4781]: I1202 09:21:40.713166 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:40 crc kubenswrapper[4781]: I1202 09:21:40.713206 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:40 crc kubenswrapper[4781]: I1202 09:21:40.713214 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:40 crc kubenswrapper[4781]: I1202 09:21:40.713229 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:40 crc kubenswrapper[4781]: I1202 09:21:40.713238 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:40Z","lastTransitionTime":"2025-12-02T09:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:40 crc kubenswrapper[4781]: I1202 09:21:40.815347 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:40 crc kubenswrapper[4781]: I1202 09:21:40.815392 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:40 crc kubenswrapper[4781]: I1202 09:21:40.815401 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:40 crc kubenswrapper[4781]: I1202 09:21:40.815415 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:40 crc kubenswrapper[4781]: I1202 09:21:40.815424 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:40Z","lastTransitionTime":"2025-12-02T09:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:40 crc kubenswrapper[4781]: I1202 09:21:40.918052 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:40 crc kubenswrapper[4781]: I1202 09:21:40.918092 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:40 crc kubenswrapper[4781]: I1202 09:21:40.918102 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:40 crc kubenswrapper[4781]: I1202 09:21:40.918117 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:40 crc kubenswrapper[4781]: I1202 09:21:40.918125 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:40Z","lastTransitionTime":"2025-12-02T09:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.020331 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.020375 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.020383 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.020396 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.020406 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:41Z","lastTransitionTime":"2025-12-02T09:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.122738 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.122796 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.122812 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.122838 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.122857 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:41Z","lastTransitionTime":"2025-12-02T09:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.224869 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.224894 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.224902 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.224916 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.224936 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:41Z","lastTransitionTime":"2025-12-02T09:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.327461 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.327500 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.327510 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.327527 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.327538 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:41Z","lastTransitionTime":"2025-12-02T09:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.430464 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.430525 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.430538 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.430560 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.430575 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:41Z","lastTransitionTime":"2025-12-02T09:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.499570 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:21:41 crc kubenswrapper[4781]: E1202 09:21:41.499765 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.533257 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.533310 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.533329 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.533352 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.533371 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:41Z","lastTransitionTime":"2025-12-02T09:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.635980 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.636026 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.636039 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.636057 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.636068 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:41Z","lastTransitionTime":"2025-12-02T09:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.738895 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.738951 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.738962 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.738978 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.738988 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:41Z","lastTransitionTime":"2025-12-02T09:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.841488 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.841538 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.841552 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.841573 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.841585 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:41Z","lastTransitionTime":"2025-12-02T09:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.943994 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.944040 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.944053 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.944070 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:41 crc kubenswrapper[4781]: I1202 09:21:41.944081 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:41Z","lastTransitionTime":"2025-12-02T09:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.046428 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.046478 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.046489 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.046506 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.046517 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:42Z","lastTransitionTime":"2025-12-02T09:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.148653 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.148710 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.148722 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.148742 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.148754 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:42Z","lastTransitionTime":"2025-12-02T09:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.251183 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.251220 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.251230 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.251245 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.251256 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:42Z","lastTransitionTime":"2025-12-02T09:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.277573 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bcdae8ff-3e82-4785-b958-a98717a14787-metrics-certs\") pod \"network-metrics-daemon-q792g\" (UID: \"bcdae8ff-3e82-4785-b958-a98717a14787\") " pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:21:42 crc kubenswrapper[4781]: E1202 09:21:42.277759 4781 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 09:21:42 crc kubenswrapper[4781]: E1202 09:21:42.277837 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcdae8ff-3e82-4785-b958-a98717a14787-metrics-certs podName:bcdae8ff-3e82-4785-b958-a98717a14787 nodeName:}" failed. No retries permitted until 2025-12-02 09:22:14.277818879 +0000 UTC m=+97.101692748 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bcdae8ff-3e82-4785-b958-a98717a14787-metrics-certs") pod "network-metrics-daemon-q792g" (UID: "bcdae8ff-3e82-4785-b958-a98717a14787") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.354278 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.354385 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.354417 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.354447 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.354469 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:42Z","lastTransitionTime":"2025-12-02T09:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.456890 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.456936 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.456946 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.456960 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.456972 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:42Z","lastTransitionTime":"2025-12-02T09:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.499332 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.499372 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.499418 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:21:42 crc kubenswrapper[4781]: E1202 09:21:42.499473 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 09:21:42 crc kubenswrapper[4781]: E1202 09:21:42.499614 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 09:21:42 crc kubenswrapper[4781]: E1202 09:21:42.499671 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q792g" podUID="bcdae8ff-3e82-4785-b958-a98717a14787" Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.559098 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.559138 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.559147 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.559161 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.559170 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:42Z","lastTransitionTime":"2025-12-02T09:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.661155 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.661194 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.661203 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.661217 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.661227 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:42Z","lastTransitionTime":"2025-12-02T09:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.763902 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.763971 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.763984 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.764001 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.764013 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:42Z","lastTransitionTime":"2025-12-02T09:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.866101 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.866153 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.866164 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.866179 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.866190 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:42Z","lastTransitionTime":"2025-12-02T09:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.968596 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.968633 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.968642 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.968655 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:42 crc kubenswrapper[4781]: I1202 09:21:42.968664 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:42Z","lastTransitionTime":"2025-12-02T09:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.071031 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.071067 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.071079 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.071094 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.071147 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:43Z","lastTransitionTime":"2025-12-02T09:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.174576 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.174624 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.174636 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.174653 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.174665 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:43Z","lastTransitionTime":"2025-12-02T09:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.276572 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.276617 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.276630 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.276644 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.276656 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:43Z","lastTransitionTime":"2025-12-02T09:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.378958 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.379003 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.379012 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.379029 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.379039 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:43Z","lastTransitionTime":"2025-12-02T09:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.481678 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.481714 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.481725 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.481748 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.481759 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:43Z","lastTransitionTime":"2025-12-02T09:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.499149 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:21:43 crc kubenswrapper[4781]: E1202 09:21:43.499304 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.583963 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.584024 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.584043 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.584065 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.584082 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:43Z","lastTransitionTime":"2025-12-02T09:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.686073 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.686134 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.686151 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.686178 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.686196 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:43Z","lastTransitionTime":"2025-12-02T09:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.789641 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.789727 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.789746 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.789772 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.789793 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:43Z","lastTransitionTime":"2025-12-02T09:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.852591 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8b6p8_d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650/kube-multus/0.log" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.852640 4781 generic.go:334] "Generic (PLEG): container finished" podID="d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650" containerID="9bc3d340494b10e89f11eb2dfc63584a9c5660dfe367792a3d422a28dbf5a2e9" exitCode=1 Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.852669 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8b6p8" event={"ID":"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650","Type":"ContainerDied","Data":"9bc3d340494b10e89f11eb2dfc63584a9c5660dfe367792a3d422a28dbf5a2e9"} Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.853037 4781 scope.go:117] "RemoveContainer" containerID="9bc3d340494b10e89f11eb2dfc63584a9c5660dfe367792a3d422a28dbf5a2e9" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.872002 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:43Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.888985 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:43Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.891471 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.891508 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.891520 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.891535 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.891554 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:43Z","lastTransitionTime":"2025-12-02T09:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
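The status patches above are rejected because the pod.network-node-identity.openshift.io webhook presents a serving certificate that expired on 2025-08-24, while the node clock reads 2025-12-02 (typical of a CRC image resumed long after it was built). One way to confirm what the webhook endpoint serves, assuming shell access to the node (address and port come from the log; the openssl check is a generic technique, not something this log shows being run):

    # Print the validity window of the certificate on 127.0.0.1:9743
    openssl s_client -connect 127.0.0.1:9743 </dev/null 2>/dev/null \
      | openssl x509 -noout -dates
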
Has your network provider started?"} Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.908208 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8157f9-9d69-41d0-8158-f5d166743724\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a36c467812da990b1678a8465bf25d97c4d8226e88ed27eea452b35a75c5dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c09074e0829703890129b7052956f56e1a93cad8d1bcf9e480b3a7277ae930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f99f962bf9ffb2475be15e4fe8d41c3fe3c870d092a50b96a19fe9c759462ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc9e488e47baf2860cd0133d661714b47495618ce410ee8ad0fa9292da3cfdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:43Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.923461 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:43Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.935739 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822d2ca8c4fb83c643f704e9acc8e4034235d2d1e4730a9495aa0a6a8df9979e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:43Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.951263 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dnkgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aa44429-873b-48f6-bc13-55745827d8fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1586e0f941724ff8afcd4030d43ebb517e0782ece160fe634ed828af68b1965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dnkgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:43Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.963324 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8b6p8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bc3d340494b10e89f11eb2dfc63584a9c5660dfe367792a3d422a28dbf5a2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bc3d340494b10e89f11eb2dfc63584a9c5660dfe367792a3d422a28dbf5a2e9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T09:21:43Z\\\",\\\"message\\\":\\\"2025-12-02T09:20:58+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3fddf671-788e-42f6-aaea-cc2b46cff00f\\\\n2025-12-02T09:20:58+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3fddf671-788e-42f6-aaea-cc2b46cff00f to /host/opt/cni/bin/\\\\n2025-12-02T09:20:58Z [verbose] multus-daemon started\\\\n2025-12-02T09:20:58Z [verbose] Readiness Indicator file check\\\\n2025-12-02T09:21:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxbk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8b6p8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:43Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.973155 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q792g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcdae8ff-3e82-4785-b958-a98717a14787\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vl4xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vl4xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:21:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q792g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:43Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.982084 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ffk9s" err="failed to patch status 
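The kube-multus termination message above shows the shape of the failure: the daemon started, then polled for the default network's readiness-indicator file (/host/run/multus/cni/net.d/10-ovn-kubernetes.conf) until it gave up with "timed out waiting for the condition" and exited 1. A minimal sketch of that kind of poll loop, assuming plain stdlib rather than the apimachinery wait.PollImmediate helper multus actually uses; the 45s budget is an assumption read off the observed 09:20:58 to 09:21:43 window:

```go
package main

import (
	"errors"
	"fmt"
	"os"
	"time"
)

// waitForReadinessIndicator polls for the CNI config file that signals the
// default network is ready, failing once the timeout elapses.
func waitForReadinessIndicator(path string, interval, timeout time.Duration) error {
	deadline := time.Now().Add(timeout)
	for {
		if _, err := os.Stat(path); err == nil {
			return nil // default network is ready
		}
		if time.Now().After(deadline) {
			return errors.New("timed out waiting for the condition")
		}
		time.Sleep(interval)
	}
}

func main() {
	// Path taken from the kube-multus log message above.
	err := waitForReadinessIndicator(
		"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf",
		time.Second, 45*time.Second,
	)
	if err != nil {
		fmt.Println("readiness indicator:", err)
	}
}
```

The file never appears because ovn-kubernetes (the default network) is itself not up, which is consistent with the NetworkReady=false condition recorded below.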
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04fe11ee-efe6-4b10-a638-021e53367e2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6a45fdfe3335bc38eb30705eca6aaf887b5cd9fac235b5350832bff6df8cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46c6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ffk9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:43Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.994490 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.994513 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.994520 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.994532 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:43 crc kubenswrapper[4781]: I1202 09:21:43.994540 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:43Z","lastTransitionTime":"2025-12-02T09:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.000224 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa24d58e-64e7-4469-9611-3dab9c142594\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7af0caf9bd11c42c48bbf4dcd4e719840d2e2dbce55ec205dd2cc103b71772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7035e3038f33a299c00380bc79d4315ec612bd6f4009d55258f02b5c3ec70e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://274dd3f5c42e9217f2a35c99709567f617ef1def1cbd5110cfdc8b5e88bce7ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8beb17df1dddccb32b1d93f03fd20551048bc1d150615e18bf73c1d364d7bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1936331547e3941bdaae5c90f12874b6fda39f3b6b67f678156915473cc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:43Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.014320 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
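For completeness, the failing call itself is an HTTPS POST to the admission endpoint with a 10s timeout, as the error string records. A minimal sketch, assuming plain net/http rather than the kubelet's actual client machinery, that reproduces the same handshake-time failure — with default certificate verification enabled, the expired serving cert surfaces as the x509 error before any payload is sent:

```go
package main

import (
	"fmt"
	"net/http"
	"strings"
	"time"
)

func main() {
	client := &http.Client{Timeout: 10 * time.Second}
	// Empty JSON body as a placeholder AdmissionReview; verification
	// fails during the TLS handshake, so the payload never matters.
	resp, err := client.Post("https://127.0.0.1:9743/pod?timeout=10s",
		"application/json", strings.NewReader("{}"))
	if err != nil {
		// Expect: x509: certificate has expired or is not yet valid
		fmt.Println(err)
		return
	}
	resp.Body.Close()
}
```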
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaa19b0e-b6c3-4d56-b32d-a992eb6d3773\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5775ca067ce9fccaf09f228bfcf365fd53919bf4693f5dd7943183d1f14177b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c426b8000b17c761c86ae3152bb08084f63533d41a01fa039f84f0a74b89499e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e25c4d1777f7fbfc56d52dac0ec8b209807c486a294c0e7062a49e58edb520b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ae274f3b640c3661df1109d91d4ea85e85568a686ec292397695e2a7b8703dc\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ae274f3b640c3661df1109d91d4ea85e85568a686ec292397695e2a7b8703dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:44Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.031987 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c464aa1e840c3a97b63b6c760fca6c21e7a182269a602a688c70525b4bbb39a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:44Z is after 
2025-08-24T17:21:41Z" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.044144 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a3a44e4812364947f0dd8eed3d3f7252d6e37e5057b9f180eca5b8afc472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9866ea1416094cbfe3f1e850261118f82b1d977589862e7f59e13b33fc24295a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:44Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.057760 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" err="failed to patch status 
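Notably, the network-node-identity-vrzqb pod whose status patch just failed is the pod that runs the failing webhook itself: its webhook container is Running and mounts its serving cert at /etc/webhook-cert/, so the endpoint answers but presents a stale certificate. A minimal sketch of checking that mounted cert on disk; the tls.crt file name is an assumption (the conventional kubernetes.io/tls secret key), not something the log confirms:

```go
package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"log"
	"os"
	"time"
)

func main() {
	// Assumed path: the webhook-cert mount shown in the pod status above,
	// plus the conventional tls.crt key name.
	data, err := os.ReadFile("/etc/webhook-cert/tls.crt")
	if err != nil {
		log.Fatal(err)
	}
	block, _ := pem.Decode(data)
	if block == nil {
		log.Fatal("no PEM block found")
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("notAfter: %s (expired now: %t)\n",
		cert.NotAfter.Format(time.RFC3339), time.Now().After(cert.NotAfter))
}
```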
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e10258da-dad3-4df8-82c2-9d9438493a3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb3264562e09d949d9dcd64e52ee1fc0825681adc0d91f25505fdefd0a807ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c19e02f947d0d75971d14d988b7c3119365b6ba5348290ee0e23e115e14780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pzntm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:44Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.074280 4781 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba607959-82d4-440b-b830-95eb9584db6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08c7d87bc390c31adc6d7412985f5bbfe535ea6dccb586f9be2823a17bce3e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0090e6856a6d66e284f4f4eecb2d193720318f33a0260d470a50af72c84f6719\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0179b61af209d72b1e1a79835f2819b3af773ab99b0c431733beba8a4dcf2eeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5356a78651fc6cb4214e10b8b25fd20d7f6acb739ab
dcf2e1de6dd666a31edaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 09:20:56.232387 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 09:20:56.232507 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 09:20:56.233123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3642890392/tls.crt::/tmp/serving-cert-3642890392/tls.key\\\\\\\"\\\\nI1202 09:20:56.583994 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 09:20:56.586003 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 09:20:56.586021 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 09:20:56.586041 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 09:20:56.586046 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 09:20:56.590021 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 09:20:56.590050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590054 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 09:20:56.590065 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 09:20:56.590068 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 09:20:56.590073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 09:20:56.590248 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 09:20:56.592236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a99d096eb7cf9386a857100ae0faff8a3b17140eaada581e5511a88aadc0f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:44Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.084338 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kgbn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9440f972-ed59-4852-a180-3d5a2111f966\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eda6361552bc105ca2a6cc3444d96fbf79b10eff612e87aa22c1559fa8c90fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hxs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kgbn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:44Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.096499 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.096527 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.096536 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.096548 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.096558 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:44Z","lastTransitionTime":"2025-12-02T09:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.104153 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://506fd3b326fdaebe64874c208bb15c9cbf5aff36602f221f27f87fe0214ce6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c8c9c60e0354d49255e6b14da6808152f6528f0a51ce992709a7014b9f64d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5c85414a97917ad2db6270144c476b16b50ad3256bf6c694c228b8756c379dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6086fa7a75ccf19fcc7b89e83a2dbe12ed15c729bb8a4209c18acd54c2070a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865e78637e3692046c9c8a24f5238eb9636bc7e388cea6e0fec3c5c68d21cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://458448bab8d0e21295e859a857d3a3273cc4f430f2fc0fdfb3e10b9a9ab813ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574532
65a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16ede9db49d80344fd2b60c39bf00bcf9dbeabd8918656a55230925c6a6cb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16ede9db49d80344fd2b60c39bf00bcf9dbeabd8918656a55230925c6a6cb60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T09:21:22Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 09:21:22.385723 6456 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 09:21:22.385745 6456 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 09:21:22.385782 6456 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 09:21:22.385800 6456 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1202 09:21:22.385814 6456 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 09:21:22.385845 6456 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 09:21:22.385855 6456 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 09:21:22.385863 6456 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 09:21:22.385868 6456 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1202 09:21:22.385869 6456 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 09:21:22.385878 6456 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 09:21:22.385879 6456 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 09:21:22.385889 6456 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 09:21:22.385894 6456 factory.go:656] Stopping watch factory\\\\nI1202 09:21:22.385908 6456 ovnkube.go:599] Stopped ovnkube\\\\nI1202 09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed 
container=ovnkube-controller pod=ovnkube-node-x5x7g_openshift-ovn-kubernetes(20ba2af9-1f67-4b6d-884a-666ef4f55bf3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cbd533b4320d178a3ee1bb3d666d5a1ad0a1d7e9f38e58735acac86c590726\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5x7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:44Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.114382 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k2h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be967da7-3e7f-47e6-9d54-408ae99531a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b30e09e5a7f3200cca709bf3437af0ce93cf039451c894bdfd813ae35d3c70c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-dzmcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://735c6d3ed169d7e11f8f6cc0d99fc415020928f852a554288c6c05f3dc99bb3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzmcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:21:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7k2h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:44Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.199540 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.199578 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.199588 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.199603 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.199614 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:44Z","lastTransitionTime":"2025-12-02T09:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.307091 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.307141 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.307152 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.307168 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.307179 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:44Z","lastTransitionTime":"2025-12-02T09:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.409173 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.409201 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.409211 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.409223 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.409233 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:44Z","lastTransitionTime":"2025-12-02T09:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.499038 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.499062 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.499072 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:21:44 crc kubenswrapper[4781]: E1202 09:21:44.499179 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 09:21:44 crc kubenswrapper[4781]: E1202 09:21:44.499275 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q792g" podUID="bcdae8ff-3e82-4785-b958-a98717a14787" Dec 02 09:21:44 crc kubenswrapper[4781]: E1202 09:21:44.499361 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.511402 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.511429 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.511439 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.511454 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.511463 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:44Z","lastTransitionTime":"2025-12-02T09:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.613789 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.613841 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.613853 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.613869 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.613884 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:44Z","lastTransitionTime":"2025-12-02T09:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.715977 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.716018 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.716030 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.716046 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.716057 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:44Z","lastTransitionTime":"2025-12-02T09:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.818313 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.818350 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.818358 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.818372 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.818381 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:44Z","lastTransitionTime":"2025-12-02T09:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.857040 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8b6p8_d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650/kube-multus/0.log" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.857089 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8b6p8" event={"ID":"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650","Type":"ContainerStarted","Data":"a029cea19e689697b3e833a0bd6c745475464429e227868f4cabed7c7fe3ec71"} Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.880608 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://506fd3b326fdaebe64874c208bb15c9cbf5aff36602f221f27f87fe0214ce6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c8c9c60e0354d49255e6b14da6808152f6528f0a51ce992709a7014b9f64d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5c85414a97917ad2db6270144c476b16b50ad3256bf6c694c228b8756c379dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6086fa7a75ccf19fcc7b89e83a2dbe12ed15c729bb8a4209c18acd54c2070a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865e78637e3692046c9c8a24f5238eb9636bc7e388cea6e0fec3c5c68d21cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://458448bab8d0e21295e859a857d3a3273cc4f430f2fc0fdfb3e10b9a9ab813ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16ede9db49d80344fd2b60c39bf00bcf9dbeabd
8918656a55230925c6a6cb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16ede9db49d80344fd2b60c39bf00bcf9dbeabd8918656a55230925c6a6cb60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T09:21:22Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 09:21:22.385723 6456 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 09:21:22.385745 6456 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 09:21:22.385782 6456 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 09:21:22.385800 6456 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1202 09:21:22.385814 6456 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 09:21:22.385845 6456 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 09:21:22.385855 6456 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 09:21:22.385863 6456 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 09:21:22.385868 6456 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1202 09:21:22.385869 6456 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 09:21:22.385878 6456 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 09:21:22.385879 6456 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 09:21:22.385889 6456 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 09:21:22.385894 6456 factory.go:656] Stopping watch factory\\\\nI1202 09:21:22.385908 6456 ovnkube.go:599] Stopped ovnkube\\\\nI1202 09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-x5x7g_openshift-ovn-kubernetes(20ba2af9-1f67-4b6d-884a-666ef4f55bf3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cbd533b4320d178a3ee1bb3d666d5a1ad0a1d7e9f38e58735acac86c590726\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5x7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:44Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.898243 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k2h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be967da7-3e7f-47e6-9d54-408ae99531a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b30e09e5a7f3200cca709bf3437af0ce93cf039451c894bdfd813ae35d3c70c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzmcq
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://735c6d3ed169d7e11f8f6cc0d99fc415020928f852a554288c6c05f3dc99bb3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzmcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:21:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7k2h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:44Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.918472 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba607959-82d4-440b-b830-95eb9584db6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08c7d87bc390c31adc6d7412985f5bbfe535ea6dccb586f9be2823a17bce3e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0090e6856a6d66e284f4f4eecb2d193720318f33a0260d470a50af72c84f6719\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0179b61af209d72b1e1a79835f2819b3af773ab99b0c431733beba8a4dcf2eeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5356a78651fc6cb4214e10b8b25fd20d7f6acb739abdcf2e1de6dd666a31edaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 09:20:56.232387 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 09:20:56.232507 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 09:20:56.233123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3642890392/tls.crt::/tmp/serving-cert-3642890392/tls.key\\\\\\\"\\\\nI1202 09:20:56.583994 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 09:20:56.586003 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 09:20:56.586021 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 09:20:56.586041 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 09:20:56.586046 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 09:20:56.590021 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 09:20:56.590050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590054 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 09:20:56.590065 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 09:20:56.590068 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 09:20:56.590073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 09:20:56.590248 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 09:20:56.592236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a99d096eb7cf9386a857100ae0faff8a3b17140eaada581e5511a88aadc0f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:44Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.920232 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.920277 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.920290 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.920307 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.920319 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:44Z","lastTransitionTime":"2025-12-02T09:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.933139 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kgbn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9440f972-ed59-4852-a180-3d5a2111f966\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eda6361552bc105ca2a6cc3444d96fbf79b10eff612e87aa22c1559fa8c90fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hxs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kgbn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:44Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.947367 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:44Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.960157 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:44Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.972881 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822d2ca8c4fb83c643f704e9acc8e4034235d2d1e4730a9495aa0a6a8df9979e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:44Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:44 crc kubenswrapper[4781]: I1202 09:21:44.990702 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dnkgc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aa44429-873b-48f6-bc13-55745827d8fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1586e0f941724ff8afcd4030d43ebb517e0782ece160fe634ed828af68b1965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\
\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0c
cdd4a00355c27ab2e91a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dnkgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:44Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.004513 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8b6p8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a029cea19e689697b3e833a0bd6c745475464429e227868f4cabed7c7fe3ec71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bc3d340494b10e89f11eb2dfc63584a9c5660dfe367792a3d422a28dbf5a2e9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T09:21:43Z\\\",\\\"message\\\":\\\"2025-12-02T09:20:58+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3fddf671-788e-42f6-aaea-cc2b46cff00f\\\\n2025-12-02T09:20:58+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3fddf671-788e-42f6-aaea-cc2b46cff00f to /host/opt/cni/bin/\\\\n2025-12-02T09:20:58Z [verbose] multus-daemon started\\\\n2025-12-02T09:20:58Z [verbose] Readiness Indicator file check\\\\n2025-12-02T09:21:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxbk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8b6p8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:45Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.015514 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q792g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcdae8ff-3e82-4785-b958-a98717a14787\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vl4xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vl4xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:21:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q792g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:45Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.023156 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.023193 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.023202 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.023216 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.023225 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:45Z","lastTransitionTime":"2025-12-02T09:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.027794 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8157f9-9d69-41d0-8158-f5d166743724\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a36c467812da990b1678a8465bf25d97c4d8226e88ed27eea452b35a75c5dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c09074e0829703890129b7052956f56e1a93cad8d1bcf9e480b3a7277ae930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f99f962bf9ffb2475be15e4fe8d41c3fe3c870d092a50b96a19fe9c759462ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc9e488e47baf2860cd0133d661714b47495618ce410ee8ad0fa9292da3cfdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:45Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.039785 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:45Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.052212 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c464aa1e840c3a97b63b6c760fca6c21e7a182269a602a688c70525b4bbb39a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:45Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.062989 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a3a44e4812364947f0dd8eed3d3f7252d6e37e5057b9f180eca5b8afc472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9866ea1416094cbfe3f1e850261118f82b1d977589862e7f59e13b33fc24295a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:45Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.074640 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e10258da-dad3-4df8-82c2-9d9438493a3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb3264562e09d949d9dcd64e52ee1fc0825681adc0d91f25505fdefd0a807ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c19e02f947d0d75971d14d988b7c3119365b6ba5348290ee0e23e115e14780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pzntm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:45Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.084226 4781 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-ffk9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04fe11ee-efe6-4b10-a638-021e53367e2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6a45fdfe3335bc38eb30705eca6aaf887b5cd9fac235b5350832bff6df8cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46c6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ffk9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:45Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.104392 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa24d58e-64e7-4469-9611-3dab9c142594\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7af0caf9bd11c42c48bbf4dcd4e719840d2e2dbce55ec205dd2cc103b71772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7035e3038f33a299c00380bc79d4315ec612bd6f4009d55258f02b5c3ec70e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://274dd3f5c42e9217f2a35c99709567f617ef1def1cbd5110cfdc8b5e88bce7ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8beb17df1dddccb32b1d93f03fd20551048bc1
d150615e18bf73c1d364d7bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1936331547e3941bdaae5c90f12874b6fda39f3b6b67f678156915473cc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:45Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.115411 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaa19b0e-b6c3-4d56-b32d-a992eb6d3773\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5775ca067ce9fccaf09f228bfcf365fd53919bf4693f5dd7943183d1f14177b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c426b8000b17c761c86ae3152bb08084f63533d41a01fa039f84f0a74b89499e\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e25c4d1777f7fbfc56d52dac0ec8b209807c486a294c0e7062a49e58edb520b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ae274f3b640c3661df1109d91d4ea85e85568a686ec292397695e2a7b8703dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ae274f3b640c3661df1109d91d4ea85e85568a686ec292397695e2a7b8703dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:45Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.124830 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.124856 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.124865 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 
09:21:45.124877 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.124885 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:45Z","lastTransitionTime":"2025-12-02T09:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.226832 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.226879 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.226891 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.226907 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.226918 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:45Z","lastTransitionTime":"2025-12-02T09:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.329460 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.329513 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.329529 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.329547 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.329558 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:45Z","lastTransitionTime":"2025-12-02T09:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.431937 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.432000 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.432010 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.432023 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.432032 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:45Z","lastTransitionTime":"2025-12-02T09:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.498887 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:21:45 crc kubenswrapper[4781]: E1202 09:21:45.499159 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.534730 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.534782 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.534801 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.534846 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.534863 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:45Z","lastTransitionTime":"2025-12-02T09:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.636988 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.637043 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.637055 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.637075 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.637089 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:45Z","lastTransitionTime":"2025-12-02T09:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.741016 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.741057 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.741070 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.741085 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.741095 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:45Z","lastTransitionTime":"2025-12-02T09:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.843607 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.843656 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.843668 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.843686 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.843698 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:45Z","lastTransitionTime":"2025-12-02T09:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.946131 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.946165 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.946173 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.946188 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:45 crc kubenswrapper[4781]: I1202 09:21:45.946197 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:45Z","lastTransitionTime":"2025-12-02T09:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.048679 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.048727 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.048739 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.048756 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.048769 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:46Z","lastTransitionTime":"2025-12-02T09:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.151115 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.151162 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.151176 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.151194 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.151210 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:46Z","lastTransitionTime":"2025-12-02T09:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.253132 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.253168 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.253179 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.253194 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.253205 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:46Z","lastTransitionTime":"2025-12-02T09:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.355625 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.355657 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.355670 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.355685 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.355695 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:46Z","lastTransitionTime":"2025-12-02T09:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.458067 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.458112 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.458123 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.458140 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.458153 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:46Z","lastTransitionTime":"2025-12-02T09:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.498720 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.498820 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.498820 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:21:46 crc kubenswrapper[4781]: E1202 09:21:46.499007 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q792g" podUID="bcdae8ff-3e82-4785-b958-a98717a14787" Dec 02 09:21:46 crc kubenswrapper[4781]: E1202 09:21:46.499095 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 09:21:46 crc kubenswrapper[4781]: E1202 09:21:46.499196 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.560293 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.560319 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.560331 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.560345 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.560357 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:46Z","lastTransitionTime":"2025-12-02T09:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.662752 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.662786 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.662794 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.662806 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.662815 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:46Z","lastTransitionTime":"2025-12-02T09:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.766210 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.766270 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.766304 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.766336 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.766354 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:46Z","lastTransitionTime":"2025-12-02T09:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.868803 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.868865 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.868887 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.868914 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.868965 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:46Z","lastTransitionTime":"2025-12-02T09:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.972073 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.972140 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.972164 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.972194 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:46 crc kubenswrapper[4781]: I1202 09:21:46.972218 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:46Z","lastTransitionTime":"2025-12-02T09:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.074244 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.074284 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.074296 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.074313 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.074326 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:47Z","lastTransitionTime":"2025-12-02T09:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.176241 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.176273 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.176284 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.176299 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.176311 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:47Z","lastTransitionTime":"2025-12-02T09:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.278496 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.278544 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.278557 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.278574 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.278585 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:47Z","lastTransitionTime":"2025-12-02T09:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.381946 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.381988 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.381997 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.382013 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.382022 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:47Z","lastTransitionTime":"2025-12-02T09:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.484543 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.484592 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.484603 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.484620 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.484635 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:47Z","lastTransitionTime":"2025-12-02T09:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.498935 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:21:47 crc kubenswrapper[4781]: E1202 09:21:47.499074 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.510757 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaa19b0e-b6c3-4d56-b32d-a992eb6d3773\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5775ca067ce9fccaf09f228bfcf365fd53919bf4693f5dd7943183d1f14177b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c426b8000b17c761c86ae3152bb08084f63533d41a01fa039f84f0a74b89499e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e25c4d1777f7fbfc56d52dac0ec8b209807c486a294c0e7062a49e58edb520b\\\",\\\"image\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ae274f3b640c3661df1109d91d4ea85e85568a686ec292397695e2a7b8703dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ae274f3b640c3661df1109d91d4ea85e85568a686ec292397695e2a7b8703dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:47Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.525837 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c464aa1e840c3a97b63b6c760fca6c21e7a182269a602a688c70525b4bbb39a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:47Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.540218 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a3a44e4812364947f0dd8eed3d3f7252d6e37e5057b9f180eca5b8afc472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9866ea1416094cbfe3f1e850261118f82b1d977589862e7f59e13b33fc24295a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:47Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.555486 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e10258da-dad3-4df8-82c2-9d9438493a3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb3264562e09d949d9dcd64e52ee1fc0825681adc0d91f25505fdefd0a807ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c19e02f947d0d75971d14d988b7c3119365b6ba5348290ee0e23e115e14780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pzntm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:47Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.568846 4781 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-ffk9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04fe11ee-efe6-4b10-a638-021e53367e2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6a45fdfe3335bc38eb30705eca6aaf887b5cd9fac235b5350832bff6df8cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46c6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ffk9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:47Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.586355 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.586385 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.586426 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.586439 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.586447 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:47Z","lastTransitionTime":"2025-12-02T09:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.587456 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa24d58e-64e7-4469-9611-3dab9c142594\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7af0caf9bd11c42c48bbf4dcd4e719840d2e2dbce55ec205dd2cc103b71772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7035e3038f33a299c00380bc79d4315ec612bd6f4009d55258f02b5c3ec70e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://274dd3f5c42e9217f2a35c99709567f617ef1def1cbd5110cfdc8b5e88bce7ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8beb17df1dddccb32b1d93f03fd20551048bc1d150615e18bf73c1d364d7bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1936331547e3941bdaae5c90f12874b6fda39f3b6b67f678156915473cc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:47Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.599281 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kgbn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9440f972-ed59-4852-a180-3d5a2111f966\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eda6361552bc105ca2a6cc3444d96fbf79b10eff612e87aa22c1559fa8c90fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hxs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kgbn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:47Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.620038 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://506fd3b326fdaebe64874c208bb15c9cbf5aff36602f221f27f87fe0214ce6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c8c9c60e0354d49255e6b14da6808152f6528f0a51ce992709a7014b9f64d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5c85414a97917ad2db6270144c476b16b50ad3256bf6c694c228b8756c379dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6086fa7a75ccf19fcc7b89e83a2dbe12ed15c729bb8a4209c18acd54c2070a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865e78637e3692046c9c8a24f5238eb9636bc7e388cea6e0fec3c5c68d21cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://458448bab8d0e21295e859a857d3a3273cc4f430f2fc0fdfb3e10b9a9ab813ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16ede9db49d80344fd2b60c39bf00bcf9dbeabd8918656a55230925c6a6cb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16ede9db49d80344fd2b60c39bf00bcf9dbeabd8918656a55230925c6a6cb60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T09:21:22Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 09:21:22.385723 6456 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 09:21:22.385745 6456 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 09:21:22.385782 6456 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 09:21:22.385800 6456 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1202 09:21:22.385814 6456 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 09:21:22.385845 6456 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 09:21:22.385855 6456 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 09:21:22.385863 6456 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 09:21:22.385868 6456 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1202 09:21:22.385869 6456 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 09:21:22.385878 6456 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 09:21:22.385879 6456 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 09:21:22.385889 6456 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 09:21:22.385894 6456 factory.go:656] Stopping watch factory\\\\nI1202 09:21:22.385908 6456 ovnkube.go:599] Stopped ovnkube\\\\nI1202 09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-x5x7g_openshift-ovn-kubernetes(20ba2af9-1f67-4b6d-884a-666ef4f55bf3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cbd533b4320d178a3ee1bb3d666d5a1ad0a1d7e9f38e58735acac86c590726\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5x7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:47Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.630358 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k2h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be967da7-3e7f-47e6-9d54-408ae99531a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b30e09e5a7f3200cca709bf3437af0ce93cf039451c894bdfd813ae35d3c70c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzmcq
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://735c6d3ed169d7e11f8f6cc0d99fc415020928f852a554288c6c05f3dc99bb3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzmcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:21:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7k2h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:47Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.649472 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba607959-82d4-440b-b830-95eb9584db6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08c7d87bc390c31adc6d7412985f5bbfe535ea6dccb586f9be2823a17bce3e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0090e6856a6d66e284f4f4eecb2d193720318f33a0260d470a50af72c84f6719\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0179b61af209d72b1e1a79835f2819b3af773ab99b0c431733beba8a4dcf2eeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5356a78651fc6cb4214e10b8b25fd20d7f6acb739abdcf2e1de6dd666a31edaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 09:20:56.232387 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 09:20:56.232507 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 09:20:56.233123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3642890392/tls.crt::/tmp/serving-cert-3642890392/tls.key\\\\\\\"\\\\nI1202 09:20:56.583994 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 09:20:56.586003 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 09:20:56.586021 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 09:20:56.586041 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 09:20:56.586046 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 09:20:56.590021 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 09:20:56.590050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590054 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 09:20:56.590065 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 09:20:56.590068 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 09:20:56.590073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 09:20:56.590248 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 09:20:56.592236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a99d096eb7cf9386a857100ae0faff8a3b17140eaada581e5511a88aadc0f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:47Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.664278 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:47Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.676575 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:47Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.689115 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.689159 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.689170 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.689187 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.689199 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:47Z","lastTransitionTime":"2025-12-02T09:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.689890 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:47Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.701899 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822d2ca8c4fb83c643f704e9acc8e4034235d2d1e4730a9495aa0a6a8df9979e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:47Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.715470 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dnkgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aa44429-873b-48f6-bc13-55745827d8fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1586e0f941724ff8afcd4030d43ebb517e0782ece160fe634ed828af68b1965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dnkgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:47Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.730493 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8b6p8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a029cea19e689697b3e833a0bd6c745475464429e227868f4cabed7c7fe3ec71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bc3d340494b10e89f11eb2dfc63584a9c5660dfe367792a3d422a28dbf5a2e9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T09:21:43Z\\\",\\\"message\\\":\\\"2025-12-02T09:20:58+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3fddf671-788e-42f6-aaea-cc2b46cff00f\\\\n2025-12-02T09:20:58+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3fddf671-788e-42f6-aaea-cc2b46cff00f to /host/opt/cni/bin/\\\\n2025-12-02T09:20:58Z [verbose] multus-daemon started\\\\n2025-12-02T09:20:58Z [verbose] Readiness Indicator file check\\\\n2025-12-02T09:21:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxbk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8b6p8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:47Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.740213 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q792g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcdae8ff-3e82-4785-b958-a98717a14787\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vl4xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vl4xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:21:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q792g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:47Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.755854 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8157f9-9d69-41d0-8158-f5d166743724\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a36c467812da990b1678a8465bf25d97c4d8226e88ed27eea452b35a75c5dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c09074e0829703890129b7052956f56e1a93cad8d1bcf9e480b3a7277ae930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f99f962bf9ffb2475be15e4fe8d41c3fe3c870d092a50b96a19fe9c759462ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc9e488e47baf2860cd0133d661714b47495618ce410ee8ad0fa9292da3cfdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:47Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.791493 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.791527 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.791539 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.791554 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.791565 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:47Z","lastTransitionTime":"2025-12-02T09:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.893276 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.893498 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.893587 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.893657 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.893741 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:47Z","lastTransitionTime":"2025-12-02T09:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.997534 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.997595 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.997612 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.997636 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:47 crc kubenswrapper[4781]: I1202 09:21:47.997653 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:47Z","lastTransitionTime":"2025-12-02T09:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:48 crc kubenswrapper[4781]: I1202 09:21:48.099825 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:48 crc kubenswrapper[4781]: I1202 09:21:48.099913 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:48 crc kubenswrapper[4781]: I1202 09:21:48.100295 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:48 crc kubenswrapper[4781]: I1202 09:21:48.100696 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:48 crc kubenswrapper[4781]: I1202 09:21:48.100763 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:48Z","lastTransitionTime":"2025-12-02T09:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:48 crc kubenswrapper[4781]: I1202 09:21:48.203740 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:48 crc kubenswrapper[4781]: I1202 09:21:48.203775 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:48 crc kubenswrapper[4781]: I1202 09:21:48.203784 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:48 crc kubenswrapper[4781]: I1202 09:21:48.203797 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:48 crc kubenswrapper[4781]: I1202 09:21:48.203805 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:48Z","lastTransitionTime":"2025-12-02T09:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:48 crc kubenswrapper[4781]: I1202 09:21:48.306519 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:48 crc kubenswrapper[4781]: I1202 09:21:48.306555 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:48 crc kubenswrapper[4781]: I1202 09:21:48.306564 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:48 crc kubenswrapper[4781]: I1202 09:21:48.306578 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:48 crc kubenswrapper[4781]: I1202 09:21:48.306586 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:48Z","lastTransitionTime":"2025-12-02T09:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:48 crc kubenswrapper[4781]: I1202 09:21:48.408698 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:48 crc kubenswrapper[4781]: I1202 09:21:48.408736 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:48 crc kubenswrapper[4781]: I1202 09:21:48.408744 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:48 crc kubenswrapper[4781]: I1202 09:21:48.408759 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:48 crc kubenswrapper[4781]: I1202 09:21:48.408769 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:48Z","lastTransitionTime":"2025-12-02T09:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:48 crc kubenswrapper[4781]: I1202 09:21:48.499352 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 09:21:48 crc kubenswrapper[4781]: I1202 09:21:48.499352 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:21:48 crc kubenswrapper[4781]: E1202 09:21:48.499701 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 09:21:48 crc kubenswrapper[4781]: E1202 09:21:48.499820 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 09:21:48 crc kubenswrapper[4781]: I1202 09:21:48.499377 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:21:48 crc kubenswrapper[4781]: E1202 09:21:48.500134 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q792g" podUID="bcdae8ff-3e82-4785-b958-a98717a14787" Dec 02 09:21:48 crc kubenswrapper[4781]: I1202 09:21:48.511190 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:48 crc kubenswrapper[4781]: I1202 09:21:48.511230 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:48 crc kubenswrapper[4781]: I1202 09:21:48.511241 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:48 crc kubenswrapper[4781]: I1202 09:21:48.511259 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:48 crc kubenswrapper[4781]: I1202 09:21:48.511271 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:48Z","lastTransitionTime":"2025-12-02T09:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:48 crc kubenswrapper[4781]: I1202 09:21:48.613960 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:48 crc kubenswrapper[4781]: I1202 09:21:48.614020 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:48 crc kubenswrapper[4781]: I1202 09:21:48.614040 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:48 crc kubenswrapper[4781]: I1202 09:21:48.614070 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:48 crc kubenswrapper[4781]: I1202 09:21:48.614090 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:48Z","lastTransitionTime":"2025-12-02T09:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:48 crc kubenswrapper[4781]: I1202 09:21:48.717390 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:48 crc kubenswrapper[4781]: I1202 09:21:48.717463 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:48 crc kubenswrapper[4781]: I1202 09:21:48.717487 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:48 crc kubenswrapper[4781]: I1202 09:21:48.717514 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:48 crc kubenswrapper[4781]: I1202 09:21:48.717535 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:48Z","lastTransitionTime":"2025-12-02T09:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:48 crc kubenswrapper[4781]: I1202 09:21:48.820575 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:48 crc kubenswrapper[4781]: I1202 09:21:48.820617 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:48 crc kubenswrapper[4781]: I1202 09:21:48.820625 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:48 crc kubenswrapper[4781]: I1202 09:21:48.820643 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:48 crc kubenswrapper[4781]: I1202 09:21:48.820654 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:48Z","lastTransitionTime":"2025-12-02T09:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:48 crc kubenswrapper[4781]: I1202 09:21:48.923189 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:48 crc kubenswrapper[4781]: I1202 09:21:48.923491 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:48 crc kubenswrapper[4781]: I1202 09:21:48.923579 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:48 crc kubenswrapper[4781]: I1202 09:21:48.923663 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:48 crc kubenswrapper[4781]: I1202 09:21:48.923749 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:48Z","lastTransitionTime":"2025-12-02T09:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.026359 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.026470 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.026489 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.026512 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.026529 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:49Z","lastTransitionTime":"2025-12-02T09:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.130207 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.130280 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.130300 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.130324 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.130349 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:49Z","lastTransitionTime":"2025-12-02T09:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.233021 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.233054 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.233065 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.233084 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.233095 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:49Z","lastTransitionTime":"2025-12-02T09:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.336473 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.336518 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.336526 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.336540 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.336550 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:49Z","lastTransitionTime":"2025-12-02T09:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.438632 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.438667 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.438679 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.438695 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.438708 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:49Z","lastTransitionTime":"2025-12-02T09:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.499673 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:21:49 crc kubenswrapper[4781]: E1202 09:21:49.499902 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.540682 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.540710 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.540718 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.540737 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.540745 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:49Z","lastTransitionTime":"2025-12-02T09:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.643034 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.643088 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.643105 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.643127 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.643143 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:49Z","lastTransitionTime":"2025-12-02T09:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.745018 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.745061 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.745073 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.745089 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.745101 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:49Z","lastTransitionTime":"2025-12-02T09:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.847369 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.847409 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.847431 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.847465 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.847476 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:49Z","lastTransitionTime":"2025-12-02T09:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.949904 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.950378 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.950584 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.950811 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:49 crc kubenswrapper[4781]: I1202 09:21:49.951073 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:49Z","lastTransitionTime":"2025-12-02T09:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.053681 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.053982 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.054082 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.054192 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.054276 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:50Z","lastTransitionTime":"2025-12-02T09:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.156881 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.156962 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.156983 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.157006 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.157026 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:50Z","lastTransitionTime":"2025-12-02T09:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.259684 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.259710 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.259718 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.259749 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.259759 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:50Z","lastTransitionTime":"2025-12-02T09:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.293550 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.293594 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.293610 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.293629 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.293644 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:50Z","lastTransitionTime":"2025-12-02T09:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:50 crc kubenswrapper[4781]: E1202 09:21:50.313150 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a8aab02d-934e-4a61-8d03-a223ac62150b\\\",\\\"systemUUID\\\":\\\"c8f9d245-0a2e-447a-a09e-ff80f79ba02f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:50Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.317522 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.317570 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.317586 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.317607 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.317623 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:50Z","lastTransitionTime":"2025-12-02T09:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:50 crc kubenswrapper[4781]: E1202 09:21:50.332623 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a8aab02d-934e-4a61-8d03-a223ac62150b\\\",\\\"systemUUID\\\":\\\"c8f9d245-0a2e-447a-a09e-ff80f79ba02f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:50Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.336259 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.336278 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
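[Annotation] The err string embeds, behind the log-quoting escapes, the JSON strategic merge patch the kubelet sends to the node's status subresource: a $setElementOrder/conditions directive that pins the order of the conditions list (entries are matched by their "type" merge key), followed by allocatable and capacity, the four refreshed conditions, the node's cached image list, and nodeInfo. A sketch of that shape, with values copied from the log and everything else elided — it only illustrates the JSON layout, not the kubelet's own patch construction:

```go
// patchshape.go - a sketch of the strategic-merge-patch shape embedded in the
// err strings above (simplified; the image list and most fields are elided).
package main

import (
	"encoding/json"
	"fmt"
)

func main() {
	patch := map[string]any{
		"status": map[string]any{
			// Strategic-merge-patch directive: keeps the conditions list in this
			// order, matching entries by their "type" merge key.
			"$setElementOrder/conditions": []map[string]string{
				{"type": "MemoryPressure"}, {"type": "DiskPressure"},
				{"type": "PIDPressure"}, {"type": "Ready"},
			},
			"conditions": []map[string]string{{
				"type":               "Ready",
				"status":             "False",
				"reason":             "KubeletNotReady",
				"lastHeartbeatTime":  "2025-12-02T09:21:50Z",
				"lastTransitionTime": "2025-12-02T09:21:50Z",
			}},
		},
	}
	out, err := json.MarshalIndent(patch, "", "  ")
	if err != nil {
		panic(err)
	}
	fmt.Println(string(out))
}
```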
event="NodeHasNoDiskPressure" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.336286 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.336297 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.336323 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:50Z","lastTransitionTime":"2025-12-02T09:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:50 crc kubenswrapper[4781]: E1202 09:21:50.347311 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a8aab02d-934e-4a61-8d03-a223ac62150b\\\",\\\"systemUUID\\\":\\\"c8f9d245-0a2e-447a-a09e-ff80f79ba02f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:50Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.351147 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.351174 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.351186 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.351202 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.351213 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:50Z","lastTransitionTime":"2025-12-02T09:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:50 crc kubenswrapper[4781]: E1202 09:21:50.372943 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a8aab02d-934e-4a61-8d03-a223ac62150b\\\",\\\"systemUUID\\\":\\\"c8f9d245-0a2e-447a-a09e-ff80f79ba02f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:50Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.378775 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.378821 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.378833 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.378851 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.378863 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:50Z","lastTransitionTime":"2025-12-02T09:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:50 crc kubenswrapper[4781]: E1202 09:21:50.411005 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:21:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a8aab02d-934e-4a61-8d03-a223ac62150b\\\",\\\"systemUUID\\\":\\\"c8f9d245-0a2e-447a-a09e-ff80f79ba02f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:50Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:50 crc kubenswrapper[4781]: E1202 09:21:50.411186 4781 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.412867 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
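[Annotation] The timestamps show the patch attempts running back to back (09:21:50.332, .347, .372 and .411, following the attempt whose tail opens this excerpt) until kubelet_node_status.go:572 gives up with "update node status exceeds retry count". Upstream kubelet bounds this loop with a small constant, nodeStatusUpdateRetry (5 at the time of writing), and does not sleep between attempts, which matches the millisecond spacing here. A simplified sketch of that bounded-retry pattern, assuming a tryUpdateNodeStatus-style callback; the real code also re-fetches the node object between attempts:

```go
// retryloop.go - a simplified sketch of the bounded retry seen above, modeled
// on kubelet's updateNodeStatus; only the shape is kept.
package main

import (
	"errors"
	"fmt"
)

// nodeStatusUpdateRetry mirrors the upstream constant (5), matching the five
// failed patch attempts logged within roughly 100ms of each other.
const nodeStatusUpdateRetry = 5

func updateNodeStatus(try func(attempt int) error) error {
	for i := 0; i < nodeStatusUpdateRetry; i++ {
		if err := try(i); err == nil {
			return nil
		}
		// No backoff between attempts, which is why the errors in the log
		// are only tens of milliseconds apart.
	}
	return errors.New("update node status exceeds retry count")
}

func main() {
	webhookErr := errors.New(`failed calling webhook "node.network-node-identity.openshift.io": certificate has expired`)
	err := updateNodeStatus(func(attempt int) error {
		fmt.Printf("attempt %d: %v\n", attempt+1, webhookErr)
		return webhookErr // every attempt fails identically while the cert is expired
	})
	fmt.Println(err)
}
```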
event="NodeHasSufficientMemory" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.412910 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.412956 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.412974 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.412990 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:50Z","lastTransitionTime":"2025-12-02T09:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.498891 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.498951 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.498989 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:21:50 crc kubenswrapper[4781]: E1202 09:21:50.499043 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 09:21:50 crc kubenswrapper[4781]: E1202 09:21:50.499295 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q792g" podUID="bcdae8ff-3e82-4785-b958-a98717a14787" Dec 02 09:21:50 crc kubenswrapper[4781]: E1202 09:21:50.499829 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.500431 4781 scope.go:117] "RemoveContainer" containerID="c16ede9db49d80344fd2b60c39bf00bcf9dbeabd8918656a55230925c6a6cb60" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.515615 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.515685 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.515704 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.515740 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.515759 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:50Z","lastTransitionTime":"2025-12-02T09:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.618775 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.618813 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.618821 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.618837 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.618847 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:50Z","lastTransitionTime":"2025-12-02T09:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.721265 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.721324 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.721342 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.721365 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.721383 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:50Z","lastTransitionTime":"2025-12-02T09:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.824202 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.824231 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.824239 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.824252 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.824260 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:50Z","lastTransitionTime":"2025-12-02T09:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.927320 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.927368 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.927383 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.927405 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:50 crc kubenswrapper[4781]: I1202 09:21:50.927420 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:50Z","lastTransitionTime":"2025-12-02T09:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.030451 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.030484 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.030501 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.030521 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.030536 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:51Z","lastTransitionTime":"2025-12-02T09:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.132937 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.132982 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.132995 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.133012 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.133026 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:51Z","lastTransitionTime":"2025-12-02T09:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.235041 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.235080 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.235089 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.235102 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.235112 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:51Z","lastTransitionTime":"2025-12-02T09:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.337588 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.337623 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.337632 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.337646 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.337656 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:51Z","lastTransitionTime":"2025-12-02T09:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.440429 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.440473 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.440484 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.440501 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.440513 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:51Z","lastTransitionTime":"2025-12-02T09:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.498876 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:21:51 crc kubenswrapper[4781]: E1202 09:21:51.499031 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.542248 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.542282 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.542291 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.542305 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.542313 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:51Z","lastTransitionTime":"2025-12-02T09:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.643994 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.644028 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.644036 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.644048 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.644057 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:51Z","lastTransitionTime":"2025-12-02T09:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.746519 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.746555 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.746564 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.746578 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.746587 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:51Z","lastTransitionTime":"2025-12-02T09:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.849584 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.849635 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.849650 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.849672 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.849688 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:51Z","lastTransitionTime":"2025-12-02T09:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.883441 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5x7g_20ba2af9-1f67-4b6d-884a-666ef4f55bf3/ovnkube-controller/2.log" Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.887603 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" event={"ID":"20ba2af9-1f67-4b6d-884a-666ef4f55bf3","Type":"ContainerStarted","Data":"355b92d44b96773d00b0e645123bf2086d26aa043b6aef98a6c90d25c7b52e3d"} Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.888260 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.921395 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://506fd3b326fdaebe64874c208bb15c9cbf5aff36602f221f27f87fe0214ce6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c8c9c60e0354d49255e6b14da6808152f6528f0a51ce992709a7014b9f64d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5c85414a97917ad2db6270144c476b16b50ad3256bf6c694c228b8756c379dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6086fa7a75ccf19fcc7b89e83a2dbe12ed15c729bb8a4209c18acd54c2070a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865e78637e3692046c9c8a24f5238eb9636bc7e388cea6e0fec3c5c68d21cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://458448bab8d0e21295e859a857d3a3273cc4f430f2fc0fdfb3e10b9a9ab813ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://355b92d44b96773d00b0e645123bf2086d26aa04
3b6aef98a6c90d25c7b52e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16ede9db49d80344fd2b60c39bf00bcf9dbeabd8918656a55230925c6a6cb60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T09:21:22Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 09:21:22.385723 6456 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 09:21:22.385745 6456 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 09:21:22.385782 6456 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 09:21:22.385800 6456 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1202 09:21:22.385814 6456 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 09:21:22.385845 6456 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 09:21:22.385855 6456 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 09:21:22.385863 6456 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 09:21:22.385868 6456 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1202 09:21:22.385869 6456 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 09:21:22.385878 6456 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 09:21:22.385879 6456 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 09:21:22.385889 6456 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 09:21:22.385894 6456 factory.go:656] Stopping watch factory\\\\nI1202 09:21:22.385908 6456 ovnkube.go:599] Stopped ovnkube\\\\nI1202 
09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cbd533b4320d178a3ee1bb3d666d5a1ad0a1d7e9f38e58735acac86c590726\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5x7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:51Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.936584 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k2h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be967da7-3e7f-47e6-9d54-408ae99531a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b30e09e5a7f3200cca709bf3437af0ce93cf039451c894bdfd813ae35d3c70c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzmcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://735c6d3ed169d7e11f8f6cc0d99fc415020928f852a554288c6c05f3dc99bb3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzmcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:21:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7k2h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:51Z is after 2025-08-24T17:21:41Z" Dec 02 
09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.952352 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.952414 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.952425 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.952461 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.952474 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:51Z","lastTransitionTime":"2025-12-02T09:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.956202 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba607959-82d4-440b-b830-95eb9584db6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08c7d87bc390c31adc6d7412985f5bbfe535ea6dccb586f9be2823a17bce3e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0090e6856a6d66e284f4f4eecb2d193720318f33a0260d470a50af72c84f6719\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d
7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0179b61af209d72b1e1a79835f2819b3af773ab99b0c431733beba8a4dcf2eeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5356a78651fc6cb4214e10b8b25fd20d7f6acb739abdcf2e1de6dd666a31edaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 09:20:56.232387 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 09:20:56.232507 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 09:20:56.233123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3642890392/tls.crt::/tmp/serving-cert-3642890392/tls.key\\\\\\\"\\\\nI1202 09:20:56.583994 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 09:20:56.586003 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 09:20:56.586021 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 09:20:56.586041 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 09:20:56.586046 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 09:20:56.590021 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 09:20:56.590050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590054 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 09:20:56.590065 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 09:20:56.590068 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 09:20:56.590073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 09:20:56.590248 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 09:20:56.592236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a99d096eb7cf9386a857100ae0faff8a3b17140eaada581e5511a88aadc0f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:51Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.966842 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kgbn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9440f972-ed59-4852-a180-3d5a2111f966\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eda6361552bc105ca2a6cc3444d96fbf79b10eff612e87aa22c1559fa8c90fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hxs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kgbn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:51Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.978357 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:51Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:51 crc kubenswrapper[4781]: I1202 09:21:51.993532 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:51Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.011475 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822d2ca8c4fb83c643f704e9acc8e4034235d2d1e4730a9495aa0a6a8df9979e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:52Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.026500 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dnkgc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aa44429-873b-48f6-bc13-55745827d8fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1586e0f941724ff8afcd4030d43ebb517e0782ece160fe634ed828af68b1965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:68
7fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"m
ountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dnkgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:52Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.041194 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8b6p8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a029cea19e689697b3e833a0bd6c745475464429e227868f4cabed7c7fe3ec71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bc3d340494b10e89f11eb2dfc63584a9c5660dfe367792a3d422a28dbf5a2e9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T09:21:43Z\\\",\\\"message\\\":\\\"2025-12-02T09:20:58+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3fddf671-788e-42f6-aaea-cc2b46cff00f\\\\n2025-12-02T09:20:58+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3fddf671-788e-42f6-aaea-cc2b46cff00f to /host/opt/cni/bin/\\\\n2025-12-02T09:20:58Z [verbose] multus-daemon started\\\\n2025-12-02T09:20:58Z [verbose] Readiness Indicator file check\\\\n2025-12-02T09:21:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxbk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8b6p8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:52Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.054332 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q792g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcdae8ff-3e82-4785-b958-a98717a14787\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vl4xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vl4xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:21:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q792g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:52Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.055256 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.055289 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.055320 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.055339 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.055350 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:52Z","lastTransitionTime":"2025-12-02T09:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.071066 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8157f9-9d69-41d0-8158-f5d166743724\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a36c467812da990b1678a8465bf25d97c4d8226e88ed27eea452b35a75c5dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c09074e0829703890129b7052956f56e1a93cad8d1bcf9e480b3a7277ae930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f99f962bf9ffb2475be15e4fe8d41c3fe3c870d092a50b96a19fe9c759462ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc9e488e47baf2860cd0133d661714b47495618ce410ee8ad0fa9292da3cfdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:52Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.086304 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:52Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.099011 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c464aa1e840c3a97b63b6c760fca6c21e7a182269a602a688c70525b4bbb39a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:52Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.114537 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a3a44e4812364947f0dd8eed3d3f7252d6e37e5057b9f180eca5b8afc472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9866ea1416094cbfe3f1e850261118f82b1d977589862e7f59e13b33fc24295a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:52Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.125348 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e10258da-dad3-4df8-82c2-9d9438493a3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb3264562e09d949d9dcd64e52ee1fc0825681adc0d91f25505fdefd0a807ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c19e02f947d0d75971d14d988b7c3119365b6ba5348290ee0e23e115e14780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pzntm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:52Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.136978 4781 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-ffk9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04fe11ee-efe6-4b10-a638-021e53367e2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6a45fdfe3335bc38eb30705eca6aaf887b5cd9fac235b5350832bff6df8cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46c6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ffk9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:52Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.157546 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa24d58e-64e7-4469-9611-3dab9c142594\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7af0caf9bd11c42c48bbf4dcd4e719840d2e2dbce55ec205dd2cc103b71772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7035e3038f33a299c00380bc79d4315ec612bd6f4009d55258f02b5c3ec70e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://274dd3f5c42e9217f2a35c99709567f617ef1def1cbd5110cfdc8b5e88bce7ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8beb17df1dddccb32b1d93f03fd20551048bc1
d150615e18bf73c1d364d7bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1936331547e3941bdaae5c90f12874b6fda39f3b6b67f678156915473cc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:52Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.159415 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.159456 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.159466 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.159482 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.159494 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:52Z","lastTransitionTime":"2025-12-02T09:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.178440 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaa19b0e-b6c3-4d56-b32d-a992eb6d3773\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5775ca067ce9fccaf09f228bfcf365fd53919bf4693f5dd7943183d1f14177b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c426b8000b17c761c86ae3152bb08084f63533d41a01fa039f84f0a74b89499e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e25c4d1777f7fbfc56d52dac0ec8b209807c486a294c0e7062a49e58edb520b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ae274f3b640c3661df1109d91d4ea85e85568a686ec292397695e2a7b8703dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ae274f3b640c3661df1109d91d4ea85e85568a686ec292397695e2a7b8703dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:52Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.262604 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.262651 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.262664 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.262681 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.262693 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:52Z","lastTransitionTime":"2025-12-02T09:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.364740 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.364779 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.364790 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.364806 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.364817 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:52Z","lastTransitionTime":"2025-12-02T09:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.467704 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.467761 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.467783 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.467805 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.467819 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:52Z","lastTransitionTime":"2025-12-02T09:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.499582 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.499612 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.499642 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g"
Dec 02 09:21:52 crc kubenswrapper[4781]: E1202 09:21:52.499754 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 02 09:21:52 crc kubenswrapper[4781]: E1202 09:21:52.499878 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 02 09:21:52 crc kubenswrapper[4781]: E1202 09:21:52.500041 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q792g" podUID="bcdae8ff-3e82-4785-b958-a98717a14787"
Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.571048 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.571091 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.571102 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.571116 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.571125 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:52Z","lastTransitionTime":"2025-12-02T09:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.673700 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.673762 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.673779 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.673801 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.673818 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:52Z","lastTransitionTime":"2025-12-02T09:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.777435 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.777509 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.777530 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.777557 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.777579 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:52Z","lastTransitionTime":"2025-12-02T09:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.880118 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.880154 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.880163 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.880175 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.880186 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:52Z","lastTransitionTime":"2025-12-02T09:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.893609 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5x7g_20ba2af9-1f67-4b6d-884a-666ef4f55bf3/ovnkube-controller/3.log"
Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.894259 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5x7g_20ba2af9-1f67-4b6d-884a-666ef4f55bf3/ovnkube-controller/2.log"
Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.898519 4781 generic.go:334] "Generic (PLEG): container finished" podID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerID="355b92d44b96773d00b0e645123bf2086d26aa043b6aef98a6c90d25c7b52e3d" exitCode=1
Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.898548 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" event={"ID":"20ba2af9-1f67-4b6d-884a-666ef4f55bf3","Type":"ContainerDied","Data":"355b92d44b96773d00b0e645123bf2086d26aa043b6aef98a6c90d25c7b52e3d"}
Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.898604 4781 scope.go:117] "RemoveContainer" containerID="c16ede9db49d80344fd2b60c39bf00bcf9dbeabd8918656a55230925c6a6cb60"
Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.900176 4781 scope.go:117] "RemoveContainer" containerID="355b92d44b96773d00b0e645123bf2086d26aa043b6aef98a6c90d25c7b52e3d"
Dec 02 09:21:52 crc kubenswrapper[4781]: E1202 09:21:52.900587 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-x5x7g_openshift-ovn-kubernetes(20ba2af9-1f67-4b6d-884a-666ef4f55bf3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3"
Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.924839 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba607959-82d4-440b-b830-95eb9584db6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08c7d87bc390c31adc6d7412985f5bbfe535ea6dccb586f9be2823a17bce3e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0090e6856a6d66e284f4f4eecb2d193720318f33a0260d470a50af72c84f6719\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0179b61af209d72b1e1a79835f2819b3af773ab99b0c431733beba8a4dcf2eeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5356a78651fc6cb4214e10b8b25fd20d7f6acb739abdcf2e1de6dd666a31edaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 09:20:56.232387 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 09:20:56.232507 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 09:20:56.233123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3642890392/tls.crt::/tmp/serving-cert-3642890392/tls.key\\\\\\\"\\\\nI1202 09:20:56.583994 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 09:20:56.586003 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 09:20:56.586021 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 09:20:56.586041 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 09:20:56.586046 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 09:20:56.590021 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 09:20:56.590050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590054 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 09:20:56.590065 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 09:20:56.590068 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 09:20:56.590073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 09:20:56.590248 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 09:20:56.592236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a99d096eb7cf9386a857100ae0faff8a3b17140eaada581e5511a88aadc0f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:52Z is after 2025-08-24T17:21:41Z"
Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.938590 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kgbn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9440f972-ed59-4852-a180-3d5a2111f966\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eda6361552bc105ca2a6cc3444d96fbf79b10eff612e87aa22c1559fa8c90fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hxs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kgbn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:52Z is after 2025-08-24T17:21:41Z"
Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.960226 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://506fd3b326fdaebe64874c208bb15c9cbf5aff36602f221f27f87fe0214ce6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c8c9c60e0354d49255e6b14da6808152f6528f0a51ce992709a7014b9f64d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5c85414a97917ad2db6270144c476b16b50ad3256bf6c694c228b8756c379dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6086fa7a75ccf19fcc7b89e83a2dbe12ed15c729bb8a4209c18acd54c2070a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865e78637e3692046c9c8a24f5238eb9636bc7e388cea6e0fec3c5c68d21cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://458448bab8d0e21295e859a857d3a3273cc4f430f2fc0fdfb3e10b9a9ab813ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://355b92d44b96773d00b0e645123bf2086d26aa043b6aef98a6c90d25c7b52e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16ede9db49d80344fd2b60c39bf00bcf9dbeabd8918656a55230925c6a6cb60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T09:21:22Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 09:21:22.385723 6456 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 09:21:22.385745 6456 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 09:21:22.385782 6456 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 09:21:22.385800 6456 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1202 09:21:22.385814 6456 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 09:21:22.385845 6456 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 09:21:22.385855 6456 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 09:21:22.385863 6456 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 09:21:22.385868 6456 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1202 09:21:22.385869 6456 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 09:21:22.385878 6456 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 09:21:22.385879 6456 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 09:21:22.385889 6456 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 09:21:22.385894 6456 factory.go:656] Stopping watch factory\\\\nI1202 09:21:22.385908 6456 ovnkube.go:599] Stopped ovnkube\\\\nI1202 09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://355b92d44b96773d00b0e645123bf2086d26aa043b6aef98a6c90d25c7b52e3d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T09:21:51Z\\\",\\\"message\\\":\\\"tart default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:51Z is after 2025-08-24T17:21:41Z]\\\\nI1202 09:21:51.613500 6815 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1202 09:21:51.613511 6815 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI1202 09:21:51.613473 6815 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI1202 09:21:51.613517 6815 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1202 09:21:51.613522 6815 default_network_controller.go:776] Recording success event o\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cbd533b4320d178a3ee1bb3d666d5a1ad0a1d7e9f38e58735acac86c590726\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5x7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:52Z is after 2025-08-24T17:21:41Z"
Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.973562 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k2h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be967da7-3e7f-47e6-9d54-408ae99531a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b30e09e5a7f3200cca709bf3437af0ce93cf039451c894bdfd813ae35d3c70c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzmcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://735c6d3ed169d7e11f8f6cc0d99fc415020928f852a554288c6c05f3dc99bb3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzmcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:21:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7k2h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:52Z is after 2025-08-24T17:21:41Z"
Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.984097 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.984136 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.984145 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.984158 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.984168 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:52Z","lastTransitionTime":"2025-12-02T09:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:52 crc kubenswrapper[4781]: I1202 09:21:52.989591 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:52Z is after 2025-08-24T17:21:41Z"
Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.009793 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:53Z is after 2025-08-24T17:21:41Z"
Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.032626 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8157f9-9d69-41d0-8158-f5d166743724\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a36c467812da990b1678a8465bf25d97c4d8226e88ed27eea452b35a75c5dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c09074e0829703890129b7052956f56e1a93cad8d1bcf9e480b3a7277ae930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f99f962bf9ffb2475be15e4fe8d41c3fe3c870d092a50b96a19fe9c759462ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc9e488e47baf2860cd0133d661714b47495618ce410ee8ad0fa9292da3cfdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:53Z is after 2025-08-24T17:21:41Z"
Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.055341 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:53Z is after 2025-08-24T17:21:41Z"
Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.072262 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822d2ca8c4fb83c643f704e9acc8e4034235d2d1e4730a9495aa0a6a8df9979e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:53Z is after 2025-08-24T17:21:41Z"
Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.086210 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.086231 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.086239 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.086251 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.086260 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:53Z","lastTransitionTime":"2025-12-02T09:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.093849 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dnkgc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aa44429-873b-48f6-bc13-55745827d8fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1586e0f941724ff8afcd4030d43ebb517e0782ece160fe634ed828af68b1965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dnkgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:53Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.115412 4781 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-8b6p8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a029cea19e689697b3e833a0bd6c745475464429e227868f4cabed7c7fe3ec71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bc3d340494b10e89f11eb2dfc63584a9c5660dfe367792a3d422a28dbf5a2e9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T09:21:43Z\\\",\\\"message\\\":\\\"2025-12-02T09:20:58+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3fddf671-788e-42f6-aaea-cc2b46cff00f\\\\n2025-12-02T09:20:58+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3fddf671-788e-42f6-aaea-cc2b46cff00f to /host/opt/cni/bin/\\\\n2025-12-02T09:20:58Z [verbose] multus-daemon started\\\\n2025-12-02T09:20:58Z [verbose] Readiness Indicator file check\\\\n2025-12-02T09:21:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxbk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8b6p8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:53Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.129712 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q792g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcdae8ff-3e82-4785-b958-a98717a14787\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vl4xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vl4xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:21:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q792g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:53Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.142250 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ffk9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04fe11ee-efe6-4b10-a638-021e53367e2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6a45fdfe3335bc38eb30705eca6aaf887b5cd9fac235b5350832bff6df8cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46c6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ffk9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:53Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.164520 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa24d58e-64e7-4469-9611-3dab9c142594\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7af0caf9bd11c42c48bbf4dcd4e719840d2e2dbce55ec205dd2cc103b71772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7035e3038f33a299c00380bc79d4315ec612bd6f4009d55258f02b5c3ec70e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://274dd3f5c42e9217f2a35c99709567f617ef1def1cbd5110cfdc8b5e88bce7ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8beb17df1dddccb32b1d93f03fd20551048bc1
d150615e18bf73c1d364d7bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1936331547e3941bdaae5c90f12874b6fda39f3b6b67f678156915473cc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:53Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.176201 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaa19b0e-b6c3-4d56-b32d-a992eb6d3773\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5775ca067ce9fccaf09f228bfcf365fd53919bf4693f5dd7943183d1f14177b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c426b8000b17c761c86ae3152bb08084f63533d41a01fa039f84f0a74b89499e\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e25c4d1777f7fbfc56d52dac0ec8b209807c486a294c0e7062a49e58edb520b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ae274f3b640c3661df1109d91d4ea85e85568a686ec292397695e2a7b8703dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ae274f3b640c3661df1109d91d4ea85e85568a686ec292397695e2a7b8703dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:53Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.188468 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.188501 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.188591 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 
09:21:53.188605 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.188615 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:53Z","lastTransitionTime":"2025-12-02T09:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.189355 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c464aa1e840c3a97b63b6c760fca6c21e7a182269a602a688c70525b4bbb39a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:53Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.200956 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a3a44e4812364947f0dd8eed3d3f7252d6e37e5057b9f180eca5b8afc472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9866ea1416094cbfe3f1e850261118f82b1d977589862e7f59e13b33fc24295a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:53Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.211719 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e10258da-dad3-4df8-82c2-9d9438493a3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb3264562e09d949d9dcd64e52ee1fc0825681adc0d91f25505fdefd0a807ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c19e02f947d0d75971d14d988b7c3119365b6ba5348290ee0e23e115e14780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pzntm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:53Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.290735 4781 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.290762 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.290770 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.290784 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.290795 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:53Z","lastTransitionTime":"2025-12-02T09:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.393621 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.393692 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.394381 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.394443 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.394461 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:53Z","lastTransitionTime":"2025-12-02T09:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.497458 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.497528 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.497541 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.497558 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.497570 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:53Z","lastTransitionTime":"2025-12-02T09:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.499229 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:21:53 crc kubenswrapper[4781]: E1202 09:21:53.499361 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.600505 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.600742 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.600757 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.600776 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.600790 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:53Z","lastTransitionTime":"2025-12-02T09:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.703220 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.703258 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.703270 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.703295 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.703308 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:53Z","lastTransitionTime":"2025-12-02T09:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.806267 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.806303 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.806312 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.806327 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.806337 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:53Z","lastTransitionTime":"2025-12-02T09:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.903693 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5x7g_20ba2af9-1f67-4b6d-884a-666ef4f55bf3/ovnkube-controller/3.log" Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.908127 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.908167 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.908178 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.908197 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.908209 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:53Z","lastTransitionTime":"2025-12-02T09:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.908579 4781 scope.go:117] "RemoveContainer" containerID="355b92d44b96773d00b0e645123bf2086d26aa043b6aef98a6c90d25c7b52e3d" Dec 02 09:21:53 crc kubenswrapper[4781]: E1202 09:21:53.908867 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-x5x7g_openshift-ovn-kubernetes(20ba2af9-1f67-4b6d-884a-666ef4f55bf3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.926877 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a3a44e4812364947f0dd8eed3d3f7252d6e37e5057b9f180eca5b8afc472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9866ea1416094cbfe3f1e850261118f82b1d977589862e7f59e13b33fc24295a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:53Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.941030 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e10258da-dad3-4df8-82c2-9d9438493a3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb3264562e09d949d9dcd64e52ee1fc0825681adc0d91f25505fdefd0a807ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c19e02f947d0d75971d14d988b7c3119365b6ba5348290ee0e23e115e14780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pzntm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:53Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.955777 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ffk9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04fe11ee-efe6-4b10-a638-021e53367e2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6a45fdfe3335bc38eb30705eca6aaf887b5cd9fac235b5350832bff6df8cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46c6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ffk9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:53Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.982544 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa24d58e-64e7-4469-9611-3dab9c142594\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7af0caf9bd11c42c48bbf4dcd4e719840d2e2dbce55ec205dd2cc103b71772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7035e3038f33a299c00380bc79d4315ec612bd6f4009d55258f02b5c3ec70e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://274dd3f5c42e9217f2a35c99709567f617ef1def1cbd5110cfdc8b5e88bce7ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8beb17df1dddccb32b1d93f03fd20551048bc1
d150615e18bf73c1d364d7bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1936331547e3941bdaae5c90f12874b6fda39f3b6b67f678156915473cc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:53Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:53 crc kubenswrapper[4781]: I1202 09:21:53.998081 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaa19b0e-b6c3-4d56-b32d-a992eb6d3773\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5775ca067ce9fccaf09f228bfcf365fd53919bf4693f5dd7943183d1f14177b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c426b8000b17c761c86ae3152bb08084f63533d41a01fa039f84f0a74b89499e\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e25c4d1777f7fbfc56d52dac0ec8b209807c486a294c0e7062a49e58edb520b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ae274f3b640c3661df1109d91d4ea85e85568a686ec292397695e2a7b8703dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ae274f3b640c3661df1109d91d4ea85e85568a686ec292397695e2a7b8703dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:53Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.011061 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.011126 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.011142 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 
09:21:54.011165 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.011181 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:54Z","lastTransitionTime":"2025-12-02T09:21:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.019093 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c464aa1e840c3a97b63b6c760fca6c21e7a182269a602a688c70525b4bbb39a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:54Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.037627 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba607959-82d4-440b-b830-95eb9584db6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08c7d87bc390c31adc6d7412985f5bbfe535ea6dccb586f9be2823a17bce3e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0090e6856a6d66e284f4f4eecb2d193720318f33a0260d470a50af72c84f6719\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0179b61af209d72b1e1a79835f2819b3af773ab99b0c431733beba8a4dcf2eeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5356a78651fc6cb4214e10b8b25fd20d7f6acb739abdcf2e1de6dd666a31edaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 09:20:56.232387 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 09:20:56.232507 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 09:20:56.233123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3642890392/tls.crt::/tmp/serving-cert-3642890392/tls.key\\\\\\\"\\\\nI1202 09:20:56.583994 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 09:20:56.586003 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 09:20:56.586021 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 09:20:56.586041 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 09:20:56.586046 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 09:20:56.590021 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 09:20:56.590050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590054 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 09:20:56.590065 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 09:20:56.590068 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 09:20:56.590073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 09:20:56.590248 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 09:20:56.592236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a99d096eb7cf9386a857100ae0faff8a3b17140eaada581e5511a88aadc0f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:54Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.049954 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kgbn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9440f972-ed59-4852-a180-3d5a2111f966\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eda6361552bc105ca2a6cc3444d96fbf79b10eff612e87aa22c1559fa8c90fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hxs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kgbn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:54Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.079462 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://506fd3b326fdaebe64874c208bb15c9cbf5aff36602f221f27f87fe0214ce6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c8c9c60e0354d49255e6b14da6808152f6528f0a51ce992709a7014b9f64d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5c85414a97917ad2db6270144c476b16b50ad3256bf6c694c228b8756c379dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6086fa7a75ccf19fcc7b89e83a2dbe12ed15c729bb8a4209c18acd54c2070a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865e78637e3692046c9c8a24f5238eb9636bc7e388cea6e0fec3c5c68d21cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://458448bab8d0e21295e859a857d3a3273cc4f430f2fc0fdfb3e10b9a9ab813ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://355b92d44b96773d00b0e645123bf2086d26aa043b6aef98a6c90d25c7b52e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://355b92d44b96773d00b0e645123bf2086d26aa043b6aef98a6c90d25c7b52e3d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T09:21:51Z\\\",\\\"message\\\":\\\"tart default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:51Z is after 2025-08-24T17:21:41Z]\\\\nI1202 09:21:51.613500 6815 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1202 09:21:51.613511 6815 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI1202 09:21:51.613473 6815 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI1202 09:21:51.613517 6815 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1202 09:21:51.613522 6815 default_network_controller.go:776] Recording success event o\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-x5x7g_openshift-ovn-kubernetes(20ba2af9-1f67-4b6d-884a-666ef4f55bf3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cbd533b4320d178a3ee1bb3d666d5a1ad0a1d7e9f38e58735acac86c590726\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5x7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:54Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.091129 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k2h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be967da7-3e7f-47e6-9d54-408ae99531a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b30e09e5a7f3200cca709bf3437af0ce93cf039451c894bdfd813ae35d3c70c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzmcq
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://735c6d3ed169d7e11f8f6cc0d99fc415020928f852a554288c6c05f3dc99bb3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzmcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:21:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7k2h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:54Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.106740 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:54Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.114852 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.114884 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.114894 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.114908 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.114917 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:54Z","lastTransitionTime":"2025-12-02T09:21:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.119372 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:54Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.164524 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8b6p8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a029cea19e689697b3e833a0bd6c745475464429e227868f4cabed7c7fe3ec71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bc3d340494b10e89f11eb2dfc63584a9c5660dfe367792a3d422a28dbf5a2e9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T09:21:43Z\\\",\\\"message\\\":\\\"2025-12-02T09:20:58+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3fddf671-788e-42f6-aaea-cc2b46cff00f\\\\n2025-12-02T09:20:58+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3fddf671-788e-42f6-aaea-cc2b46cff00f to /host/opt/cni/bin/\\\\n2025-12-02T09:20:58Z [verbose] multus-daemon started\\\\n2025-12-02T09:20:58Z [verbose] Readiness Indicator file check\\\\n2025-12-02T09:21:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxbk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8b6p8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:54Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.176698 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q792g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcdae8ff-3e82-4785-b958-a98717a14787\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vl4xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vl4xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:21:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q792g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:54Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.190670 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8157f9-9d69-41d0-8158-f5d166743724\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a36c467812da990b1678a8465bf25d97c4d8226e88ed27eea452b35a75c5dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c09074e0829703890129b7052956f56e1a93cad8d1bcf9e480b3a7277ae930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f99f962bf9ffb2475be15e4fe8d41c3fe3c870d092a50b96a19fe9c759462ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc9e488e47baf2860cd0133d661714b47495618ce410ee8ad0fa9292da3cfdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:54Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.202259 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:54Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.212487 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822d2ca8c4fb83c643f704e9acc8e4034235d2d1e4730a9495aa0a6a8df9979e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:54Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.221438 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.221502 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.221522 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.221545 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.221562 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:54Z","lastTransitionTime":"2025-12-02T09:21:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.224678 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dnkgc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aa44429-873b-48f6-bc13-55745827d8fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1586e0f941724ff8afcd4030d43ebb517e0782ece160fe634ed828af68b1965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":
\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e98
75055a0ccdd4a00355c27ab2e91a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dnkgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:54Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.324549 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.324585 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.324595 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.324612 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.324623 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:54Z","lastTransitionTime":"2025-12-02T09:21:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.427517 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.427562 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.427572 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.427588 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.427601 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:54Z","lastTransitionTime":"2025-12-02T09:21:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.499690 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.499726 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.499698 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:21:54 crc kubenswrapper[4781]: E1202 09:21:54.499874 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 09:21:54 crc kubenswrapper[4781]: E1202 09:21:54.500040 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q792g" podUID="bcdae8ff-3e82-4785-b958-a98717a14787" Dec 02 09:21:54 crc kubenswrapper[4781]: E1202 09:21:54.500130 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.529537 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.529578 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.529589 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.529603 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.529614 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:54Z","lastTransitionTime":"2025-12-02T09:21:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.632691 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.632756 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.632776 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.632800 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.632818 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:54Z","lastTransitionTime":"2025-12-02T09:21:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.735474 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.735536 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.735651 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.735680 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.735701 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:54Z","lastTransitionTime":"2025-12-02T09:21:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.839001 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.839068 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.839087 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.839111 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.839129 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:54Z","lastTransitionTime":"2025-12-02T09:21:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.941745 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.941813 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.941837 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.941869 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:54 crc kubenswrapper[4781]: I1202 09:21:54.941895 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:54Z","lastTransitionTime":"2025-12-02T09:21:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.044377 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.044436 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.044458 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.044485 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.044509 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:55Z","lastTransitionTime":"2025-12-02T09:21:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.147264 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.147306 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.147318 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.147336 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.147349 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:55Z","lastTransitionTime":"2025-12-02T09:21:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.249700 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.249749 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.249761 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.249780 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.249791 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:55Z","lastTransitionTime":"2025-12-02T09:21:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.353024 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.353129 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.353193 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.353221 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.353282 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:55Z","lastTransitionTime":"2025-12-02T09:21:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.456628 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.456679 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.456695 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.456717 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.456733 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:55Z","lastTransitionTime":"2025-12-02T09:21:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.499443 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:21:55 crc kubenswrapper[4781]: E1202 09:21:55.499717 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.560032 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.560066 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.560079 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.560096 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.560107 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:55Z","lastTransitionTime":"2025-12-02T09:21:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.662290 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.662322 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.662344 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.662358 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.662367 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:55Z","lastTransitionTime":"2025-12-02T09:21:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.765844 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.765904 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.765916 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.765943 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.765956 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:55Z","lastTransitionTime":"2025-12-02T09:21:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.869453 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.869519 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.869535 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.869560 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.869580 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:55Z","lastTransitionTime":"2025-12-02T09:21:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.971540 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.971845 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.971940 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.972017 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:55 crc kubenswrapper[4781]: I1202 09:21:55.972081 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:55Z","lastTransitionTime":"2025-12-02T09:21:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:56 crc kubenswrapper[4781]: I1202 09:21:56.076322 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:56 crc kubenswrapper[4781]: I1202 09:21:56.076405 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:56 crc kubenswrapper[4781]: I1202 09:21:56.076431 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:56 crc kubenswrapper[4781]: I1202 09:21:56.076460 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:56 crc kubenswrapper[4781]: I1202 09:21:56.076482 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:56Z","lastTransitionTime":"2025-12-02T09:21:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:56 crc kubenswrapper[4781]: I1202 09:21:56.178466 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:56 crc kubenswrapper[4781]: I1202 09:21:56.178513 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:56 crc kubenswrapper[4781]: I1202 09:21:56.178524 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:56 crc kubenswrapper[4781]: I1202 09:21:56.178544 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:56 crc kubenswrapper[4781]: I1202 09:21:56.178556 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:56Z","lastTransitionTime":"2025-12-02T09:21:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:56 crc kubenswrapper[4781]: I1202 09:21:56.280853 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:56 crc kubenswrapper[4781]: I1202 09:21:56.281248 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:56 crc kubenswrapper[4781]: I1202 09:21:56.281320 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:56 crc kubenswrapper[4781]: I1202 09:21:56.281394 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:56 crc kubenswrapper[4781]: I1202 09:21:56.281479 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:56Z","lastTransitionTime":"2025-12-02T09:21:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:56 crc kubenswrapper[4781]: I1202 09:21:56.382972 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:56 crc kubenswrapper[4781]: I1202 09:21:56.383004 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:56 crc kubenswrapper[4781]: I1202 09:21:56.383014 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:56 crc kubenswrapper[4781]: I1202 09:21:56.383044 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:56 crc kubenswrapper[4781]: I1202 09:21:56.383054 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:56Z","lastTransitionTime":"2025-12-02T09:21:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:56 crc kubenswrapper[4781]: I1202 09:21:56.484813 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:56 crc kubenswrapper[4781]: I1202 09:21:56.484852 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:56 crc kubenswrapper[4781]: I1202 09:21:56.484860 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:56 crc kubenswrapper[4781]: I1202 09:21:56.484876 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:56 crc kubenswrapper[4781]: I1202 09:21:56.484885 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:56Z","lastTransitionTime":"2025-12-02T09:21:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:56 crc kubenswrapper[4781]: I1202 09:21:56.499227 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 09:21:56 crc kubenswrapper[4781]: I1202 09:21:56.499249 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:21:56 crc kubenswrapper[4781]: I1202 09:21:56.499280 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:21:56 crc kubenswrapper[4781]: E1202 09:21:56.499724 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q792g" podUID="bcdae8ff-3e82-4785-b958-a98717a14787" Dec 02 09:21:56 crc kubenswrapper[4781]: E1202 09:21:56.499718 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 09:21:56 crc kubenswrapper[4781]: E1202 09:21:56.499878 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 09:21:56 crc kubenswrapper[4781]: I1202 09:21:56.587887 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:56 crc kubenswrapper[4781]: I1202 09:21:56.588250 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:56 crc kubenswrapper[4781]: I1202 09:21:56.588401 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:56 crc kubenswrapper[4781]: I1202 09:21:56.588559 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:56 crc kubenswrapper[4781]: I1202 09:21:56.588688 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:56Z","lastTransitionTime":"2025-12-02T09:21:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:56 crc kubenswrapper[4781]: I1202 09:21:56.692200 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:56 crc kubenswrapper[4781]: I1202 09:21:56.692532 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:56 crc kubenswrapper[4781]: I1202 09:21:56.692682 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:56 crc kubenswrapper[4781]: I1202 09:21:56.692822 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:56 crc kubenswrapper[4781]: I1202 09:21:56.693000 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:56Z","lastTransitionTime":"2025-12-02T09:21:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 02 09:21:56 crc kubenswrapper[4781]: I1202 09:21:56.796238 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:56 crc kubenswrapper[4781]: I1202 09:21:56.796332 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:56 crc kubenswrapper[4781]: I1202 09:21:56.796355 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:56 crc kubenswrapper[4781]: I1202 09:21:56.796385 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:56 crc kubenswrapper[4781]: I1202 09:21:56.796409 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:56Z","lastTransitionTime":"2025-12-02T09:21:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:56 crc kubenswrapper[4781]: I1202 09:21:56.900321 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:56 crc kubenswrapper[4781]: I1202 09:21:56.902249 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:56 crc kubenswrapper[4781]: I1202 09:21:56.902506 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:56 crc kubenswrapper[4781]: I1202 09:21:56.903207 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:56 crc kubenswrapper[4781]: I1202 09:21:56.903660 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:56Z","lastTransitionTime":"2025-12-02T09:21:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.007208 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.007289 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.007331 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.007355 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.007369 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:57Z","lastTransitionTime":"2025-12-02T09:21:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.112010 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.112072 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.112085 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.112104 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.112116 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:57Z","lastTransitionTime":"2025-12-02T09:21:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.215447 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.215515 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.215529 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.215553 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.215566 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:57Z","lastTransitionTime":"2025-12-02T09:21:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.320134 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.320505 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.320643 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.320782 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.320948 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:57Z","lastTransitionTime":"2025-12-02T09:21:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
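The run above repeats because the kubelet re-checks container runtime status on a short interval and keeps finding no CNI configuration. A minimal sketch of the check that message describes, written in Python rather than the kubelet's Go and assuming only the directory quoted in the log plus the standard CNI config extensions (.conf, .conflist, .json); this is illustrative, not kubelet source:

import os

CNI_CONF_DIR = "/etc/kubernetes/cni/net.d"  # directory quoted verbatim in the log

def network_ready(conf_dir: str = CNI_CONF_DIR) -> bool:
    """Mirror the condition behind NetworkReady=false: is any CNI config present?"""
    try:
        entries = os.listdir(conf_dir)
    except FileNotFoundError:
        return False  # a missing directory is equivalent to no configuration
    return any(name.endswith((".conf", ".conflist", ".json")) for name in entries)

if __name__ == "__main__":
    if not network_ready():
        print("NetworkReady=false: no CNI configuration file in", CNI_CONF_DIR)

Once the network plugin writes a config file into that directory, the same periodic poll should flip the node's Ready condition back to True.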
Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.424339 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.424403 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.424422 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.424439 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.424458 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:57Z","lastTransitionTime":"2025-12-02T09:21:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.500356 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 02 09:21:57 crc kubenswrapper[4781]: E1202 09:21:57.500604 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.520563 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c464aa1e840c3a97b63b6c760fca6c21e7a182269a602a688c70525b4bbb39a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.528968 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.529030 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.529050 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.529076 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.529097 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:57Z","lastTransitionTime":"2025-12-02T09:21:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration 
Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.545044 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a3a44e4812364947f0dd8eed3d3f7252d6e37e5057b9f180eca5b8afc472d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9866ea1416094cbfe3f1e850261118f82b1d977589862e7f59e13b33fc24295a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:57Z is after 2025-08-24T17:21:41Z"
Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.561528 4781 status_manager.go:875] "Failed to update status for pod"
pod="openshift-machine-config-operator/machine-config-daemon-pzntm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e10258da-dad3-4df8-82c2-9d9438493a3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb3264562e09d949d9dcd64e52ee1fc0825681adc0d91f25505fdefd0a807ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c19e02f947d0d75971d14d988b7c3119365b6ba5348290ee0e23e115e14780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bph4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pzntm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:57Z is after 2025-08-24T17:21:41Z" Dec 02 
09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.577657 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ffk9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04fe11ee-efe6-4b10-a638-021e53367e2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6a45fdfe3335bc38eb30705eca6aaf887b5cd9fac235b5350832bff6df8cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46c6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ffk9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.606428 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa24d58e-64e7-4469-9611-3dab9c142594\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7af0caf9bd11c42c48bbf4dcd4e719840d2e2dbce55ec205dd2cc103b71772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7035e3038f33a299c00380bc79d4315ec612bd6f4009d55258f02b5c3ec70e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://274dd3f5c42e9217f2a35c99709567f617ef1def1cbd5110cfdc8b5e88bce7ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8beb17df1dddccb32b1d93f03fd20551048bc1
d150615e18bf73c1d364d7bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1936331547e3941bdaae5c90f12874b6fda39f3b6b67f678156915473cc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77054ffcfe864327558efeab91c1e15a99bfef4cef26960e27464317a229bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d5b54c90f98cb81fb440da5ea8186b2f0b1c7a5063c1b4f9e38c72458a0a67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab441e13cadbafc7f145c2648f02bf5df8ee405021318826e605bc7050533c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.626463 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaa19b0e-b6c3-4d56-b32d-a992eb6d3773\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5775ca067ce9fccaf09f228bfcf365fd53919bf4693f5dd7943183d1f14177b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c426b8000b17c761c86ae3152bb08084f63533d41a01fa039f84f0a74b89499e\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e25c4d1777f7fbfc56d52dac0ec8b209807c486a294c0e7062a49e58edb520b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ae274f3b640c3661df1109d91d4ea85e85568a686ec292397695e2a7b8703dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ae274f3b640c3661df1109d91d4ea85e85568a686ec292397695e2a7b8703dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.632686 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.632752 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.632774 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 
09:21:57.632807 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.632828 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:57Z","lastTransitionTime":"2025-12-02T09:21:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.648370 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k2h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be967da7-3e7f-47e6-9d54-408ae99531a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b30e09e5a7f3200cca709bf3437af0ce93cf039451c894bdfd813ae35d3c70c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzmcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://735c6d3ed169d7e11f8f6cc0d99fc415020928f852a554288c6c05f3dc99bb3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountP
ath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzmcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:21:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7k2h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.668085 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba607959-82d4-440b-b830-95eb9584db6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08c7d87bc390c31adc6d7412985f5bbfe535ea6dccb586f9be2823a17bce3e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0090e6856a6d66e284f4f4eecb2d193720318f33a0260d470a50af72c84f6719\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"r
esource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0179b61af209d72b1e1a79835f2819b3af773ab99b0c431733beba8a4dcf2eeb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5356a78651fc6cb4214e10b8b25fd20d7f6acb739abdcf2e1de6dd666a31edaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW1202 09:20:56.232387 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 09:20:56.232507 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 09:20:56.233123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3642890392/tls.crt::/tmp/serving-cert-3642890392/tls.key\\\\\\\"\\\\nI1202 09:20:56.583994 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 09:20:56.586003 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 09:20:56.586021 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 09:20:56.586041 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 09:20:56.586046 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 09:20:56.590021 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 09:20:56.590050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590054 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 09:20:56.590062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 09:20:56.590065 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 09:20:56.590068 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 09:20:56.590073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 09:20:56.590248 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 
09:20:56.592236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a99d096eb7cf9386a857100ae0faff8a3b17140eaada581e5511a88aadc0f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.687435 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kgbn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9440f972-ed59-4852-a180-3d5a2111f966\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7eda6361552bc105ca2a6cc3444d96fbf79b10eff612e87aa22c1559fa8c90fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hxs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kgbn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.710383 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://506fd3b326fdaebe64874c208bb15c9cbf5aff36602f221f27f87fe0214ce6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c8c9c60e0354d49255e6b14da6808152f6528f0a51ce992709a7014b9f64d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5c85414a97917ad2db6270144c476b16b50ad3256bf6c694c228b8756c379dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6086fa7a75ccf19fcc7b89e83a2dbe12ed15c729bb8a4209c18acd54c2070a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865e78637e3692046c9c8a24f5238eb9636bc7e388cea6e0fec3c5c68d21cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://458448bab8d0e21295e859a857d3a3273cc4f430f2fc0fdfb3e10b9a9ab813ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://355b92d44b96773d00b0e645123bf2086d26aa043b6aef98a6c90d25c7b52e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://355b92d44b96773d00b0e645123bf2086d26aa043b6aef98a6c90d25c7b52e3d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T09:21:51Z\\\",\\\"message\\\":\\\"tart default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:51Z is after 2025-08-24T17:21:41Z]\\\\nI1202 09:21:51.613500 6815 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1202 09:21:51.613511 6815 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI1202 09:21:51.613473 6815 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI1202 09:21:51.613517 6815 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1202 09:21:51.613522 6815 default_network_controller.go:776] Recording success event o\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-x5x7g_openshift-ovn-kubernetes(20ba2af9-1f67-4b6d-884a-666ef4f55bf3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cbd533b4320d178a3ee1bb3d666d5a1ad0a1d7e9f38e58735acac86c590726\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxmlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5x7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.730432 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.735482 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.735542 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.735555 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.735578 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.735594 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:57Z","lastTransitionTime":"2025-12-02T09:21:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.746288 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.770304 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dnkgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aa44429-873b-48f6-bc13-55745827d8fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1586e0f941724ff8afcd4030d43ebb517e0782ece160fe634ed828af68b1965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66eaf6eafe4b882dd4b8e17474498eb2cb566f761a099dce0e3ec9c40d068a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f1114dff9f3c131c5ca1dd10ee2cc2ecf95a7355196a7da619486a3028d3f1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e2e268ff888ba7c23ae6555d431db904d871550d5e591cba785618c397d011f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a650e19e523d6e4eaa7c69a35ed12741e9875055a0ccdd4a00355c27ab2e91a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea70781b9498a2473d55bec42b18460d4072c3b001c725999fff1e049242e3be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed8f2ce6a5e4507a5960916fb145329d0a2ad6cf1e7b8d646dd3aad86ac4cbca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T09:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T09:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9svf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dnkgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.787670 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8b6p8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a029cea19e689697b3e833a0bd6c745475464429e227868f4cabed7c7fe3ec71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bc3d340494b10e89f11eb2dfc63584a9c5660dfe367792a3d422a28dbf5a2e9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T09:21:43Z\\\",\\\"message\\\":\\\"2025-12-02T09:20:58+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3fddf671-788e-42f6-aaea-cc2b46cff00f\\\\n2025-12-02T09:20:58+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3fddf671-788e-42f6-aaea-cc2b46cff00f to /host/opt/cni/bin/\\\\n2025-12-02T09:20:58Z [verbose] multus-daemon started\\\\n2025-12-02T09:20:58Z [verbose] Readiness Indicator file check\\\\n2025-12-02T09:21:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxbk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8b6p8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.801975 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q792g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcdae8ff-3e82-4785-b958-a98717a14787\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:21:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vl4xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vl4xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:21:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q792g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.817458 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8157f9-9d69-41d0-8158-f5d166743724\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a36c467812da990b1678a8465bf25d97c4d8226e88ed27eea452b35a75c5dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c09074e0829703890129b7052956f56e1a93cad8d1bcf9e480b3a7277ae930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f99f962bf9ffb2475be15e4fe8d41c3fe3c870d092a50b96a19fe9c759462ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc9e488e47baf2860cd0133d661714b47495618ce410ee8ad0fa9292da3cfdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T09:20:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.832481 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.837763 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.837813 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.837827 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.837847 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.837864 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:57Z","lastTransitionTime":"2025-12-02T09:21:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.850456 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T09:20:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://822d2ca8c4fb83c643f704e9acc8e4034235d2d1e4730a9495aa0a6a8df9979e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T09:20:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:21:57Z is after 2025-08-24T17:21:41Z" Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.940716 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.940772 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.940785 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.940806 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:57 crc kubenswrapper[4781]: I1202 09:21:57.940819 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:57Z","lastTransitionTime":"2025-12-02T09:21:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.042914 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.042980 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.042994 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.043010 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.043023 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:58Z","lastTransitionTime":"2025-12-02T09:21:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.145562 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.145639 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.145663 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.145698 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.145724 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:58Z","lastTransitionTime":"2025-12-02T09:21:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.248912 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.248981 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.248998 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.249018 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.249034 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:58Z","lastTransitionTime":"2025-12-02T09:21:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.350854 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.350970 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.350984 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.351002 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.351043 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:58Z","lastTransitionTime":"2025-12-02T09:21:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.453776 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.453844 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.453869 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.453898 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.453957 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:58Z","lastTransitionTime":"2025-12-02T09:21:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.498777 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.498851 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:21:58 crc kubenswrapper[4781]: E1202 09:21:58.498935 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.498992 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:21:58 crc kubenswrapper[4781]: E1202 09:21:58.499073 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q792g" podUID="bcdae8ff-3e82-4785-b958-a98717a14787" Dec 02 09:21:58 crc kubenswrapper[4781]: E1202 09:21:58.499184 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.557072 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.557147 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.557166 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.557190 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.557223 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:58Z","lastTransitionTime":"2025-12-02T09:21:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.659502 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.659541 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.659553 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.659569 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.659581 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:58Z","lastTransitionTime":"2025-12-02T09:21:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.762290 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.762336 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.762352 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.762372 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.762385 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:58Z","lastTransitionTime":"2025-12-02T09:21:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.864721 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.864781 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.864798 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.864824 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.864841 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:58Z","lastTransitionTime":"2025-12-02T09:21:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.967458 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.967493 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.967505 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.967518 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:58 crc kubenswrapper[4781]: I1202 09:21:58.967526 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:58Z","lastTransitionTime":"2025-12-02T09:21:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.070961 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.071004 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.071013 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.071026 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.071044 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:59Z","lastTransitionTime":"2025-12-02T09:21:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.173907 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.173952 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.173961 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.173974 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.173983 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:59Z","lastTransitionTime":"2025-12-02T09:21:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.276235 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.276285 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.276294 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.276309 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.276319 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:59Z","lastTransitionTime":"2025-12-02T09:21:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.379157 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.379190 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.379201 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.379236 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.379248 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:59Z","lastTransitionTime":"2025-12-02T09:21:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.461570 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.461742 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:21:59 crc kubenswrapper[4781]: E1202 09:21:59.461792 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:03.461757381 +0000 UTC m=+146.285631270 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.461850 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.461960 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:21:59 crc kubenswrapper[4781]: E1202 09:21:59.461968 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 09:21:59 crc kubenswrapper[4781]: E1202 09:21:59.462001 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 09:21:59 crc kubenswrapper[4781]: E1202 09:21:59.462025 4781 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.462033 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:21:59 crc kubenswrapper[4781]: E1202 09:21:59.462108 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 09:23:03.462081149 +0000 UTC m=+146.285955068 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 09:21:59 crc kubenswrapper[4781]: E1202 09:21:59.462185 4781 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 09:21:59 crc kubenswrapper[4781]: E1202 09:21:59.462213 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 09:21:59 crc kubenswrapper[4781]: E1202 09:21:59.462238 4781 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 09:21:59 crc kubenswrapper[4781]: E1202 09:21:59.462259 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 09:21:59 crc kubenswrapper[4781]: E1202 09:21:59.462280 4781 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 09:21:59 crc kubenswrapper[4781]: E1202 09:21:59.462236 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 09:23:03.462224672 +0000 UTC m=+146.286098641 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 09:21:59 crc kubenswrapper[4781]: E1202 09:21:59.462341 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 09:23:03.462321604 +0000 UTC m=+146.286195553 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 09:21:59 crc kubenswrapper[4781]: E1202 09:21:59.462363 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-12-02 09:23:03.462352565 +0000 UTC m=+146.286226574 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.482140 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.482180 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.482190 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.482206 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.482217 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:59Z","lastTransitionTime":"2025-12-02T09:21:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.498717 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:21:59 crc kubenswrapper[4781]: E1202 09:21:59.498897 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.584542 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.584596 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.584616 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.584638 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.584655 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:59Z","lastTransitionTime":"2025-12-02T09:21:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.688075 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.688138 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.688160 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.688190 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.688211 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:59Z","lastTransitionTime":"2025-12-02T09:21:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.790880 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.790910 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.790938 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.790952 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.790961 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:59Z","lastTransitionTime":"2025-12-02T09:21:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.893403 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.893509 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.893522 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.893540 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.893551 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:59Z","lastTransitionTime":"2025-12-02T09:21:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.995245 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.995306 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.995324 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.995356 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:21:59 crc kubenswrapper[4781]: I1202 09:21:59.995382 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:21:59Z","lastTransitionTime":"2025-12-02T09:21:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.098391 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.098470 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.098494 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.098525 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.098547 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:00Z","lastTransitionTime":"2025-12-02T09:22:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.201920 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.202041 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.202067 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.202098 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.202121 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:00Z","lastTransitionTime":"2025-12-02T09:22:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.305425 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.305459 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.305470 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.305488 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.305499 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:00Z","lastTransitionTime":"2025-12-02T09:22:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.408977 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.409036 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.409048 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.409071 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.409084 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:00Z","lastTransitionTime":"2025-12-02T09:22:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.499573 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.499613 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.499676 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 09:22:00 crc kubenswrapper[4781]: E1202 09:22:00.499757 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q792g" podUID="bcdae8ff-3e82-4785-b958-a98717a14787" Dec 02 09:22:00 crc kubenswrapper[4781]: E1202 09:22:00.499912 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 09:22:00 crc kubenswrapper[4781]: E1202 09:22:00.500092 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.512313 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.512393 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.512410 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.512433 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.512449 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:00Z","lastTransitionTime":"2025-12-02T09:22:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.615799 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.615869 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.615889 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.615914 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.615960 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:00Z","lastTransitionTime":"2025-12-02T09:22:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.718918 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.719026 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.719045 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.719072 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.719098 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:00Z","lastTransitionTime":"2025-12-02T09:22:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.746193 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.746256 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.746273 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.746297 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.746314 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:00Z","lastTransitionTime":"2025-12-02T09:22:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:22:00 crc kubenswrapper[4781]: E1202 09:22:00.760835 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:22:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:22:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:22:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:22:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:22:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:22:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:22:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:22:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a8aab02d-934e-4a61-8d03-a223ac62150b\\\",\\\"systemUUID\\\":\\\"c8f9d245-0a2e-447a-a09e-ff80f79ba02f\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:22:00Z is after 
2025-08-24T17:21:41Z" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.766074 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.766130 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.766143 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.766160 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.766172 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:00Z","lastTransitionTime":"2025-12-02T09:22:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:00 crc kubenswrapper[4781]: E1202 09:22:00.785562 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:22:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:22:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:22:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:22:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:22:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:22:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:22:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:22:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a8aab02d-934e-4a61-8d03-a223ac62150b\\\",\\\"systemUUID\\\":\\\"c8f9d245-0a2e-447a-a09e-ff80f79ba02f\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:22:00Z is after 
2025-08-24T17:21:41Z" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.790684 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.790754 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.790837 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.790870 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.790894 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:00Z","lastTransitionTime":"2025-12-02T09:22:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:00 crc kubenswrapper[4781]: E1202 09:22:00.811312 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:22:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:22:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:22:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:22:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:22:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:22:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:22:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:22:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a8aab02d-934e-4a61-8d03-a223ac62150b\\\",\\\"systemUUID\\\":\\\"c8f9d245-0a2e-447a-a09e-ff80f79ba02f\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:22:00Z is after 
2025-08-24T17:21:41Z" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.815558 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.815589 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.815601 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.815615 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.815625 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:00Z","lastTransitionTime":"2025-12-02T09:22:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:00 crc kubenswrapper[4781]: E1202 09:22:00.834113 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:22:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:22:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:22:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:22:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:22:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:22:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:22:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:22:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a8aab02d-934e-4a61-8d03-a223ac62150b\\\",\\\"systemUUID\\\":\\\"c8f9d245-0a2e-447a-a09e-ff80f79ba02f\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:22:00Z is after 
2025-08-24T17:21:41Z" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.838391 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.838532 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.838551 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.838574 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.838590 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:00Z","lastTransitionTime":"2025-12-02T09:22:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:00 crc kubenswrapper[4781]: E1202 09:22:00.852051 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:22:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:22:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:22:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:22:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:22:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:22:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:22:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T09:22:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a8aab02d-934e-4a61-8d03-a223ac62150b\\\",\\\"systemUUID\\\":\\\"c8f9d245-0a2e-447a-a09e-ff80f79ba02f\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T09:22:00Z is after 
2025-08-24T17:21:41Z" Dec 02 09:22:00 crc kubenswrapper[4781]: E1202 09:22:00.852279 4781 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.853528 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.853579 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.853596 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.853618 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.853635 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:00Z","lastTransitionTime":"2025-12-02T09:22:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.957396 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.957460 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.957477 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.957503 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:00 crc kubenswrapper[4781]: I1202 09:22:00.957522 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:00Z","lastTransitionTime":"2025-12-02T09:22:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.060148 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.060270 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.060289 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.060316 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.060335 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:01Z","lastTransitionTime":"2025-12-02T09:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.163862 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.163988 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.164014 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.164048 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.164082 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:01Z","lastTransitionTime":"2025-12-02T09:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.266496 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.266640 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.266668 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.266702 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.266727 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:01Z","lastTransitionTime":"2025-12-02T09:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.370239 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.370301 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.370319 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.370346 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.370370 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:01Z","lastTransitionTime":"2025-12-02T09:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.473660 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.473741 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.473760 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.473793 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.473814 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:01Z","lastTransitionTime":"2025-12-02T09:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.501080 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:22:01 crc kubenswrapper[4781]: E1202 09:22:01.501421 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.575980 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.576015 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.576024 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.576038 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.576047 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:01Z","lastTransitionTime":"2025-12-02T09:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.678615 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.678696 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.678727 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.678761 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.678784 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:01Z","lastTransitionTime":"2025-12-02T09:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.780911 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.780965 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.780975 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.780988 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.780998 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:01Z","lastTransitionTime":"2025-12-02T09:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.883516 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.883631 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.883654 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.883676 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.883696 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:01Z","lastTransitionTime":"2025-12-02T09:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.986403 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.986447 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.986458 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.986476 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:01 crc kubenswrapper[4781]: I1202 09:22:01.986490 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:01Z","lastTransitionTime":"2025-12-02T09:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:22:02 crc kubenswrapper[4781]: I1202 09:22:02.089260 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:02 crc kubenswrapper[4781]: I1202 09:22:02.089303 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:02 crc kubenswrapper[4781]: I1202 09:22:02.089312 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:02 crc kubenswrapper[4781]: I1202 09:22:02.089329 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:02 crc kubenswrapper[4781]: I1202 09:22:02.089340 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:02Z","lastTransitionTime":"2025-12-02T09:22:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:02 crc kubenswrapper[4781]: I1202 09:22:02.196326 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:02 crc kubenswrapper[4781]: I1202 09:22:02.196376 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:02 crc kubenswrapper[4781]: I1202 09:22:02.196389 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:02 crc kubenswrapper[4781]: I1202 09:22:02.196406 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:02 crc kubenswrapper[4781]: I1202 09:22:02.196418 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:02Z","lastTransitionTime":"2025-12-02T09:22:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:02 crc kubenswrapper[4781]: I1202 09:22:02.299504 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:02 crc kubenswrapper[4781]: I1202 09:22:02.299551 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:02 crc kubenswrapper[4781]: I1202 09:22:02.299567 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:02 crc kubenswrapper[4781]: I1202 09:22:02.299589 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:02 crc kubenswrapper[4781]: I1202 09:22:02.299606 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:02Z","lastTransitionTime":"2025-12-02T09:22:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:22:02 crc kubenswrapper[4781]: I1202 09:22:02.403062 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:02 crc kubenswrapper[4781]: I1202 09:22:02.403116 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:02 crc kubenswrapper[4781]: I1202 09:22:02.403135 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:02 crc kubenswrapper[4781]: I1202 09:22:02.403164 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:02 crc kubenswrapper[4781]: I1202 09:22:02.403181 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:02Z","lastTransitionTime":"2025-12-02T09:22:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:02 crc kubenswrapper[4781]: I1202 09:22:02.499459 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:22:02 crc kubenswrapper[4781]: I1202 09:22:02.500162 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:22:02 crc kubenswrapper[4781]: I1202 09:22:02.500310 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 09:22:02 crc kubenswrapper[4781]: E1202 09:22:02.500311 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q792g" podUID="bcdae8ff-3e82-4785-b958-a98717a14787" Dec 02 09:22:02 crc kubenswrapper[4781]: E1202 09:22:02.500410 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 09:22:02 crc kubenswrapper[4781]: E1202 09:22:02.500463 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 09:22:02 crc kubenswrapper[4781]: I1202 09:22:02.506291 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:02 crc kubenswrapper[4781]: I1202 09:22:02.506328 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:02 crc kubenswrapper[4781]: I1202 09:22:02.506343 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:02 crc kubenswrapper[4781]: I1202 09:22:02.506364 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:02 crc kubenswrapper[4781]: I1202 09:22:02.506380 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:02Z","lastTransitionTime":"2025-12-02T09:22:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:02 crc kubenswrapper[4781]: I1202 09:22:02.509227 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 02 09:22:02 crc kubenswrapper[4781]: I1202 09:22:02.610102 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:02 crc kubenswrapper[4781]: I1202 09:22:02.610152 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:02 crc kubenswrapper[4781]: I1202 09:22:02.610164 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:02 crc kubenswrapper[4781]: I1202 09:22:02.610182 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:02 crc kubenswrapper[4781]: I1202 09:22:02.610196 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:02Z","lastTransitionTime":"2025-12-02T09:22:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:22:02 crc kubenswrapper[4781]: I1202 09:22:02.712702 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:02 crc kubenswrapper[4781]: I1202 09:22:02.712763 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:02 crc kubenswrapper[4781]: I1202 09:22:02.712790 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:02 crc kubenswrapper[4781]: I1202 09:22:02.712817 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:02 crc kubenswrapper[4781]: I1202 09:22:02.712836 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:02Z","lastTransitionTime":"2025-12-02T09:22:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:02 crc kubenswrapper[4781]: I1202 09:22:02.815612 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:02 crc kubenswrapper[4781]: I1202 09:22:02.815667 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:02 crc kubenswrapper[4781]: I1202 09:22:02.815680 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:02 crc kubenswrapper[4781]: I1202 09:22:02.815698 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:02 crc kubenswrapper[4781]: I1202 09:22:02.815713 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:02Z","lastTransitionTime":"2025-12-02T09:22:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:02 crc kubenswrapper[4781]: I1202 09:22:02.917640 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:02 crc kubenswrapper[4781]: I1202 09:22:02.917690 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:02 crc kubenswrapper[4781]: I1202 09:22:02.917708 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:02 crc kubenswrapper[4781]: I1202 09:22:02.917731 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:02 crc kubenswrapper[4781]: I1202 09:22:02.917750 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:02Z","lastTransitionTime":"2025-12-02T09:22:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.020888 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.020994 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.021012 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.021035 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.021057 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:03Z","lastTransitionTime":"2025-12-02T09:22:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.124383 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.125029 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.125069 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.125094 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.125110 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:03Z","lastTransitionTime":"2025-12-02T09:22:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.227557 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.227643 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.227665 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.227694 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.227714 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:03Z","lastTransitionTime":"2025-12-02T09:22:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.331406 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.331466 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.331482 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.331504 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.331522 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:03Z","lastTransitionTime":"2025-12-02T09:22:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.434377 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.434446 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.434465 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.434490 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.434508 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:03Z","lastTransitionTime":"2025-12-02T09:22:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.499684 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:22:03 crc kubenswrapper[4781]: E1202 09:22:03.500112 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.537319 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.537366 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.537382 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.537403 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.537421 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:03Z","lastTransitionTime":"2025-12-02T09:22:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.640341 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.640404 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.640422 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.640450 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.640469 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:03Z","lastTransitionTime":"2025-12-02T09:22:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.743170 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.743235 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.743259 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.743292 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.743315 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:03Z","lastTransitionTime":"2025-12-02T09:22:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.846578 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.846638 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.846660 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.846692 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.846716 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:03Z","lastTransitionTime":"2025-12-02T09:22:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.949537 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.949607 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.949627 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.949652 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:03 crc kubenswrapper[4781]: I1202 09:22:03.949671 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:03Z","lastTransitionTime":"2025-12-02T09:22:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.052179 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.052257 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.052293 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.052330 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.052352 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:04Z","lastTransitionTime":"2025-12-02T09:22:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.155486 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.155556 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.155584 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.155614 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.155638 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:04Z","lastTransitionTime":"2025-12-02T09:22:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.258303 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.258346 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.258354 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.258370 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.258382 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:04Z","lastTransitionTime":"2025-12-02T09:22:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.360700 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.360752 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.360760 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.360775 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.360784 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:04Z","lastTransitionTime":"2025-12-02T09:22:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.468648 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.468728 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.468746 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.468772 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.468789 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:04Z","lastTransitionTime":"2025-12-02T09:22:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.498660 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.498723 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.498766 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:22:04 crc kubenswrapper[4781]: E1202 09:22:04.498910 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 09:22:04 crc kubenswrapper[4781]: E1202 09:22:04.499064 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 09:22:04 crc kubenswrapper[4781]: E1202 09:22:04.499170 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q792g" podUID="bcdae8ff-3e82-4785-b958-a98717a14787" Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.571371 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.571415 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.571427 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.571440 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.571451 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:04Z","lastTransitionTime":"2025-12-02T09:22:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.673678 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.673715 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.673725 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.673741 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.673751 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:04Z","lastTransitionTime":"2025-12-02T09:22:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.776848 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.776888 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.776897 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.776912 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.776943 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:04Z","lastTransitionTime":"2025-12-02T09:22:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.879746 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.879786 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.879799 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.879815 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.879826 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:04Z","lastTransitionTime":"2025-12-02T09:22:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.982513 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.982594 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.982611 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.982634 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:04 crc kubenswrapper[4781]: I1202 09:22:04.982653 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:04Z","lastTransitionTime":"2025-12-02T09:22:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:22:05 crc kubenswrapper[4781]: I1202 09:22:05.086069 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:05 crc kubenswrapper[4781]: I1202 09:22:05.086144 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:05 crc kubenswrapper[4781]: I1202 09:22:05.086165 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:05 crc kubenswrapper[4781]: I1202 09:22:05.086189 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:05 crc kubenswrapper[4781]: I1202 09:22:05.086207 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:05Z","lastTransitionTime":"2025-12-02T09:22:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:05 crc kubenswrapper[4781]: I1202 09:22:05.189179 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:05 crc kubenswrapper[4781]: I1202 09:22:05.189252 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:05 crc kubenswrapper[4781]: I1202 09:22:05.189269 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:05 crc kubenswrapper[4781]: I1202 09:22:05.189292 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:05 crc kubenswrapper[4781]: I1202 09:22:05.189309 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:05Z","lastTransitionTime":"2025-12-02T09:22:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:05 crc kubenswrapper[4781]: I1202 09:22:05.292487 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:05 crc kubenswrapper[4781]: I1202 09:22:05.292552 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:05 crc kubenswrapper[4781]: I1202 09:22:05.292567 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:05 crc kubenswrapper[4781]: I1202 09:22:05.292588 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:05 crc kubenswrapper[4781]: I1202 09:22:05.292603 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:05Z","lastTransitionTime":"2025-12-02T09:22:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:22:05 crc kubenswrapper[4781]: I1202 09:22:05.396124 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:05 crc kubenswrapper[4781]: I1202 09:22:05.396177 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:05 crc kubenswrapper[4781]: I1202 09:22:05.396195 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:05 crc kubenswrapper[4781]: I1202 09:22:05.396218 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:05 crc kubenswrapper[4781]: I1202 09:22:05.396234 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:05Z","lastTransitionTime":"2025-12-02T09:22:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:05 crc kubenswrapper[4781]: I1202 09:22:05.498898 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:22:05 crc kubenswrapper[4781]: E1202 09:22:05.499172 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 09:22:05 crc kubenswrapper[4781]: I1202 09:22:05.499302 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:05 crc kubenswrapper[4781]: I1202 09:22:05.499342 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:05 crc kubenswrapper[4781]: I1202 09:22:05.499357 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:05 crc kubenswrapper[4781]: I1202 09:22:05.499383 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:05 crc kubenswrapper[4781]: I1202 09:22:05.499401 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:05Z","lastTransitionTime":"2025-12-02T09:22:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:22:05 crc kubenswrapper[4781]: I1202 09:22:05.602392 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:05 crc kubenswrapper[4781]: I1202 09:22:05.602452 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:05 crc kubenswrapper[4781]: I1202 09:22:05.602468 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:05 crc kubenswrapper[4781]: I1202 09:22:05.602495 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:05 crc kubenswrapper[4781]: I1202 09:22:05.602513 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:05Z","lastTransitionTime":"2025-12-02T09:22:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:05 crc kubenswrapper[4781]: I1202 09:22:05.705359 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:05 crc kubenswrapper[4781]: I1202 09:22:05.705442 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:05 crc kubenswrapper[4781]: I1202 09:22:05.705464 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:05 crc kubenswrapper[4781]: I1202 09:22:05.705495 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:05 crc kubenswrapper[4781]: I1202 09:22:05.705516 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:05Z","lastTransitionTime":"2025-12-02T09:22:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:05 crc kubenswrapper[4781]: I1202 09:22:05.808790 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:05 crc kubenswrapper[4781]: I1202 09:22:05.808863 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:05 crc kubenswrapper[4781]: I1202 09:22:05.808882 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:05 crc kubenswrapper[4781]: I1202 09:22:05.808909 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:05 crc kubenswrapper[4781]: I1202 09:22:05.808975 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:05Z","lastTransitionTime":"2025-12-02T09:22:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:22:05 crc kubenswrapper[4781]: I1202 09:22:05.912255 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:05 crc kubenswrapper[4781]: I1202 09:22:05.912317 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:05 crc kubenswrapper[4781]: I1202 09:22:05.912332 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:05 crc kubenswrapper[4781]: I1202 09:22:05.912356 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:05 crc kubenswrapper[4781]: I1202 09:22:05.912373 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:05Z","lastTransitionTime":"2025-12-02T09:22:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.016168 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.016234 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.016250 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.016276 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.016295 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:06Z","lastTransitionTime":"2025-12-02T09:22:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.119459 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.119544 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.119562 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.119586 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.119606 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:06Z","lastTransitionTime":"2025-12-02T09:22:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.223355 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.223427 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.223447 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.223472 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.223489 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:06Z","lastTransitionTime":"2025-12-02T09:22:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.325421 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.325476 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.325529 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.325552 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.325567 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:06Z","lastTransitionTime":"2025-12-02T09:22:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.429839 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.429951 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.429967 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.429990 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.430009 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:06Z","lastTransitionTime":"2025-12-02T09:22:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
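
The block above repeats because the kubelet keeps re-running its node-status sync at the roughly 100 ms cadence visible in the timestamps while the container runtime keeps reporting NetworkReady=false; each pass re-records the same four node events and republishes the identical Ready=False condition with reason KubeletNotReady. Only the embedded condition JSON carries signal, so a quick way to condense an excerpt like this is to parse and count it. A minimal sketch, assuming the journal excerpt is saved to a file (the path argument is hypothetical) and each journal entry occupies a single line:

import json
import re
import sys
from collections import Counter

# Condense the repeated "Node became not ready" entries by counting each
# distinct condition tuple. The condition value in these log lines is
# plain JSON, so it can be cut out with a regex and parsed directly.
CONDITION = re.compile(r'condition=(\{.*?\})(?=\s|$)')

counts = Counter()
with open(sys.argv[1], encoding="utf-8") as fh:
    for line in fh:
        for m in CONDITION.finditer(line):
            c = json.loads(m.group(1))
            counts[(c["type"], c["status"], c["reason"])] += 1

for key, n in counts.most_common():
    print(n, key)

Counting by (type, status, reason) collapses the heartbeat repeats into one row per distinct condition, which makes any transition (for example Ready flipping to True once CNI comes up) stand out immediately.
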
Has your network provider started?"} Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.498834 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.498905 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.499153 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:22:06 crc kubenswrapper[4781]: E1202 09:22:06.499287 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 09:22:06 crc kubenswrapper[4781]: E1202 09:22:06.499446 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q792g" podUID="bcdae8ff-3e82-4785-b958-a98717a14787" Dec 02 09:22:06 crc kubenswrapper[4781]: E1202 09:22:06.499494 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.532479 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.532527 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.532539 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.532579 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.532596 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:06Z","lastTransitionTime":"2025-12-02T09:22:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
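
Here the missing CNI configuration turns into concrete pod failures: after the kubelet restart there is no sandbox for these three pods, and because sandbox creation needs the network plugin, each sync attempt is aborted with the pod_workers.go:1301 error. Note that only pods requiring cluster networking stall; the host-network static control-plane pods keep running, which is why startup latencies are still being recorded further down. To confirm the same Ready condition from the API side rather than the journal, a sketch (assumes kubectl access to this cluster; the node name "crc" comes from the entries above):

import json
import subprocess

# Read the node's Ready condition straight from the API server. This is
# the same condition object the setters.go:603 entries above are writing.
node = json.loads(subprocess.run(
    ["kubectl", "get", "node", "crc", "-o", "json"],
    check=True, capture_output=True, text=True,
).stdout)
ready = next(c for c in node["status"]["conditions"] if c["type"] == "Ready")
print(ready["status"], ready["reason"], ready["message"], sep="\n")
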
Has your network provider started?"} Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.635483 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.635575 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.635611 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.635641 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.635660 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:06Z","lastTransitionTime":"2025-12-02T09:22:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.737989 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.738052 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.738069 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.738097 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.738117 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:06Z","lastTransitionTime":"2025-12-02T09:22:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.840543 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.840606 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.840622 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.840685 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.840702 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:06Z","lastTransitionTime":"2025-12-02T09:22:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.944239 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.944346 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.944371 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.944405 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:06 crc kubenswrapper[4781]: I1202 09:22:06.944443 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:06Z","lastTransitionTime":"2025-12-02T09:22:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.047435 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.047500 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.047523 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.047552 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.047574 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:07Z","lastTransitionTime":"2025-12-02T09:22:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.150261 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.150315 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.150333 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.150799 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.150831 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:07Z","lastTransitionTime":"2025-12-02T09:22:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.253350 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.253417 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.253433 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.253456 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.253473 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:07Z","lastTransitionTime":"2025-12-02T09:22:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.356626 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.356919 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.357125 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.357342 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.357498 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:07Z","lastTransitionTime":"2025-12-02T09:22:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.461196 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.461266 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.461288 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.461318 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.461340 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:07Z","lastTransitionTime":"2025-12-02T09:22:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.499192 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:22:07 crc kubenswrapper[4781]: E1202 09:22:07.499415 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.558587 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7k2h5" podStartSLOduration=71.558558648 podStartE2EDuration="1m11.558558648s" podCreationTimestamp="2025-12-02 09:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:22:07.528302765 +0000 UTC m=+90.352176684" watchObservedRunningTime="2025-12-02 09:22:07.558558648 +0000 UTC m=+90.382432557" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.566238 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.566307 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.566327 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.566351 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.566368 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:07Z","lastTransitionTime":"2025-12-02T09:22:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.579033 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=70.579012767 podStartE2EDuration="1m10.579012767s" podCreationTimestamp="2025-12-02 09:20:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:22:07.558680391 +0000 UTC m=+90.382554340" watchObservedRunningTime="2025-12-02 09:22:07.579012767 +0000 UTC m=+90.402886656" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.609617 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-kgbn2" podStartSLOduration=71.609594568 podStartE2EDuration="1m11.609594568s" podCreationTimestamp="2025-12-02 09:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:22:07.57871942 +0000 UTC m=+90.402593309" watchObservedRunningTime="2025-12-02 09:22:07.609594568 +0000 UTC m=+90.433468487" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.668554 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.668586 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.668596 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.668610 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.668621 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:07Z","lastTransitionTime":"2025-12-02T09:22:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.679881 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-dnkgc" podStartSLOduration=71.679865547 podStartE2EDuration="1m11.679865547s" podCreationTimestamp="2025-12-02 09:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:22:07.664907825 +0000 UTC m=+90.488781704" watchObservedRunningTime="2025-12-02 09:22:07.679865547 +0000 UTC m=+90.503739416" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.680253 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-8b6p8" podStartSLOduration=71.680246767 podStartE2EDuration="1m11.680246767s" podCreationTimestamp="2025-12-02 09:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:22:07.679792225 +0000 UTC m=+90.503666124" watchObservedRunningTime="2025-12-02 09:22:07.680246767 +0000 UTC m=+90.504120646" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.703591 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=72.703577887 podStartE2EDuration="1m12.703577887s" podCreationTimestamp="2025-12-02 09:20:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:22:07.703327561 +0000 UTC m=+90.527201440" watchObservedRunningTime="2025-12-02 09:22:07.703577887 +0000 UTC m=+90.527451766" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.770363 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.770404 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.770413 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.770426 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.770436 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:07Z","lastTransitionTime":"2025-12-02T09:22:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.781074 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podStartSLOduration=71.781057287 podStartE2EDuration="1m11.781057287s" podCreationTimestamp="2025-12-02 09:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:22:07.780778729 +0000 UTC m=+90.604652608" watchObservedRunningTime="2025-12-02 09:22:07.781057287 +0000 UTC m=+90.604931166" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.791107 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-ffk9s" podStartSLOduration=71.791087506 podStartE2EDuration="1m11.791087506s" podCreationTimestamp="2025-12-02 09:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:22:07.790632264 +0000 UTC m=+90.614506143" watchObservedRunningTime="2025-12-02 09:22:07.791087506 +0000 UTC m=+90.614961385" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.810315 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=69.810297323 podStartE2EDuration="1m9.810297323s" podCreationTimestamp="2025-12-02 09:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:22:07.809807092 +0000 UTC m=+90.633680961" watchObservedRunningTime="2025-12-02 09:22:07.810297323 +0000 UTC m=+90.634171202" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.820073 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=38.820053406 podStartE2EDuration="38.820053406s" podCreationTimestamp="2025-12-02 09:21:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:22:07.819836651 +0000 UTC m=+90.643710530" watchObservedRunningTime="2025-12-02 09:22:07.820053406 +0000 UTC m=+90.643927285" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.828446 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=5.828428305 podStartE2EDuration="5.828428305s" podCreationTimestamp="2025-12-02 09:22:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:22:07.828311252 +0000 UTC m=+90.652185131" watchObservedRunningTime="2025-12-02 09:22:07.828428305 +0000 UTC m=+90.652302184" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.873499 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.873570 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.873588 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.873619 4781 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.873637 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:07Z","lastTransitionTime":"2025-12-02T09:22:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.976101 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.976146 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.976158 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.976176 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:07 crc kubenswrapper[4781]: I1202 09:22:07.976188 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:07Z","lastTransitionTime":"2025-12-02T09:22:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:08 crc kubenswrapper[4781]: I1202 09:22:08.078163 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:08 crc kubenswrapper[4781]: I1202 09:22:08.078236 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:08 crc kubenswrapper[4781]: I1202 09:22:08.078250 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:08 crc kubenswrapper[4781]: I1202 09:22:08.078272 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:08 crc kubenswrapper[4781]: I1202 09:22:08.078285 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:08Z","lastTransitionTime":"2025-12-02T09:22:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:22:08 crc kubenswrapper[4781]: I1202 09:22:08.180396 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:08 crc kubenswrapper[4781]: I1202 09:22:08.180434 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:08 crc kubenswrapper[4781]: I1202 09:22:08.180444 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:08 crc kubenswrapper[4781]: I1202 09:22:08.180460 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:08 crc kubenswrapper[4781]: I1202 09:22:08.180469 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:08Z","lastTransitionTime":"2025-12-02T09:22:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:08 crc kubenswrapper[4781]: I1202 09:22:08.282694 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:08 crc kubenswrapper[4781]: I1202 09:22:08.283218 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:08 crc kubenswrapper[4781]: I1202 09:22:08.283332 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:08 crc kubenswrapper[4781]: I1202 09:22:08.283400 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:08 crc kubenswrapper[4781]: I1202 09:22:08.283467 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:08Z","lastTransitionTime":"2025-12-02T09:22:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:08 crc kubenswrapper[4781]: I1202 09:22:08.386544 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:08 crc kubenswrapper[4781]: I1202 09:22:08.386592 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:08 crc kubenswrapper[4781]: I1202 09:22:08.386602 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:08 crc kubenswrapper[4781]: I1202 09:22:08.386618 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:08 crc kubenswrapper[4781]: I1202 09:22:08.386630 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:08Z","lastTransitionTime":"2025-12-02T09:22:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:22:08 crc kubenswrapper[4781]: I1202 09:22:08.489351 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:08 crc kubenswrapper[4781]: I1202 09:22:08.489423 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:08 crc kubenswrapper[4781]: I1202 09:22:08.489441 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:08 crc kubenswrapper[4781]: I1202 09:22:08.489467 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:08 crc kubenswrapper[4781]: I1202 09:22:08.489489 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:08Z","lastTransitionTime":"2025-12-02T09:22:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:08 crc kubenswrapper[4781]: I1202 09:22:08.499134 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:22:08 crc kubenswrapper[4781]: I1202 09:22:08.499168 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:22:08 crc kubenswrapper[4781]: I1202 09:22:08.499153 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 09:22:08 crc kubenswrapper[4781]: E1202 09:22:08.499310 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q792g" podUID="bcdae8ff-3e82-4785-b958-a98717a14787" Dec 02 09:22:08 crc kubenswrapper[4781]: E1202 09:22:08.499476 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 09:22:08 crc kubenswrapper[4781]: E1202 09:22:08.499610 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 09:22:08 crc kubenswrapper[4781]: I1202 09:22:08.593143 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:08 crc kubenswrapper[4781]: I1202 09:22:08.593194 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:08 crc kubenswrapper[4781]: I1202 09:22:08.593205 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:08 crc kubenswrapper[4781]: I1202 09:22:08.593223 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:08 crc kubenswrapper[4781]: I1202 09:22:08.593235 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:08Z","lastTransitionTime":"2025-12-02T09:22:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:08 crc kubenswrapper[4781]: I1202 09:22:08.696426 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:08 crc kubenswrapper[4781]: I1202 09:22:08.696489 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:08 crc kubenswrapper[4781]: I1202 09:22:08.696506 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:08 crc kubenswrapper[4781]: I1202 09:22:08.696530 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:08 crc kubenswrapper[4781]: I1202 09:22:08.696548 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:08Z","lastTransitionTime":"2025-12-02T09:22:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:22:08 crc kubenswrapper[4781]: I1202 09:22:08.799070 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:08 crc kubenswrapper[4781]: I1202 09:22:08.799105 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:08 crc kubenswrapper[4781]: I1202 09:22:08.799116 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:08 crc kubenswrapper[4781]: I1202 09:22:08.799132 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:08 crc kubenswrapper[4781]: I1202 09:22:08.799144 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:08Z","lastTransitionTime":"2025-12-02T09:22:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:08 crc kubenswrapper[4781]: I1202 09:22:08.902278 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:08 crc kubenswrapper[4781]: I1202 09:22:08.902329 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:08 crc kubenswrapper[4781]: I1202 09:22:08.902343 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:08 crc kubenswrapper[4781]: I1202 09:22:08.902360 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:08 crc kubenswrapper[4781]: I1202 09:22:08.902369 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:08Z","lastTransitionTime":"2025-12-02T09:22:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.005172 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.005213 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.005225 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.005240 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.005250 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:09Z","lastTransitionTime":"2025-12-02T09:22:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.107338 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.107376 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.107386 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.107398 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.107406 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:09Z","lastTransitionTime":"2025-12-02T09:22:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.209169 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.209202 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.209211 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.209224 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.209232 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:09Z","lastTransitionTime":"2025-12-02T09:22:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.310581 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.310620 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.310630 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.310644 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.310653 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:09Z","lastTransitionTime":"2025-12-02T09:22:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.413492 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.413563 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.413582 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.413607 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.413625 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:09Z","lastTransitionTime":"2025-12-02T09:22:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.500212 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:22:09 crc kubenswrapper[4781]: E1202 09:22:09.500790 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.501013 4781 scope.go:117] "RemoveContainer" containerID="355b92d44b96773d00b0e645123bf2086d26aa043b6aef98a6c90d25c7b52e3d" Dec 02 09:22:09 crc kubenswrapper[4781]: E1202 09:22:09.503045 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-x5x7g_openshift-ovn-kubernetes(20ba2af9-1f67-4b6d-884a-666ef4f55bf3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.516484 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.516525 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.516542 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.516563 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.516580 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:09Z","lastTransitionTime":"2025-12-02T09:22:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.620542 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.620638 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.620656 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.620678 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.620695 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:09Z","lastTransitionTime":"2025-12-02T09:22:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.724515 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.724602 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.724633 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.724670 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.724692 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:09Z","lastTransitionTime":"2025-12-02T09:22:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.827394 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.827459 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.827487 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.827530 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.827555 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:09Z","lastTransitionTime":"2025-12-02T09:22:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.931177 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.931229 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.931246 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.931268 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:22:09 crc kubenswrapper[4781]: I1202 09:22:09.931284 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:09Z","lastTransitionTime":"2025-12-02T09:22:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.034250 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.034281 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.034291 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.034307 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.034315 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:10Z","lastTransitionTime":"2025-12-02T09:22:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.137487 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.137537 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.137554 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.137576 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.137592 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:10Z","lastTransitionTime":"2025-12-02T09:22:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.240202 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.240266 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.240286 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.240310 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.240327 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:10Z","lastTransitionTime":"2025-12-02T09:22:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.342331 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.342391 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.342410 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.342434 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.342452 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:10Z","lastTransitionTime":"2025-12-02T09:22:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.446428 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.446487 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.446504 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.446528 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.446547 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:10Z","lastTransitionTime":"2025-12-02T09:22:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.498703 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.498748 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g"
Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.498769 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 02 09:22:10 crc kubenswrapper[4781]: E1202 09:22:10.498836 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
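The repeating NodeNotReady condition above reduces to a single predicate: the container runtime reports NetworkReady=false until a CNI configuration file appears in /etc/kubernetes/cni/net.d/. A minimal Go sketch of that kind of directory scan (an illustration only, not the kubelet's or CRI-O's actual code; the scanned extensions are an assumption based on common CNI conventions):

    // cnicheck.go: approximate the readiness test implied by the log message
    // "no CNI configuration file in /etc/kubernetes/cni/net.d/".
    package main

    import (
    	"fmt"
    	"os"
    	"path/filepath"
    )

    // cniConfigPresent reports whether the CNI conf dir contains any
    // plausible network configuration file.
    func cniConfigPresent(dir string) bool {
    	entries, err := os.ReadDir(dir)
    	if err != nil {
    		return false // unreadable directory counts as "not ready"
    	}
    	for _, e := range entries {
    		switch filepath.Ext(e.Name()) {
    		case ".conf", ".conflist", ".json": // assumed extensions
    			return true
    		}
    	}
    	return false
    }

    func main() {
    	dir := "/etc/kubernetes/cni/net.d" // path taken from the log messages
    	if cniConfigPresent(dir) {
    		fmt.Println("NetworkReady=true")
    	} else {
    		fmt.Println("NetworkReady=false reason:NetworkPluginNotReady")
    	}
    }

Once the network provider (here OVN-Kubernetes via multus) writes its configuration into that directory, the condition flips and the repeated "Node became not ready" updates stop.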
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 09:22:10 crc kubenswrapper[4781]: E1202 09:22:10.499015 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 09:22:10 crc kubenswrapper[4781]: E1202 09:22:10.499108 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q792g" podUID="bcdae8ff-3e82-4785-b958-a98717a14787" Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.549162 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.549229 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.549248 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.549272 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.549293 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:10Z","lastTransitionTime":"2025-12-02T09:22:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.652140 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.652189 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.652203 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.652221 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.652261 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:10Z","lastTransitionTime":"2025-12-02T09:22:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.754986 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.755019 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.755027 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.755041 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.755050 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:10Z","lastTransitionTime":"2025-12-02T09:22:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.857225 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.857562 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.857670 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.857784 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.857897 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:10Z","lastTransitionTime":"2025-12-02T09:22:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.871372 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.871445 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.871461 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.871476 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.871485 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T09:22:10Z","lastTransitionTime":"2025-12-02T09:22:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.909881 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-klvcb"] Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.910218 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-klvcb" Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.912551 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.912571 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.912568 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.912739 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.998802 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4021a5e7-e8dc-4148-a58e-cc077de6c9a9-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-klvcb\" (UID: \"4021a5e7-e8dc-4148-a58e-cc077de6c9a9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-klvcb" Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.998874 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4021a5e7-e8dc-4148-a58e-cc077de6c9a9-service-ca\") pod \"cluster-version-operator-5c965bbfc6-klvcb\" (UID: \"4021a5e7-e8dc-4148-a58e-cc077de6c9a9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-klvcb" Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.998996 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4021a5e7-e8dc-4148-a58e-cc077de6c9a9-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-klvcb\" (UID: \"4021a5e7-e8dc-4148-a58e-cc077de6c9a9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-klvcb" Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.999049 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4021a5e7-e8dc-4148-a58e-cc077de6c9a9-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-klvcb\" (UID: \"4021a5e7-e8dc-4148-a58e-cc077de6c9a9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-klvcb" Dec 02 09:22:10 crc kubenswrapper[4781]: I1202 09:22:10.999116 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4021a5e7-e8dc-4148-a58e-cc077de6c9a9-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-klvcb\" (UID: \"4021a5e7-e8dc-4148-a58e-cc077de6c9a9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-klvcb" Dec 02 09:22:11 crc kubenswrapper[4781]: I1202 09:22:11.100412 4781 reconciler_common.go:218] 
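The reconciler entries here trace the kubelet's two-phase volume handling for the newly added cluster-version-operator pod: each volume in the desired state is first verified as attached (reconciler_common.go:245), then mounted (reconciler_common.go:218), with one "MountVolume.SetUp succeeded" line per volume as the operations complete below. A toy Go sketch of that ordering (illustrative only; the volume names and plugin types are taken from the log, the control flow is heavily simplified):

    // volume_phases.go: simplified echo of the verify-then-mount ordering
    // visible in the reconciler log lines. Not kubelet source code.
    package main

    import "fmt"

    type volume struct{ name, plugin string }

    func main() {
    	// Volumes of the cluster-version-operator pod, as logged above.
    	vols := []volume{
    		{"serving-cert", "kubernetes.io/secret"},
    		{"service-ca", "kubernetes.io/configmap"},
    		{"etc-cvo-updatepayloads", "kubernetes.io/host-path"},
    		{"etc-ssl-certs", "kubernetes.io/host-path"},
    		{"kube-api-access", "kubernetes.io/projected"},
    	}
    	// Phase 1: confirm every volume is attached before mounting anything.
    	for _, v := range vols {
    		fmt.Printf("VerifyControllerAttachedVolume started for volume %q (%s)\n", v.name, v.plugin)
    	}
    	// Phase 2: mount; completion order may differ from start order,
    	// just as the SetUp-succeeded lines below are not in spec order.
    	for _, v := range vols {
    		fmt.Printf("MountVolume.SetUp succeeded for volume %q\n", v.name)
    	}
    }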
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4021a5e7-e8dc-4148-a58e-cc077de6c9a9-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-klvcb\" (UID: \"4021a5e7-e8dc-4148-a58e-cc077de6c9a9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-klvcb" Dec 02 09:22:11 crc kubenswrapper[4781]: I1202 09:22:11.100450 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4021a5e7-e8dc-4148-a58e-cc077de6c9a9-service-ca\") pod \"cluster-version-operator-5c965bbfc6-klvcb\" (UID: \"4021a5e7-e8dc-4148-a58e-cc077de6c9a9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-klvcb" Dec 02 09:22:11 crc kubenswrapper[4781]: I1202 09:22:11.100508 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4021a5e7-e8dc-4148-a58e-cc077de6c9a9-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-klvcb\" (UID: \"4021a5e7-e8dc-4148-a58e-cc077de6c9a9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-klvcb" Dec 02 09:22:11 crc kubenswrapper[4781]: I1202 09:22:11.100530 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4021a5e7-e8dc-4148-a58e-cc077de6c9a9-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-klvcb\" (UID: \"4021a5e7-e8dc-4148-a58e-cc077de6c9a9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-klvcb" Dec 02 09:22:11 crc kubenswrapper[4781]: I1202 09:22:11.100563 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4021a5e7-e8dc-4148-a58e-cc077de6c9a9-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-klvcb\" (UID: \"4021a5e7-e8dc-4148-a58e-cc077de6c9a9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-klvcb" Dec 02 09:22:11 crc kubenswrapper[4781]: I1202 09:22:11.100680 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4021a5e7-e8dc-4148-a58e-cc077de6c9a9-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-klvcb\" (UID: \"4021a5e7-e8dc-4148-a58e-cc077de6c9a9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-klvcb" Dec 02 09:22:11 crc kubenswrapper[4781]: I1202 09:22:11.101264 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4021a5e7-e8dc-4148-a58e-cc077de6c9a9-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-klvcb\" (UID: \"4021a5e7-e8dc-4148-a58e-cc077de6c9a9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-klvcb" Dec 02 09:22:11 crc kubenswrapper[4781]: I1202 09:22:11.102322 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4021a5e7-e8dc-4148-a58e-cc077de6c9a9-service-ca\") pod \"cluster-version-operator-5c965bbfc6-klvcb\" (UID: \"4021a5e7-e8dc-4148-a58e-cc077de6c9a9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-klvcb" Dec 02 09:22:11 crc kubenswrapper[4781]: I1202 09:22:11.108885 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/4021a5e7-e8dc-4148-a58e-cc077de6c9a9-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-klvcb\" (UID: \"4021a5e7-e8dc-4148-a58e-cc077de6c9a9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-klvcb" Dec 02 09:22:11 crc kubenswrapper[4781]: I1202 09:22:11.125231 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4021a5e7-e8dc-4148-a58e-cc077de6c9a9-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-klvcb\" (UID: \"4021a5e7-e8dc-4148-a58e-cc077de6c9a9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-klvcb" Dec 02 09:22:11 crc kubenswrapper[4781]: I1202 09:22:11.224704 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-klvcb" Dec 02 09:22:11 crc kubenswrapper[4781]: I1202 09:22:11.499368 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:22:11 crc kubenswrapper[4781]: E1202 09:22:11.499955 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 09:22:11 crc kubenswrapper[4781]: I1202 09:22:11.965498 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-klvcb" event={"ID":"4021a5e7-e8dc-4148-a58e-cc077de6c9a9","Type":"ContainerStarted","Data":"b1f9638f44747ea007cf7585e89d44ff3552a88e292957a53143bcfeddb13cdc"} Dec 02 09:22:11 crc kubenswrapper[4781]: I1202 09:22:11.965552 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-klvcb" event={"ID":"4021a5e7-e8dc-4148-a58e-cc077de6c9a9","Type":"ContainerStarted","Data":"1ea52a580f78bf097f2fb215a8e7ede93e6ebc2efdbd746afcc45ce07c91968f"} Dec 02 09:22:12 crc kubenswrapper[4781]: I1202 09:22:12.005447 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-klvcb" podStartSLOduration=76.005416525 podStartE2EDuration="1m16.005416525s" podCreationTimestamp="2025-12-02 09:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:22:12.004135823 +0000 UTC m=+94.828009702" watchObservedRunningTime="2025-12-02 09:22:12.005416525 +0000 UTC m=+94.829290444" Dec 02 09:22:12 crc kubenswrapper[4781]: I1202 09:22:12.498794 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 09:22:12 crc kubenswrapper[4781]: E1202 09:22:12.499186 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 09:22:12 crc kubenswrapper[4781]: I1202 09:22:12.498968 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:22:12 crc kubenswrapper[4781]: E1202 09:22:12.499375 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 09:22:12 crc kubenswrapper[4781]: I1202 09:22:12.498836 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:22:12 crc kubenswrapper[4781]: E1202 09:22:12.499567 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q792g" podUID="bcdae8ff-3e82-4785-b958-a98717a14787" Dec 02 09:22:13 crc kubenswrapper[4781]: I1202 09:22:13.498550 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:22:13 crc kubenswrapper[4781]: E1202 09:22:13.498735 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 09:22:14 crc kubenswrapper[4781]: I1202 09:22:14.333626 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bcdae8ff-3e82-4785-b958-a98717a14787-metrics-certs\") pod \"network-metrics-daemon-q792g\" (UID: \"bcdae8ff-3e82-4785-b958-a98717a14787\") " pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:22:14 crc kubenswrapper[4781]: E1202 09:22:14.333792 4781 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 09:22:14 crc kubenswrapper[4781]: E1202 09:22:14.333901 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcdae8ff-3e82-4785-b958-a98717a14787-metrics-certs podName:bcdae8ff-3e82-4785-b958-a98717a14787 nodeName:}" failed. No retries permitted until 2025-12-02 09:23:18.333883204 +0000 UTC m=+161.157757083 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bcdae8ff-3e82-4785-b958-a98717a14787-metrics-certs") pod "network-metrics-daemon-q792g" (UID: "bcdae8ff-3e82-4785-b958-a98717a14787") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 09:22:14 crc kubenswrapper[4781]: I1202 09:22:14.499561 4781 util.go:30] "No sandbox for pod can be found. 
Dec 02 09:22:14 crc kubenswrapper[4781]: I1202 09:22:14.499635 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g"
Dec 02 09:22:14 crc kubenswrapper[4781]: I1202 09:22:14.499710 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 02 09:22:14 crc kubenswrapper[4781]: E1202 09:22:14.499825 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 02 09:22:14 crc kubenswrapper[4781]: E1202 09:22:14.499970 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q792g" podUID="bcdae8ff-3e82-4785-b958-a98717a14787"
Dec 02 09:22:14 crc kubenswrapper[4781]: E1202 09:22:14.500096 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 02 09:22:15 crc kubenswrapper[4781]: I1202 09:22:15.499527 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 02 09:22:15 crc kubenswrapper[4781]: E1202 09:22:15.499641 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 02 09:22:16 crc kubenswrapper[4781]: I1202 09:22:16.499267 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 02 09:22:16 crc kubenswrapper[4781]: I1202 09:22:16.499350 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 02 09:22:16 crc kubenswrapper[4781]: I1202 09:22:16.499296 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g"
Dec 02 09:22:16 crc kubenswrapper[4781]: E1202 09:22:16.499406 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 02 09:22:16 crc kubenswrapper[4781]: E1202 09:22:16.499469 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 02 09:22:16 crc kubenswrapper[4781]: E1202 09:22:16.499522 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q792g" podUID="bcdae8ff-3e82-4785-b958-a98717a14787"
Dec 02 09:22:17 crc kubenswrapper[4781]: I1202 09:22:17.499268 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 02 09:22:17 crc kubenswrapper[4781]: E1202 09:22:17.501094 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 02 09:22:18 crc kubenswrapper[4781]: I1202 09:22:18.499237 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g"
Dec 02 09:22:18 crc kubenswrapper[4781]: I1202 09:22:18.499265 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 02 09:22:18 crc kubenswrapper[4781]: E1202 09:22:18.499424 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q792g" podUID="bcdae8ff-3e82-4785-b958-a98717a14787"
Dec 02 09:22:18 crc kubenswrapper[4781]: I1202 09:22:18.499239 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 02 09:22:18 crc kubenswrapper[4781]: E1202 09:22:18.499730 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 02 09:22:18 crc kubenswrapper[4781]: E1202 09:22:18.499987 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 02 09:22:19 crc kubenswrapper[4781]: I1202 09:22:19.499118 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 02 09:22:19 crc kubenswrapper[4781]: E1202 09:22:19.499354 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 02 09:22:20 crc kubenswrapper[4781]: I1202 09:22:20.498979 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 02 09:22:20 crc kubenswrapper[4781]: I1202 09:22:20.499011 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g"
Dec 02 09:22:20 crc kubenswrapper[4781]: I1202 09:22:20.499018 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 02 09:22:20 crc kubenswrapper[4781]: E1202 09:22:20.499111 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 02 09:22:20 crc kubenswrapper[4781]: E1202 09:22:20.499230 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 02 09:22:20 crc kubenswrapper[4781]: E1202 09:22:20.499321 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q792g" podUID="bcdae8ff-3e82-4785-b958-a98717a14787"
Dec 02 09:22:21 crc kubenswrapper[4781]: I1202 09:22:21.499367 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 02 09:22:21 crc kubenswrapper[4781]: E1202 09:22:21.499596 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 02 09:22:22 crc kubenswrapper[4781]: I1202 09:22:22.499483 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g"
Dec 02 09:22:22 crc kubenswrapper[4781]: I1202 09:22:22.499545 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 02 09:22:22 crc kubenswrapper[4781]: I1202 09:22:22.499491 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 02 09:22:22 crc kubenswrapper[4781]: E1202 09:22:22.499967 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q792g" podUID="bcdae8ff-3e82-4785-b958-a98717a14787"
Dec 02 09:22:22 crc kubenswrapper[4781]: E1202 09:22:22.500028 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 02 09:22:22 crc kubenswrapper[4781]: E1202 09:22:22.500250 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 02 09:22:22 crc kubenswrapper[4781]: I1202 09:22:22.500288 4781 scope.go:117] "RemoveContainer" containerID="355b92d44b96773d00b0e645123bf2086d26aa043b6aef98a6c90d25c7b52e3d"
Dec 02 09:22:22 crc kubenswrapper[4781]: E1202 09:22:22.500780 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-x5x7g_openshift-ovn-kubernetes(20ba2af9-1f67-4b6d-884a-666ef4f55bf3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3"
Dec 02 09:22:23 crc kubenswrapper[4781]: I1202 09:22:23.498902 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 02 09:22:23 crc kubenswrapper[4781]: E1202 09:22:23.499237 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 02 09:22:24 crc kubenswrapper[4781]: I1202 09:22:24.499449 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 02 09:22:24 crc kubenswrapper[4781]: I1202 09:22:24.499516 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g"
Dec 02 09:22:24 crc kubenswrapper[4781]: I1202 09:22:24.499574 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 02 09:22:24 crc kubenswrapper[4781]: E1202 09:22:24.499697 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 02 09:22:24 crc kubenswrapper[4781]: E1202 09:22:24.499776 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q792g" podUID="bcdae8ff-3e82-4785-b958-a98717a14787"
Dec 02 09:22:24 crc kubenswrapper[4781]: E1202 09:22:24.499847 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 02 09:22:25 crc kubenswrapper[4781]: I1202 09:22:25.499756 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 02 09:22:25 crc kubenswrapper[4781]: E1202 09:22:25.499977 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 02 09:22:26 crc kubenswrapper[4781]: I1202 09:22:26.499516 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 02 09:22:26 crc kubenswrapper[4781]: I1202 09:22:26.499538 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 02 09:22:26 crc kubenswrapper[4781]: I1202 09:22:26.499552 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g"
Dec 02 09:22:26 crc kubenswrapper[4781]: E1202 09:22:26.499626 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 02 09:22:26 crc kubenswrapper[4781]: E1202 09:22:26.499775 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q792g" podUID="bcdae8ff-3e82-4785-b958-a98717a14787"
Dec 02 09:22:26 crc kubenswrapper[4781]: E1202 09:22:26.499904 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 02 09:22:27 crc kubenswrapper[4781]: I1202 09:22:27.499572 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 02 09:22:27 crc kubenswrapper[4781]: E1202 09:22:27.500805 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 02 09:22:28 crc kubenswrapper[4781]: I1202 09:22:28.499026 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 02 09:22:28 crc kubenswrapper[4781]: I1202 09:22:28.499112 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g"
Dec 02 09:22:28 crc kubenswrapper[4781]: E1202 09:22:28.499193 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 09:22:28 crc kubenswrapper[4781]: I1202 09:22:28.499117 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:22:28 crc kubenswrapper[4781]: E1202 09:22:28.499285 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q792g" podUID="bcdae8ff-3e82-4785-b958-a98717a14787" Dec 02 09:22:28 crc kubenswrapper[4781]: E1202 09:22:28.499482 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 09:22:29 crc kubenswrapper[4781]: I1202 09:22:29.499699 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:22:29 crc kubenswrapper[4781]: E1202 09:22:29.499883 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 09:22:30 crc kubenswrapper[4781]: I1202 09:22:30.027308 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8b6p8_d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650/kube-multus/1.log" Dec 02 09:22:30 crc kubenswrapper[4781]: I1202 09:22:30.027779 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8b6p8_d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650/kube-multus/0.log" Dec 02 09:22:30 crc kubenswrapper[4781]: I1202 09:22:30.027829 4781 generic.go:334] "Generic (PLEG): container finished" podID="d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650" containerID="a029cea19e689697b3e833a0bd6c745475464429e227868f4cabed7c7fe3ec71" exitCode=1 Dec 02 09:22:30 crc kubenswrapper[4781]: I1202 09:22:30.027862 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8b6p8" event={"ID":"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650","Type":"ContainerDied","Data":"a029cea19e689697b3e833a0bd6c745475464429e227868f4cabed7c7fe3ec71"} Dec 02 09:22:30 crc kubenswrapper[4781]: I1202 09:22:30.027899 4781 scope.go:117] "RemoveContainer" containerID="9bc3d340494b10e89f11eb2dfc63584a9c5660dfe367792a3d422a28dbf5a2e9" Dec 02 09:22:30 crc kubenswrapper[4781]: I1202 09:22:30.028411 4781 scope.go:117] "RemoveContainer" containerID="a029cea19e689697b3e833a0bd6c745475464429e227868f4cabed7c7fe3ec71" Dec 02 09:22:30 crc kubenswrapper[4781]: E1202 09:22:30.028687 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-8b6p8_openshift-multus(d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650)\"" pod="openshift-multus/multus-8b6p8" podUID="d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650" Dec 02 09:22:30 crc kubenswrapper[4781]: I1202 09:22:30.498630 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:22:30 crc kubenswrapper[4781]: I1202 09:22:30.498666 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:22:30 crc kubenswrapper[4781]: I1202 09:22:30.498710 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 09:22:30 crc kubenswrapper[4781]: E1202 09:22:30.498747 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q792g" podUID="bcdae8ff-3e82-4785-b958-a98717a14787" Dec 02 09:22:30 crc kubenswrapper[4781]: E1202 09:22:30.498874 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 09:22:30 crc kubenswrapper[4781]: E1202 09:22:30.499030 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 09:22:31 crc kubenswrapper[4781]: I1202 09:22:31.031916 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8b6p8_d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650/kube-multus/1.log" Dec 02 09:22:31 crc kubenswrapper[4781]: I1202 09:22:31.499164 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:22:31 crc kubenswrapper[4781]: E1202 09:22:31.499295 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 09:22:32 crc kubenswrapper[4781]: I1202 09:22:32.499419 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 09:22:32 crc kubenswrapper[4781]: I1202 09:22:32.499462 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:22:32 crc kubenswrapper[4781]: E1202 09:22:32.499676 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 09:22:32 crc kubenswrapper[4781]: I1202 09:22:32.499761 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:22:32 crc kubenswrapper[4781]: E1202 09:22:32.499979 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q792g" podUID="bcdae8ff-3e82-4785-b958-a98717a14787" Dec 02 09:22:32 crc kubenswrapper[4781]: E1202 09:22:32.500121 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 09:22:33 crc kubenswrapper[4781]: I1202 09:22:33.498894 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:22:33 crc kubenswrapper[4781]: E1202 09:22:33.499078 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 09:22:33 crc kubenswrapper[4781]: I1202 09:22:33.499910 4781 scope.go:117] "RemoveContainer" containerID="355b92d44b96773d00b0e645123bf2086d26aa043b6aef98a6c90d25c7b52e3d" Dec 02 09:22:34 crc kubenswrapper[4781]: I1202 09:22:34.043971 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5x7g_20ba2af9-1f67-4b6d-884a-666ef4f55bf3/ovnkube-controller/3.log" Dec 02 09:22:34 crc kubenswrapper[4781]: I1202 09:22:34.048201 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" event={"ID":"20ba2af9-1f67-4b6d-884a-666ef4f55bf3","Type":"ContainerStarted","Data":"6c718dfb2fa2c3cfc45793e7e2018f908f93424f65220ca8e33d960b1df998f0"} Dec 02 09:22:34 crc kubenswrapper[4781]: I1202 09:22:34.049141 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:22:34 crc kubenswrapper[4781]: I1202 09:22:34.085901 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" podStartSLOduration=98.085884756 podStartE2EDuration="1m38.085884756s" podCreationTimestamp="2025-12-02 09:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:22:34.085097986 +0000 UTC m=+116.908971885" watchObservedRunningTime="2025-12-02 09:22:34.085884756 +0000 UTC m=+116.909758645" Dec 02 09:22:34 crc kubenswrapper[4781]: I1202 09:22:34.499446 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 09:22:34 crc kubenswrapper[4781]: I1202 09:22:34.499457 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:22:34 crc kubenswrapper[4781]: E1202 09:22:34.499573 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 09:22:34 crc kubenswrapper[4781]: I1202 09:22:34.499457 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:22:34 crc kubenswrapper[4781]: E1202 09:22:34.499675 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q792g" podUID="bcdae8ff-3e82-4785-b958-a98717a14787" Dec 02 09:22:34 crc kubenswrapper[4781]: E1202 09:22:34.499726 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 09:22:34 crc kubenswrapper[4781]: I1202 09:22:34.630432 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-q792g"] Dec 02 09:22:35 crc kubenswrapper[4781]: I1202 09:22:35.052182 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:22:35 crc kubenswrapper[4781]: E1202 09:22:35.052671 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q792g" podUID="bcdae8ff-3e82-4785-b958-a98717a14787" Dec 02 09:22:35 crc kubenswrapper[4781]: I1202 09:22:35.499326 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:22:35 crc kubenswrapper[4781]: E1202 09:22:35.499468 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 09:22:36 crc kubenswrapper[4781]: I1202 09:22:36.499268 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 09:22:36 crc kubenswrapper[4781]: I1202 09:22:36.499287 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:22:36 crc kubenswrapper[4781]: I1202 09:22:36.499294 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:22:36 crc kubenswrapper[4781]: E1202 09:22:36.499439 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 09:22:36 crc kubenswrapper[4781]: E1202 09:22:36.499726 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q792g" podUID="bcdae8ff-3e82-4785-b958-a98717a14787" Dec 02 09:22:36 crc kubenswrapper[4781]: E1202 09:22:36.499812 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 09:22:37 crc kubenswrapper[4781]: E1202 09:22:37.440272 4781 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 02 09:22:37 crc kubenswrapper[4781]: I1202 09:22:37.498626 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:22:37 crc kubenswrapper[4781]: E1202 09:22:37.499980 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 09:22:37 crc kubenswrapper[4781]: E1202 09:22:37.597229 4781 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 02 09:22:38 crc kubenswrapper[4781]: I1202 09:22:38.499200 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:22:38 crc kubenswrapper[4781]: E1202 09:22:38.499377 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 09:22:38 crc kubenswrapper[4781]: I1202 09:22:38.499198 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:22:38 crc kubenswrapper[4781]: E1202 09:22:38.499474 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q792g" podUID="bcdae8ff-3e82-4785-b958-a98717a14787" Dec 02 09:22:38 crc kubenswrapper[4781]: I1202 09:22:38.499198 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 09:22:38 crc kubenswrapper[4781]: E1202 09:22:38.499533 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 09:22:39 crc kubenswrapper[4781]: I1202 09:22:39.500032 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:22:39 crc kubenswrapper[4781]: E1202 09:22:39.500155 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 09:22:40 crc kubenswrapper[4781]: I1202 09:22:40.498725 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:22:40 crc kubenswrapper[4781]: I1202 09:22:40.498768 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 09:22:40 crc kubenswrapper[4781]: E1202 09:22:40.498847 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 09:22:40 crc kubenswrapper[4781]: I1202 09:22:40.498876 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:22:40 crc kubenswrapper[4781]: E1202 09:22:40.499056 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 09:22:40 crc kubenswrapper[4781]: E1202 09:22:40.499158 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q792g" podUID="bcdae8ff-3e82-4785-b958-a98717a14787" Dec 02 09:22:41 crc kubenswrapper[4781]: I1202 09:22:41.498831 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:22:41 crc kubenswrapper[4781]: E1202 09:22:41.499031 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 09:22:42 crc kubenswrapper[4781]: I1202 09:22:42.499327 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 09:22:42 crc kubenswrapper[4781]: I1202 09:22:42.499463 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:22:42 crc kubenswrapper[4781]: E1202 09:22:42.499483 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 09:22:42 crc kubenswrapper[4781]: I1202 09:22:42.499537 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:22:42 crc kubenswrapper[4781]: E1202 09:22:42.499599 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q792g" podUID="bcdae8ff-3e82-4785-b958-a98717a14787" Dec 02 09:22:42 crc kubenswrapper[4781]: E1202 09:22:42.499685 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 09:22:42 crc kubenswrapper[4781]: E1202 09:22:42.598363 4781 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 02 09:22:43 crc kubenswrapper[4781]: I1202 09:22:43.499054 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:22:43 crc kubenswrapper[4781]: E1202 09:22:43.499200 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 09:22:44 crc kubenswrapper[4781]: I1202 09:22:44.498811 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:22:44 crc kubenswrapper[4781]: I1202 09:22:44.498823 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:22:44 crc kubenswrapper[4781]: I1202 09:22:44.498829 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 09:22:44 crc kubenswrapper[4781]: E1202 09:22:44.498956 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q792g" podUID="bcdae8ff-3e82-4785-b958-a98717a14787" Dec 02 09:22:44 crc kubenswrapper[4781]: E1202 09:22:44.499063 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 09:22:44 crc kubenswrapper[4781]: E1202 09:22:44.499099 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 09:22:44 crc kubenswrapper[4781]: I1202 09:22:44.499396 4781 scope.go:117] "RemoveContainer" containerID="a029cea19e689697b3e833a0bd6c745475464429e227868f4cabed7c7fe3ec71" Dec 02 09:22:45 crc kubenswrapper[4781]: I1202 09:22:45.081083 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8b6p8_d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650/kube-multus/1.log" Dec 02 09:22:45 crc kubenswrapper[4781]: I1202 09:22:45.081616 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8b6p8" event={"ID":"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650","Type":"ContainerStarted","Data":"dae5efab5f322eb124f4544f407cadfb40f0be369a438e63b0294e811913a024"} Dec 02 09:22:45 crc kubenswrapper[4781]: I1202 09:22:45.499577 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:22:45 crc kubenswrapper[4781]: E1202 09:22:45.499804 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 09:22:46 crc kubenswrapper[4781]: I1202 09:22:46.498792 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:22:46 crc kubenswrapper[4781]: I1202 09:22:46.498827 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 09:22:46 crc kubenswrapper[4781]: I1202 09:22:46.498866 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:22:46 crc kubenswrapper[4781]: E1202 09:22:46.498959 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 09:22:46 crc kubenswrapper[4781]: E1202 09:22:46.499024 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q792g" podUID="bcdae8ff-3e82-4785-b958-a98717a14787" Dec 02 09:22:46 crc kubenswrapper[4781]: E1202 09:22:46.499246 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 09:22:47 crc kubenswrapper[4781]: I1202 09:22:47.499595 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:22:47 crc kubenswrapper[4781]: E1202 09:22:47.502833 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 09:22:48 crc kubenswrapper[4781]: I1202 09:22:48.498915 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 09:22:48 crc kubenswrapper[4781]: I1202 09:22:48.499003 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:22:48 crc kubenswrapper[4781]: I1202 09:22:48.499065 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:22:48 crc kubenswrapper[4781]: I1202 09:22:48.501868 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 02 09:22:48 crc kubenswrapper[4781]: I1202 09:22:48.502124 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 02 09:22:48 crc kubenswrapper[4781]: I1202 09:22:48.502156 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 02 09:22:48 crc kubenswrapper[4781]: I1202 09:22:48.502287 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 02 09:22:48 crc kubenswrapper[4781]: I1202 09:22:48.502385 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 02 09:22:48 crc kubenswrapper[4781]: I1202 09:22:48.502840 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 02 09:22:49 crc kubenswrapper[4781]: I1202 09:22:49.499177 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.703602 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.759279 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xqr9t"] Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.760150 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.761724 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-krhf5"] Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.762256 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-krhf5" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.763600 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.763788 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.764045 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.764109 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.764376 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.765812 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zkswz"] Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.766348 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zkswz" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.768205 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.774104 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.774539 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.774798 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.776039 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-w2stl"] Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.776088 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.776225 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.776548 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w2stl" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.776979 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.777143 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.777340 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.777348 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.777347 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.777422 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.777496 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.777686 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.777822 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.778213 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.778656 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.778800 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.778914 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.779563 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.779669 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-9k9bd"] Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.780317 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-9k9bd" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.780754 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-xwf7r"] Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.781363 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-xwf7r" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.782580 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-68pb7"] Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.783361 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-gpq68"] Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.783866 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-r42hh"] Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.784617 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-r42hh" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.785637 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-68pb7" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.786037 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-gpq68" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.789883 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6wrt7"] Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.790280 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-gltzh"] Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.790494 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-n9cmc"] Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.790851 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-n9cmc" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.791421 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6wrt7" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.791644 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-gltzh" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.794037 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4d2fb"] Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.794642 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4d2fb" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.796647 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.796844 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.796971 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-l6hrl"] Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.797215 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.797476 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l6hrl" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.797512 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.800031 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-4sb6f"] Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.809325 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.809626 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.809694 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.811286 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.812133 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wvc5b"] Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.814985 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.815313 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.815832 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4sb6f" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.820457 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.820507 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.820520 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.820553 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.820470 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.820618 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.820717 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.820987 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-dpqgc"] Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.821118 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wvc5b" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.821357 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dpqgc" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.821455 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.821663 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.821741 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.821770 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.822140 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.822309 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.822410 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.822474 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.822670 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.822752 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.822855 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.822958 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.822963 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.823118 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.823283 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.824056 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qq8rf"] Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.824298 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.824572 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qq8rf" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.825280 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-t6t99"] Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.825787 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t6t99" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.826008 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-7n69p"] Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.827338 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7n69p" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.831305 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.831459 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.831537 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.831631 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.831754 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.831861 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.831989 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.832036 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.832179 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.832271 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.832293 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.834009 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8nxjr"] Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.834793 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8nxjr" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.835387 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-x4lq4"] Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.835499 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.835611 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.835984 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-x4lq4" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.836487 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.836786 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.837592 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.837774 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p8ljp"] Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.838247 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p8ljp" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.838862 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.839803 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.840499 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.845198 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.846040 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.847825 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.848536 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xqkf\" (UniqueName: \"kubernetes.io/projected/d9c52f13-f9c6-419e-8f69-ee91e29f4629-kube-api-access-2xqkf\") pod \"machine-api-operator-5694c8668f-n9cmc\" (UID: \"d9c52f13-f9c6-419e-8f69-ee91e29f4629\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-n9cmc" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.848626 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a635da40-fbc4-4225-a755-dcab98f66a76-config\") pod \"machine-approver-56656f9798-7n69p\" (UID: \"a635da40-fbc4-4225-a755-dcab98f66a76\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7n69p" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.848855 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tkhs\" (UniqueName: \"kubernetes.io/projected/e1b68c39-405a-4419-af7e-4d9bea0189c3-kube-api-access-6tkhs\") pod \"openshift-apiserver-operator-796bbdcf4f-zkswz\" (UID: \"e1b68c39-405a-4419-af7e-4d9bea0189c3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zkswz" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.848946 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a635da40-fbc4-4225-a755-dcab98f66a76-machine-approver-tls\") pod \"machine-approver-56656f9798-7n69p\" (UID: \"a635da40-fbc4-4225-a755-dcab98f66a76\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7n69p" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.848976 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88747619-18ec-4e3e-9a0d-1f00bc7c2038-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-qq8rf\" (UID: \"88747619-18ec-4e3e-9a0d-1f00bc7c2038\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qq8rf" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 
09:22:51.848994 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d9c52f13-f9c6-419e-8f69-ee91e29f4629-images\") pod \"machine-api-operator-5694c8668f-n9cmc\" (UID: \"d9c52f13-f9c6-419e-8f69-ee91e29f4629\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-n9cmc" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.849009 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdjht\" (UniqueName: \"kubernetes.io/projected/88747619-18ec-4e3e-9a0d-1f00bc7c2038-kube-api-access-zdjht\") pod \"openshift-controller-manager-operator-756b6f6bc6-qq8rf\" (UID: \"88747619-18ec-4e3e-9a0d-1f00bc7c2038\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qq8rf" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.849032 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1b68c39-405a-4419-af7e-4d9bea0189c3-config\") pod \"openshift-apiserver-operator-796bbdcf4f-zkswz\" (UID: \"e1b68c39-405a-4419-af7e-4d9bea0189c3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zkswz" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.849052 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9c52f13-f9c6-419e-8f69-ee91e29f4629-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-n9cmc\" (UID: \"d9c52f13-f9c6-419e-8f69-ee91e29f4629\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-n9cmc" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.849073 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d88ea951-0553-4510-b1cd-bd6196d1f973-service-ca-bundle\") pod \"authentication-operator-69f744f599-xwf7r\" (UID: \"d88ea951-0553-4510-b1cd-bd6196d1f973\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xwf7r" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.849826 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e1b68c39-405a-4419-af7e-4d9bea0189c3-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-zkswz\" (UID: \"e1b68c39-405a-4419-af7e-4d9bea0189c3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zkswz" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.849894 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d88ea951-0553-4510-b1cd-bd6196d1f973-serving-cert\") pod \"authentication-operator-69f744f599-xwf7r\" (UID: \"d88ea951-0553-4510-b1cd-bd6196d1f973\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xwf7r" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.849914 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88747619-18ec-4e3e-9a0d-1f00bc7c2038-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-qq8rf\" (UID: \"88747619-18ec-4e3e-9a0d-1f00bc7c2038\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qq8rf" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.849963 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc9vq\" (UniqueName: \"kubernetes.io/projected/d88ea951-0553-4510-b1cd-bd6196d1f973-kube-api-access-cc9vq\") pod \"authentication-operator-69f744f599-xwf7r\" (UID: \"d88ea951-0553-4510-b1cd-bd6196d1f973\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xwf7r" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.849994 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a635da40-fbc4-4225-a755-dcab98f66a76-auth-proxy-config\") pod \"machine-approver-56656f9798-7n69p\" (UID: \"a635da40-fbc4-4225-a755-dcab98f66a76\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7n69p" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.850048 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d88ea951-0553-4510-b1cd-bd6196d1f973-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-xwf7r\" (UID: \"d88ea951-0553-4510-b1cd-bd6196d1f973\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xwf7r" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.850068 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9c52f13-f9c6-419e-8f69-ee91e29f4629-config\") pod \"machine-api-operator-5694c8668f-n9cmc\" (UID: \"d9c52f13-f9c6-419e-8f69-ee91e29f4629\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-n9cmc" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.850091 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d88ea951-0553-4510-b1cd-bd6196d1f973-config\") pod \"authentication-operator-69f744f599-xwf7r\" (UID: \"d88ea951-0553-4510-b1cd-bd6196d1f973\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xwf7r" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.853598 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-cfwrp"] Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.858524 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.860142 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.864384 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-tr4x7"] Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.865069 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-tr4x7" Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.865430 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cfwrp"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.868969 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.871734 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.871757 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.872315 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.872446 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.872643 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.872822 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.872837 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.873290 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.873352 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.873396 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.873495 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.873599 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.873746 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.892491 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.892551 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.893915 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.894150 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.895249 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.896782 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-zmcr4"]
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.897467 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-zmcr4"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.897476 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.904887 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nm2f4"]
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.905465 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nm2f4"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.907047 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411115-gzjfg"]
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.907370 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411115-gzjfg"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.918455 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8k8z6"]
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.918906 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lbdsz"]
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.919355 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lbdsz"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.919558 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8k8z6"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.924723 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.925086 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.925184 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.926136 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-jnbdf"]
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.926652 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jnbdf"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.934294 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wjx4q"]
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.938242 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-s95hx"]
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.939369 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.940531 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wjx4q"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.943389 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-8ng22"]
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.943970 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-8ng22"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.944240 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-s95hx"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.948259 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lt8cm"]
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.948756 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.948820 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lt8cm"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.950572 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f87709bf-590d-4318-abdb-7ecb8a86e303-trusted-ca-bundle\") pod \"console-f9d7485db-krhf5\" (UID: \"f87709bf-590d-4318-abdb-7ecb8a86e303\") " pod="openshift-console/console-f9d7485db-krhf5"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.950599 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g2wz\" (UniqueName: \"kubernetes.io/projected/2be1529f-3b01-4174-b029-7312871f5b97-kube-api-access-2g2wz\") pod \"route-controller-manager-6576b87f9c-w2stl\" (UID: \"2be1529f-3b01-4174-b029-7312871f5b97\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w2stl"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.950617 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be701e11-18c0-4541-9685-48bd65c661f2-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-p8ljp\" (UID: \"be701e11-18c0-4541-9685-48bd65c661f2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p8ljp"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.950633 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04a6248c-9bb7-4204-a19a-1041d4d06f3e-serving-cert\") pod \"apiserver-76f77b778f-9k9bd\" (UID: \"04a6248c-9bb7-4204-a19a-1041d4d06f3e\") " pod="openshift-apiserver/apiserver-76f77b778f-9k9bd"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.950647 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnxqs\" (UniqueName: \"kubernetes.io/projected/11d24c84-34b9-46a0-9d24-65ca291b4ac6-kube-api-access-tnxqs\") pod \"apiserver-7bbb656c7d-cfwrp\" (UID: \"11d24c84-34b9-46a0-9d24-65ca291b4ac6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cfwrp"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.950672 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-xqr9t\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.950693 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88747619-18ec-4e3e-9a0d-1f00bc7c2038-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-qq8rf\" (UID: \"88747619-18ec-4e3e-9a0d-1f00bc7c2038\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qq8rf"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.950709 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-xqr9t\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.950725 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2be1529f-3b01-4174-b029-7312871f5b97-client-ca\") pod \"route-controller-manager-6576b87f9c-w2stl\" (UID: \"2be1529f-3b01-4174-b029-7312871f5b97\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w2stl"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.950739 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04a6248c-9bb7-4204-a19a-1041d4d06f3e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-9k9bd\" (UID: \"04a6248c-9bb7-4204-a19a-1041d4d06f3e\") " pod="openshift-apiserver/apiserver-76f77b778f-9k9bd"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.950753 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-xqr9t\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.950769 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdjht\" (UniqueName: \"kubernetes.io/projected/88747619-18ec-4e3e-9a0d-1f00bc7c2038-kube-api-access-zdjht\") pod \"openshift-controller-manager-operator-756b6f6bc6-qq8rf\" (UID: \"88747619-18ec-4e3e-9a0d-1f00bc7c2038\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qq8rf"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.950785 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d9c52f13-f9c6-419e-8f69-ee91e29f4629-images\") pod \"machine-api-operator-5694c8668f-n9cmc\" (UID: \"d9c52f13-f9c6-419e-8f69-ee91e29f4629\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-n9cmc"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.950808 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-xqr9t\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.950823 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/71324519-4199-4389-87b9-705916c2c5c4-images\") pod \"machine-config-operator-74547568cd-t6t99\" (UID: \"71324519-4199-4389-87b9-705916c2c5c4\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t6t99"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.950837 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11d24c84-34b9-46a0-9d24-65ca291b4ac6-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-cfwrp\" (UID: \"11d24c84-34b9-46a0-9d24-65ca291b4ac6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cfwrp"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.950857 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1b68c39-405a-4419-af7e-4d9bea0189c3-config\") pod \"openshift-apiserver-operator-796bbdcf4f-zkswz\" (UID: \"e1b68c39-405a-4419-af7e-4d9bea0189c3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zkswz"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.950871 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f87709bf-590d-4318-abdb-7ecb8a86e303-console-config\") pod \"console-f9d7485db-krhf5\" (UID: \"f87709bf-590d-4318-abdb-7ecb8a86e303\") " pod="openshift-console/console-f9d7485db-krhf5"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.950886 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/04a6248c-9bb7-4204-a19a-1041d4d06f3e-node-pullsecrets\") pod \"apiserver-76f77b778f-9k9bd\" (UID: \"04a6248c-9bb7-4204-a19a-1041d4d06f3e\") " pod="openshift-apiserver/apiserver-76f77b778f-9k9bd"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.950901 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4297bd58-caf6-4962-b775-7f454787fa91-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-6wrt7\" (UID: \"4297bd58-caf6-4962-b775-7f454787fa91\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6wrt7"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.950915 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-xqr9t\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.950933 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-xqr9t\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.950966 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9c52f13-f9c6-419e-8f69-ee91e29f4629-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-n9cmc\" (UID: \"d9c52f13-f9c6-419e-8f69-ee91e29f4629\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-n9cmc"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.950998 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw9dr\" (UniqueName: \"kubernetes.io/projected/8a173fa2-3cb7-4f70-a04f-50a3131cb1ca-kube-api-access-rw9dr\") pod \"cluster-samples-operator-665b6dd947-4d2fb\" (UID: \"8a173fa2-3cb7-4f70-a04f-50a3131cb1ca\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4d2fb"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.951014 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d88ea951-0553-4510-b1cd-bd6196d1f973-service-ca-bundle\") pod \"authentication-operator-69f744f599-xwf7r\" (UID: \"d88ea951-0553-4510-b1cd-bd6196d1f973\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xwf7r"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.951031 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e1b68c39-405a-4419-af7e-4d9bea0189c3-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-zkswz\" (UID: \"e1b68c39-405a-4419-af7e-4d9bea0189c3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zkswz"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.951045 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/04a6248c-9bb7-4204-a19a-1041d4d06f3e-etcd-client\") pod \"apiserver-76f77b778f-9k9bd\" (UID: \"04a6248c-9bb7-4204-a19a-1041d4d06f3e\") " pod="openshift-apiserver/apiserver-76f77b778f-9k9bd"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.951061 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/04a6248c-9bb7-4204-a19a-1041d4d06f3e-audit-dir\") pod \"apiserver-76f77b778f-9k9bd\" (UID: \"04a6248c-9bb7-4204-a19a-1041d4d06f3e\") " pod="openshift-apiserver/apiserver-76f77b778f-9k9bd"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.951077 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/11d24c84-34b9-46a0-9d24-65ca291b4ac6-audit-policies\") pod \"apiserver-7bbb656c7d-cfwrp\" (UID: \"11d24c84-34b9-46a0-9d24-65ca291b4ac6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cfwrp"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.951094 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z7p7\" (UniqueName: \"kubernetes.io/projected/51a8404c-874a-43e4-917a-604110280bd7-kube-api-access-6z7p7\") pod \"cluster-image-registry-operator-dc59b4c8b-68pb7\" (UID: \"51a8404c-874a-43e4-917a-604110280bd7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-68pb7"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.951112 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/51a8404c-874a-43e4-917a-604110280bd7-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-68pb7\" (UID: \"51a8404c-874a-43e4-917a-604110280bd7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-68pb7"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.951127 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cef8c4dd-6e0f-44c9-8926-72b0d821f823-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-x4lq4\" (UID: \"cef8c4dd-6e0f-44c9-8926-72b0d821f823\") " pod="openshift-marketplace/marketplace-operator-79b997595-x4lq4"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.951141 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4297bd58-caf6-4962-b775-7f454787fa91-client-ca\") pod \"controller-manager-879f6c89f-6wrt7\" (UID: \"4297bd58-caf6-4962-b775-7f454787fa91\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6wrt7"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.951155 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11d24c84-34b9-46a0-9d24-65ca291b4ac6-serving-cert\") pod \"apiserver-7bbb656c7d-cfwrp\" (UID: \"11d24c84-34b9-46a0-9d24-65ca291b4ac6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cfwrp"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.951170 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9509647-5e74-4452-a998-f0f699160a70-serving-cert\") pod \"openshift-config-operator-7777fb866f-4sb6f\" (UID: \"b9509647-5e74-4452-a998-f0f699160a70\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4sb6f"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.951184 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/04a6248c-9bb7-4204-a19a-1041d4d06f3e-etcd-serving-ca\") pod \"apiserver-76f77b778f-9k9bd\" (UID: \"04a6248c-9bb7-4204-a19a-1041d4d06f3e\") " pod="openshift-apiserver/apiserver-76f77b778f-9k9bd"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.951200 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/04a6248c-9bb7-4204-a19a-1041d4d06f3e-encryption-config\") pod \"apiserver-76f77b778f-9k9bd\" (UID: \"04a6248c-9bb7-4204-a19a-1041d4d06f3e\") " pod="openshift-apiserver/apiserver-76f77b778f-9k9bd"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.951213 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-audit-dir\") pod \"oauth-openshift-558db77b4-xqr9t\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.951228 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/caa8272b-7968-4b09-b3a7-4aa6d4f12f2b-proxy-tls\") pod \"machine-config-controller-84d6567774-l6hrl\" (UID: \"caa8272b-7968-4b09-b3a7-4aa6d4f12f2b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l6hrl"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.951251 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d88ea951-0553-4510-b1cd-bd6196d1f973-serving-cert\") pod \"authentication-operator-69f744f599-xwf7r\" (UID: \"d88ea951-0553-4510-b1cd-bd6196d1f973\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xwf7r"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.951266 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88747619-18ec-4e3e-9a0d-1f00bc7c2038-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-qq8rf\" (UID: \"88747619-18ec-4e3e-9a0d-1f00bc7c2038\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qq8rf"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.951281 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/51a8404c-874a-43e4-917a-604110280bd7-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-68pb7\" (UID: \"51a8404c-874a-43e4-917a-604110280bd7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-68pb7"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.951296 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvp7g\" (UniqueName: \"kubernetes.io/projected/b9509647-5e74-4452-a998-f0f699160a70-kube-api-access-nvp7g\") pod \"openshift-config-operator-7777fb866f-4sb6f\" (UID: \"b9509647-5e74-4452-a998-f0f699160a70\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4sb6f"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.951314 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xjvd\" (UniqueName: \"kubernetes.io/projected/ee42da42-30b7-40ef-b00c-4d61e25502e0-kube-api-access-4xjvd\") pod \"dns-operator-744455d44c-r42hh\" (UID: \"ee42da42-30b7-40ef-b00c-4d61e25502e0\") " pod="openshift-dns-operator/dns-operator-744455d44c-r42hh"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.951330 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-xqr9t\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.951346 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/11d24c84-34b9-46a0-9d24-65ca291b4ac6-etcd-client\") pod \"apiserver-7bbb656c7d-cfwrp\" (UID: \"11d24c84-34b9-46a0-9d24-65ca291b4ac6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cfwrp"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.951362 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83b6450c-e497-4d31-9c9a-e36fbe675a2e-config\") pod \"service-ca-operator-777779d784-dpqgc\" (UID: \"83b6450c-e497-4d31-9c9a-e36fbe675a2e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dpqgc"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.951376 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-xqr9t\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.951392 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cc9vq\" (UniqueName: \"kubernetes.io/projected/d88ea951-0553-4510-b1cd-bd6196d1f973-kube-api-access-cc9vq\") pod \"authentication-operator-69f744f599-xwf7r\" (UID: \"d88ea951-0553-4510-b1cd-bd6196d1f973\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xwf7r"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.951407 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq25h\" (UniqueName: \"kubernetes.io/projected/83b6450c-e497-4d31-9c9a-e36fbe675a2e-kube-api-access-tq25h\") pod \"service-ca-operator-777779d784-dpqgc\" (UID: \"83b6450c-e497-4d31-9c9a-e36fbe675a2e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dpqgc"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.951421 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2be1529f-3b01-4174-b029-7312871f5b97-config\") pod \"route-controller-manager-6576b87f9c-w2stl\" (UID: \"2be1529f-3b01-4174-b029-7312871f5b97\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w2stl"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.951436 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4297bd58-caf6-4962-b775-7f454787fa91-serving-cert\") pod \"controller-manager-879f6c89f-6wrt7\" (UID: \"4297bd58-caf6-4962-b775-7f454787fa91\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6wrt7"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.951476 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-xqr9t\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.951491 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/caa8272b-7968-4b09-b3a7-4aa6d4f12f2b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-l6hrl\" (UID: \"caa8272b-7968-4b09-b3a7-4aa6d4f12f2b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l6hrl"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.951509 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f87709bf-590d-4318-abdb-7ecb8a86e303-console-oauth-config\") pod \"console-f9d7485db-krhf5\" (UID: \"f87709bf-590d-4318-abdb-7ecb8a86e303\") " pod="openshift-console/console-f9d7485db-krhf5"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.951531 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8a173fa2-3cb7-4f70-a04f-50a3131cb1ca-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-4d2fb\" (UID: \"8a173fa2-3cb7-4f70-a04f-50a3131cb1ca\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4d2fb"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.951574 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22pgh\" (UniqueName: \"kubernetes.io/projected/cef8c4dd-6e0f-44c9-8926-72b0d821f823-kube-api-access-22pgh\") pod \"marketplace-operator-79b997595-x4lq4\" (UID: \"cef8c4dd-6e0f-44c9-8926-72b0d821f823\") " pod="openshift-marketplace/marketplace-operator-79b997595-x4lq4"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.952192 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99jq5\" (UniqueName: \"kubernetes.io/projected/36103c10-f353-4b13-8c7a-eeee5e3b4f44-kube-api-access-99jq5\") pod \"console-operator-58897d9998-gltzh\" (UID: \"36103c10-f353-4b13-8c7a-eeee5e3b4f44\") " pod="openshift-console-operator/console-operator-58897d9998-gltzh"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.952236 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a635da40-fbc4-4225-a755-dcab98f66a76-auth-proxy-config\") pod \"machine-approver-56656f9798-7n69p\" (UID: \"a635da40-fbc4-4225-a755-dcab98f66a76\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7n69p"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.952289 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b9509647-5e74-4452-a998-f0f699160a70-available-featuregates\") pod \"openshift-config-operator-7777fb866f-4sb6f\" (UID: \"b9509647-5e74-4452-a998-f0f699160a70\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4sb6f"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.952317 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f87709bf-590d-4318-abdb-7ecb8a86e303-console-serving-cert\") pod \"console-f9d7485db-krhf5\" (UID: \"f87709bf-590d-4318-abdb-7ecb8a86e303\") " pod="openshift-console/console-f9d7485db-krhf5"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.952363 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f87709bf-590d-4318-abdb-7ecb8a86e303-service-ca\") pod \"console-f9d7485db-krhf5\" (UID: \"f87709bf-590d-4318-abdb-7ecb8a86e303\") " pod="openshift-console/console-f9d7485db-krhf5"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.952383 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ee42da42-30b7-40ef-b00c-4d61e25502e0-metrics-tls\") pod \"dns-operator-744455d44c-r42hh\" (UID: \"ee42da42-30b7-40ef-b00c-4d61e25502e0\") " pod="openshift-dns-operator/dns-operator-744455d44c-r42hh"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.952399 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be701e11-18c0-4541-9685-48bd65c661f2-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-p8ljp\" (UID: \"be701e11-18c0-4541-9685-48bd65c661f2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p8ljp"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.952417 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-xqr9t\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.952435 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qntp6\" (UniqueName: \"kubernetes.io/projected/caa8272b-7968-4b09-b3a7-4aa6d4f12f2b-kube-api-access-qntp6\") pod \"machine-config-controller-84d6567774-l6hrl\" (UID: \"caa8272b-7968-4b09-b3a7-4aa6d4f12f2b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l6hrl"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.952483 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25hkf\" (UniqueName: \"kubernetes.io/projected/b43f11b4-7580-463e-9543-2d38de346fe8-kube-api-access-25hkf\") pod \"downloads-7954f5f757-gpq68\" (UID: \"b43f11b4-7580-463e-9543-2d38de346fe8\") " pod="openshift-console/downloads-7954f5f757-gpq68"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.952506 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7b47\" (UniqueName: \"kubernetes.io/projected/a635da40-fbc4-4225-a755-dcab98f66a76-kube-api-access-m7b47\") pod \"machine-approver-56656f9798-7n69p\" (UID: \"a635da40-fbc4-4225-a755-dcab98f66a76\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7n69p"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.952522 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/51a8404c-874a-43e4-917a-604110280bd7-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-68pb7\" (UID: \"51a8404c-874a-43e4-917a-604110280bd7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-68pb7"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.952537 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvnlc\" (UniqueName: \"kubernetes.io/projected/f87709bf-590d-4318-abdb-7ecb8a86e303-kube-api-access-hvnlc\") pod \"console-f9d7485db-krhf5\" (UID: \"f87709bf-590d-4318-abdb-7ecb8a86e303\") " pod="openshift-console/console-f9d7485db-krhf5"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.952552 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn2p7\" (UniqueName: \"kubernetes.io/projected/be701e11-18c0-4541-9685-48bd65c661f2-kube-api-access-xn2p7\") pod \"kube-storage-version-migrator-operator-b67b599dd-p8ljp\" (UID: \"be701e11-18c0-4541-9685-48bd65c661f2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p8ljp"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.952583 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1b68c39-405a-4419-af7e-4d9bea0189c3-config\") pod \"openshift-apiserver-operator-796bbdcf4f-zkswz\" (UID: \"e1b68c39-405a-4419-af7e-4d9bea0189c3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zkswz"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.952598 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d88ea951-0553-4510-b1cd-bd6196d1f973-service-ca-bundle\") pod \"authentication-operator-69f744f599-xwf7r\" (UID: \"d88ea951-0553-4510-b1cd-bd6196d1f973\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xwf7r"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.952688 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04a6248c-9bb7-4204-a19a-1041d4d06f3e-config\") pod \"apiserver-76f77b778f-9k9bd\" (UID: \"04a6248c-9bb7-4204-a19a-1041d4d06f3e\") " pod="openshift-apiserver/apiserver-76f77b778f-9k9bd"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.952719 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4297bd58-caf6-4962-b775-7f454787fa91-config\") pod \"controller-manager-879f6c89f-6wrt7\" (UID: \"4297bd58-caf6-4962-b775-7f454787fa91\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6wrt7"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.952753 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d88ea951-0553-4510-b1cd-bd6196d1f973-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-xwf7r\" (UID: \"d88ea951-0553-4510-b1cd-bd6196d1f973\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xwf7r"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.952780 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w99w4\" (UniqueName: \"kubernetes.io/projected/04a6248c-9bb7-4204-a19a-1041d4d06f3e-kube-api-access-w99w4\") pod \"apiserver-76f77b778f-9k9bd\" (UID: \"04a6248c-9bb7-4204-a19a-1041d4d06f3e\") " pod="openshift-apiserver/apiserver-76f77b778f-9k9bd"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.952795 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d9c52f13-f9c6-419e-8f69-ee91e29f4629-images\") pod \"machine-api-operator-5694c8668f-n9cmc\" (UID: \"d9c52f13-f9c6-419e-8f69-ee91e29f4629\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-n9cmc"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.952877 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnw8t\" (UniqueName: \"kubernetes.io/projected/4297bd58-caf6-4962-b775-7f454787fa91-kube-api-access-xnw8t\") pod \"controller-manager-879f6c89f-6wrt7\" (UID: \"4297bd58-caf6-4962-b775-7f454787fa91\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6wrt7"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.952906 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km99d\" (UniqueName: \"kubernetes.io/projected/69ddef2d-4afd-4668-b95f-29137b133855-kube-api-access-km99d\") pod \"migrator-59844c95c7-wvc5b\" (UID: \"69ddef2d-4afd-4668-b95f-29137b133855\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wvc5b"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.952951 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/11d24c84-34b9-46a0-9d24-65ca291b4ac6-encryption-config\") pod \"apiserver-7bbb656c7d-cfwrp\" (UID: \"11d24c84-34b9-46a0-9d24-65ca291b4ac6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cfwrp"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.953008 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9c52f13-f9c6-419e-8f69-ee91e29f4629-config\") pod \"machine-api-operator-5694c8668f-n9cmc\" (UID: \"d9c52f13-f9c6-419e-8f69-ee91e29f4629\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-n9cmc"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.953307 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a635da40-fbc4-4225-a755-dcab98f66a76-auth-proxy-config\") pod \"machine-approver-56656f9798-7n69p\" (UID: \"a635da40-fbc4-4225-a755-dcab98f66a76\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7n69p"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.953373 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cef8c4dd-6e0f-44c9-8926-72b0d821f823-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-x4lq4\" (UID: \"cef8c4dd-6e0f-44c9-8926-72b0d821f823\") " pod="openshift-marketplace/marketplace-operator-79b997595-x4lq4"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.953395 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2be1529f-3b01-4174-b029-7312871f5b97-serving-cert\") pod \"route-controller-manager-6576b87f9c-w2stl\" (UID: \"2be1529f-3b01-4174-b029-7312871f5b97\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w2stl"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.953456 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d88ea951-0553-4510-b1cd-bd6196d1f973-config\") pod \"authentication-operator-69f744f599-xwf7r\" (UID: \"d88ea951-0553-4510-b1cd-bd6196d1f973\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xwf7r"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.953504 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f87709bf-590d-4318-abdb-7ecb8a86e303-oauth-serving-cert\") pod \"console-f9d7485db-krhf5\" (UID: \"f87709bf-590d-4318-abdb-7ecb8a86e303\") " pod="openshift-console/console-f9d7485db-krhf5"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.953528 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c0570ff6-6102-40ae-a68a-b35b77756097-profile-collector-cert\") pod \"olm-operator-6b444d44fb-8nxjr\" (UID: \"c0570ff6-6102-40ae-a68a-b35b77756097\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8nxjr"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.953554 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-xqr9t\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.953571 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36103c10-f353-4b13-8c7a-eeee5e3b4f44-config\") pod \"console-operator-58897d9998-gltzh\" (UID: \"36103c10-f353-4b13-8c7a-eeee5e3b4f44\") " pod="openshift-console-operator/console-operator-58897d9998-gltzh"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.953571 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9c52f13-f9c6-419e-8f69-ee91e29f4629-config\") pod \"machine-api-operator-5694c8668f-n9cmc\" (UID: \"d9c52f13-f9c6-419e-8f69-ee91e29f4629\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-n9cmc"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.953692 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c0570ff6-6102-40ae-a68a-b35b77756097-srv-cert\") pod \"olm-operator-6b444d44fb-8nxjr\" (UID: \"c0570ff6-6102-40ae-a68a-b35b77756097\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8nxjr"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.953710 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/04a6248c-9bb7-4204-a19a-1041d4d06f3e-audit\") pod \"apiserver-76f77b778f-9k9bd\" (UID: \"04a6248c-9bb7-4204-a19a-1041d4d06f3e\") " pod="openshift-apiserver/apiserver-76f77b778f-9k9bd"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.953727 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/71324519-4199-4389-87b9-705916c2c5c4-auth-proxy-config\") pod \"machine-config-operator-74547568cd-t6t99\" (UID: \"71324519-4199-4389-87b9-705916c2c5c4\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t6t99"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.953775 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36103c10-f353-4b13-8c7a-eeee5e3b4f44-serving-cert\") pod \"console-operator-58897d9998-gltzh\" (UID: \"36103c10-f353-4b13-8c7a-eeee5e3b4f44\") " pod="openshift-console-operator/console-operator-58897d9998-gltzh"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.953791 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/36103c10-f353-4b13-8c7a-eeee5e3b4f44-trusted-ca\") pod \"console-operator-58897d9998-gltzh\" (UID: \"36103c10-f353-4b13-8c7a-eeee5e3b4f44\") " pod="openshift-console-operator/console-operator-58897d9998-gltzh"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.953813 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xqkf\" (UniqueName: \"kubernetes.io/projected/d9c52f13-f9c6-419e-8f69-ee91e29f4629-kube-api-access-2xqkf\") pod \"machine-api-operator-5694c8668f-n9cmc\" (UID: \"d9c52f13-f9c6-419e-8f69-ee91e29f4629\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-n9cmc"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.953845 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a635da40-fbc4-4225-a755-dcab98f66a76-config\") pod \"machine-approver-56656f9798-7n69p\" (UID: \"a635da40-fbc4-4225-a755-dcab98f66a76\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7n69p"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.953890 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83b6450c-e497-4d31-9c9a-e36fbe675a2e-serving-cert\") pod \"service-ca-operator-777779d784-dpqgc\" (UID: \"83b6450c-e497-4d31-9c9a-e36fbe675a2e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dpqgc"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.953929 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvn2c\" (UniqueName: \"kubernetes.io/projected/c0570ff6-6102-40ae-a68a-b35b77756097-kube-api-access-zvn2c\") pod \"olm-operator-6b444d44fb-8nxjr\" (UID: \"c0570ff6-6102-40ae-a68a-b35b77756097\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8nxjr"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.953945 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-audit-policies\") pod \"oauth-openshift-558db77b4-xqr9t\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.953962 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dvc9\" (UniqueName: \"kubernetes.io/projected/71324519-4199-4389-87b9-705916c2c5c4-kube-api-access-7dvc9\") pod \"machine-config-operator-74547568cd-t6t99\" (UID: \"71324519-4199-4389-87b9-705916c2c5c4\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t6t99"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.953998 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tkhs\" (UniqueName: \"kubernetes.io/projected/e1b68c39-405a-4419-af7e-4d9bea0189c3-kube-api-access-6tkhs\") pod \"openshift-apiserver-operator-796bbdcf4f-zkswz\" (UID: \"e1b68c39-405a-4419-af7e-4d9bea0189c3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zkswz"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.954022 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a635da40-fbc4-4225-a755-dcab98f66a76-machine-approver-tls\") pod \"machine-approver-56656f9798-7n69p\" (UID: \"a635da40-fbc4-4225-a755-dcab98f66a76\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7n69p"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.954039 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/04a6248c-9bb7-4204-a19a-1041d4d06f3e-image-import-ca\") pod \"apiserver-76f77b778f-9k9bd\" (UID: \"04a6248c-9bb7-4204-a19a-1041d4d06f3e\") " pod="openshift-apiserver/apiserver-76f77b778f-9k9bd"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.954072 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djmn4\" (UniqueName: \"kubernetes.io/projected/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-kube-api-access-djmn4\") pod \"oauth-openshift-558db77b4-xqr9t\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.954091 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/71324519-4199-4389-87b9-705916c2c5c4-proxy-tls\") pod \"machine-config-operator-74547568cd-t6t99\" (UID: \"71324519-4199-4389-87b9-705916c2c5c4\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t6t99"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.954169 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/11d24c84-34b9-46a0-9d24-65ca291b4ac6-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-cfwrp\" (UID: \"11d24c84-34b9-46a0-9d24-65ca291b4ac6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cfwrp"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.954198 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/11d24c84-34b9-46a0-9d24-65ca291b4ac6-audit-dir\") pod \"apiserver-7bbb656c7d-cfwrp\" (UID: \"11d24c84-34b9-46a0-9d24-65ca291b4ac6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cfwrp"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.954245 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d88ea951-0553-4510-b1cd-bd6196d1f973-config\") pod \"authentication-operator-69f744f599-xwf7r\" (UID: \"d88ea951-0553-4510-b1cd-bd6196d1f973\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xwf7r"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.955116 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-sjskm"]
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.955784 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-sjskm"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.956192 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88747619-18ec-4e3e-9a0d-1f00bc7c2038-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-qq8rf\" (UID: \"88747619-18ec-4e3e-9a0d-1f00bc7c2038\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qq8rf"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.958357 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d88ea951-0553-4510-b1cd-bd6196d1f973-serving-cert\") pod \"authentication-operator-69f744f599-xwf7r\" (UID: \"d88ea951-0553-4510-b1cd-bd6196d1f973\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xwf7r"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.958439 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9c52f13-f9c6-419e-8f69-ee91e29f4629-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-n9cmc\" (UID: \"d9c52f13-f9c6-419e-8f69-ee91e29f4629\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-n9cmc"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.959007 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-n6m8q"]
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.959589 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-n6m8q"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.960221 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zlpfg"]
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.960713 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zlpfg"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.963338 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e1b68c39-405a-4419-af7e-4d9bea0189c3-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-zkswz\" (UID: \"e1b68c39-405a-4419-af7e-4d9bea0189c3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zkswz"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.963378 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88747619-18ec-4e3e-9a0d-1f00bc7c2038-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-qq8rf\" (UID: \"88747619-18ec-4e3e-9a0d-1f00bc7c2038\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qq8rf"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.963911 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d88ea951-0553-4510-b1cd-bd6196d1f973-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-xwf7r\" (UID: \"d88ea951-0553-4510-b1cd-bd6196d1f973\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xwf7r"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.963979 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-6c7fp"]
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.964923 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xqr9t"]
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.965004 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-6c7fp"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.967206 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-xwf7r"]
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.967250 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zkswz"]
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.967459 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-gpq68"]
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.969282 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411115-gzjfg"]
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.970275 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nm2f4"]
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.974753 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-l6hrl"]
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.975265 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.977666 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-t6t99"]
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.980736 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8nxjr"]
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.981671 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-x4lq4"]
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.982743 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-gltzh"]
Dec 02 09:22:51 crc kubenswrapper[4781]: I1202 09:22:51.996611 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:51.999988 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-4sb6f"]
Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.000029 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-krhf5"]
Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.001239 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6wrt7"]
Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.009173 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a635da40-fbc4-4225-a755-dcab98f66a76-machine-approver-tls\") pod \"machine-approver-56656f9798-7n69p\" (UID: \"a635da40-fbc4-4225-a755-dcab98f66a76\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7n69p"
Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.009404 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.009670 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-w2stl"]
Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.009716 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-dpqgc"]
Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.010228 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a635da40-fbc4-4225-a755-dcab98f66a76-config\") pod \"machine-approver-56656f9798-7n69p\" (UID: \"a635da40-fbc4-4225-a755-dcab98f66a76\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7n69p"
Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.012098 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-r42hh"]
Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.013830 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-zmcr4"]
Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.015442 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-n9cmc"]
Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.016263 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-cfwrp"]
Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.019673 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-jnbdf"]
Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.020697 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4d2fb"]
Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.023332 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.024275 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lbdsz"]
Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.025892 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qq8rf"]
Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.025923 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-68pb7"]
Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.027416 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-wvgpb"]
Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.028168 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wvgpb"
Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.028438 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-lhfvc"]
Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.029930 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8k8z6"]
Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.030013 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lhfvc"
Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.033091 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wvc5b"]
Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.034341 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-tr4x7"]
Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.036590 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-9k9bd"]
Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.037850 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-sjskm"]
Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.039498 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p8ljp"]
Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.040802 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wjx4q"]
Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.041723 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.044106 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zlpfg"]
Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.044748 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-n6m8q"]
Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.045960 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lt8cm"]
Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.046979 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-6c7fp"]
Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.047740 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-vvqhq"]
Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.048289 4781 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-vvqhq" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.048790 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-s95hx"] Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.049794 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wvgpb"] Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.050782 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lhfvc"] Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055118 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cef8c4dd-6e0f-44c9-8926-72b0d821f823-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-x4lq4\" (UID: \"cef8c4dd-6e0f-44c9-8926-72b0d821f823\") " pod="openshift-marketplace/marketplace-operator-79b997595-x4lq4" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055144 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2be1529f-3b01-4174-b029-7312871f5b97-serving-cert\") pod \"route-controller-manager-6576b87f9c-w2stl\" (UID: \"2be1529f-3b01-4174-b029-7312871f5b97\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w2stl" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055169 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f87709bf-590d-4318-abdb-7ecb8a86e303-oauth-serving-cert\") pod \"console-f9d7485db-krhf5\" (UID: \"f87709bf-590d-4318-abdb-7ecb8a86e303\") " pod="openshift-console/console-f9d7485db-krhf5" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055184 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c0570ff6-6102-40ae-a68a-b35b77756097-profile-collector-cert\") pod \"olm-operator-6b444d44fb-8nxjr\" (UID: \"c0570ff6-6102-40ae-a68a-b35b77756097\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8nxjr" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055200 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-xqr9t\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055217 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36103c10-f353-4b13-8c7a-eeee5e3b4f44-config\") pod \"console-operator-58897d9998-gltzh\" (UID: \"36103c10-f353-4b13-8c7a-eeee5e3b4f44\") " pod="openshift-console-operator/console-operator-58897d9998-gltzh" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055232 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c0570ff6-6102-40ae-a68a-b35b77756097-srv-cert\") pod \"olm-operator-6b444d44fb-8nxjr\" (UID: \"c0570ff6-6102-40ae-a68a-b35b77756097\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8nxjr" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055247 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/04a6248c-9bb7-4204-a19a-1041d4d06f3e-audit\") pod \"apiserver-76f77b778f-9k9bd\" (UID: \"04a6248c-9bb7-4204-a19a-1041d4d06f3e\") " pod="openshift-apiserver/apiserver-76f77b778f-9k9bd" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055263 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/71324519-4199-4389-87b9-705916c2c5c4-auth-proxy-config\") pod \"machine-config-operator-74547568cd-t6t99\" (UID: \"71324519-4199-4389-87b9-705916c2c5c4\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t6t99" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055280 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36103c10-f353-4b13-8c7a-eeee5e3b4f44-serving-cert\") pod \"console-operator-58897d9998-gltzh\" (UID: \"36103c10-f353-4b13-8c7a-eeee5e3b4f44\") " pod="openshift-console-operator/console-operator-58897d9998-gltzh" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055294 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/36103c10-f353-4b13-8c7a-eeee5e3b4f44-trusted-ca\") pod \"console-operator-58897d9998-gltzh\" (UID: \"36103c10-f353-4b13-8c7a-eeee5e3b4f44\") " pod="openshift-console-operator/console-operator-58897d9998-gltzh" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055314 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83b6450c-e497-4d31-9c9a-e36fbe675a2e-serving-cert\") pod \"service-ca-operator-777779d784-dpqgc\" (UID: \"83b6450c-e497-4d31-9c9a-e36fbe675a2e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dpqgc" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055332 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvn2c\" (UniqueName: \"kubernetes.io/projected/c0570ff6-6102-40ae-a68a-b35b77756097-kube-api-access-zvn2c\") pod \"olm-operator-6b444d44fb-8nxjr\" (UID: \"c0570ff6-6102-40ae-a68a-b35b77756097\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8nxjr" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055346 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-audit-policies\") pod \"oauth-openshift-558db77b4-xqr9t\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055361 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dvc9\" (UniqueName: \"kubernetes.io/projected/71324519-4199-4389-87b9-705916c2c5c4-kube-api-access-7dvc9\") pod \"machine-config-operator-74547568cd-t6t99\" (UID: \"71324519-4199-4389-87b9-705916c2c5c4\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t6t99" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055378 4781 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/04a6248c-9bb7-4204-a19a-1041d4d06f3e-image-import-ca\") pod \"apiserver-76f77b778f-9k9bd\" (UID: \"04a6248c-9bb7-4204-a19a-1041d4d06f3e\") " pod="openshift-apiserver/apiserver-76f77b778f-9k9bd" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055392 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djmn4\" (UniqueName: \"kubernetes.io/projected/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-kube-api-access-djmn4\") pod \"oauth-openshift-558db77b4-xqr9t\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055409 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/71324519-4199-4389-87b9-705916c2c5c4-proxy-tls\") pod \"machine-config-operator-74547568cd-t6t99\" (UID: \"71324519-4199-4389-87b9-705916c2c5c4\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t6t99" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055425 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/11d24c84-34b9-46a0-9d24-65ca291b4ac6-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-cfwrp\" (UID: \"11d24c84-34b9-46a0-9d24-65ca291b4ac6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cfwrp" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055441 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/11d24c84-34b9-46a0-9d24-65ca291b4ac6-audit-dir\") pod \"apiserver-7bbb656c7d-cfwrp\" (UID: \"11d24c84-34b9-46a0-9d24-65ca291b4ac6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cfwrp" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055456 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f87709bf-590d-4318-abdb-7ecb8a86e303-trusted-ca-bundle\") pod \"console-f9d7485db-krhf5\" (UID: \"f87709bf-590d-4318-abdb-7ecb8a86e303\") " pod="openshift-console/console-f9d7485db-krhf5" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055469 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2g2wz\" (UniqueName: \"kubernetes.io/projected/2be1529f-3b01-4174-b029-7312871f5b97-kube-api-access-2g2wz\") pod \"route-controller-manager-6576b87f9c-w2stl\" (UID: \"2be1529f-3b01-4174-b029-7312871f5b97\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w2stl" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055484 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be701e11-18c0-4541-9685-48bd65c661f2-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-p8ljp\" (UID: \"be701e11-18c0-4541-9685-48bd65c661f2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p8ljp" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055499 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04a6248c-9bb7-4204-a19a-1041d4d06f3e-serving-cert\") pod \"apiserver-76f77b778f-9k9bd\" (UID: \"04a6248c-9bb7-4204-a19a-1041d4d06f3e\") " 
pod="openshift-apiserver/apiserver-76f77b778f-9k9bd" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055514 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnxqs\" (UniqueName: \"kubernetes.io/projected/11d24c84-34b9-46a0-9d24-65ca291b4ac6-kube-api-access-tnxqs\") pod \"apiserver-7bbb656c7d-cfwrp\" (UID: \"11d24c84-34b9-46a0-9d24-65ca291b4ac6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cfwrp" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055538 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-xqr9t\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055557 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-xqr9t\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055572 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2be1529f-3b01-4174-b029-7312871f5b97-client-ca\") pod \"route-controller-manager-6576b87f9c-w2stl\" (UID: \"2be1529f-3b01-4174-b029-7312871f5b97\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w2stl" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055589 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04a6248c-9bb7-4204-a19a-1041d4d06f3e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-9k9bd\" (UID: \"04a6248c-9bb7-4204-a19a-1041d4d06f3e\") " pod="openshift-apiserver/apiserver-76f77b778f-9k9bd" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055604 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-xqr9t\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055625 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-xqr9t\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055639 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/71324519-4199-4389-87b9-705916c2c5c4-images\") pod \"machine-config-operator-74547568cd-t6t99\" (UID: \"71324519-4199-4389-87b9-705916c2c5c4\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t6t99" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055654 4781 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11d24c84-34b9-46a0-9d24-65ca291b4ac6-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-cfwrp\" (UID: \"11d24c84-34b9-46a0-9d24-65ca291b4ac6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cfwrp" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055678 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f87709bf-590d-4318-abdb-7ecb8a86e303-console-config\") pod \"console-f9d7485db-krhf5\" (UID: \"f87709bf-590d-4318-abdb-7ecb8a86e303\") " pod="openshift-console/console-f9d7485db-krhf5" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055693 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/04a6248c-9bb7-4204-a19a-1041d4d06f3e-node-pullsecrets\") pod \"apiserver-76f77b778f-9k9bd\" (UID: \"04a6248c-9bb7-4204-a19a-1041d4d06f3e\") " pod="openshift-apiserver/apiserver-76f77b778f-9k9bd" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055709 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4297bd58-caf6-4962-b775-7f454787fa91-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-6wrt7\" (UID: \"4297bd58-caf6-4962-b775-7f454787fa91\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6wrt7" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055726 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-xqr9t\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055744 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-xqr9t\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055782 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rw9dr\" (UniqueName: \"kubernetes.io/projected/8a173fa2-3cb7-4f70-a04f-50a3131cb1ca-kube-api-access-rw9dr\") pod \"cluster-samples-operator-665b6dd947-4d2fb\" (UID: \"8a173fa2-3cb7-4f70-a04f-50a3131cb1ca\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4d2fb" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055799 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/04a6248c-9bb7-4204-a19a-1041d4d06f3e-etcd-client\") pod \"apiserver-76f77b778f-9k9bd\" (UID: \"04a6248c-9bb7-4204-a19a-1041d4d06f3e\") " pod="openshift-apiserver/apiserver-76f77b778f-9k9bd" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055813 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/04a6248c-9bb7-4204-a19a-1041d4d06f3e-audit-dir\") pod \"apiserver-76f77b778f-9k9bd\" (UID: 
\"04a6248c-9bb7-4204-a19a-1041d4d06f3e\") " pod="openshift-apiserver/apiserver-76f77b778f-9k9bd" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055828 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/11d24c84-34b9-46a0-9d24-65ca291b4ac6-audit-policies\") pod \"apiserver-7bbb656c7d-cfwrp\" (UID: \"11d24c84-34b9-46a0-9d24-65ca291b4ac6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cfwrp" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055844 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6z7p7\" (UniqueName: \"kubernetes.io/projected/51a8404c-874a-43e4-917a-604110280bd7-kube-api-access-6z7p7\") pod \"cluster-image-registry-operator-dc59b4c8b-68pb7\" (UID: \"51a8404c-874a-43e4-917a-604110280bd7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-68pb7" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055859 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/51a8404c-874a-43e4-917a-604110280bd7-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-68pb7\" (UID: \"51a8404c-874a-43e4-917a-604110280bd7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-68pb7" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055874 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cef8c4dd-6e0f-44c9-8926-72b0d821f823-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-x4lq4\" (UID: \"cef8c4dd-6e0f-44c9-8926-72b0d821f823\") " pod="openshift-marketplace/marketplace-operator-79b997595-x4lq4" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055889 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4297bd58-caf6-4962-b775-7f454787fa91-client-ca\") pod \"controller-manager-879f6c89f-6wrt7\" (UID: \"4297bd58-caf6-4962-b775-7f454787fa91\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6wrt7" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055903 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11d24c84-34b9-46a0-9d24-65ca291b4ac6-serving-cert\") pod \"apiserver-7bbb656c7d-cfwrp\" (UID: \"11d24c84-34b9-46a0-9d24-65ca291b4ac6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cfwrp" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055918 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9509647-5e74-4452-a998-f0f699160a70-serving-cert\") pod \"openshift-config-operator-7777fb866f-4sb6f\" (UID: \"b9509647-5e74-4452-a998-f0f699160a70\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4sb6f" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055932 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/04a6248c-9bb7-4204-a19a-1041d4d06f3e-etcd-serving-ca\") pod \"apiserver-76f77b778f-9k9bd\" (UID: \"04a6248c-9bb7-4204-a19a-1041d4d06f3e\") " pod="openshift-apiserver/apiserver-76f77b778f-9k9bd" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055965 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/04a6248c-9bb7-4204-a19a-1041d4d06f3e-encryption-config\") pod \"apiserver-76f77b778f-9k9bd\" (UID: \"04a6248c-9bb7-4204-a19a-1041d4d06f3e\") " pod="openshift-apiserver/apiserver-76f77b778f-9k9bd" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055980 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-audit-dir\") pod \"oauth-openshift-558db77b4-xqr9t\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.055995 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/caa8272b-7968-4b09-b3a7-4aa6d4f12f2b-proxy-tls\") pod \"machine-config-controller-84d6567774-l6hrl\" (UID: \"caa8272b-7968-4b09-b3a7-4aa6d4f12f2b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l6hrl" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.056017 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/51a8404c-874a-43e4-917a-604110280bd7-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-68pb7\" (UID: \"51a8404c-874a-43e4-917a-604110280bd7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-68pb7" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.056032 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvp7g\" (UniqueName: \"kubernetes.io/projected/b9509647-5e74-4452-a998-f0f699160a70-kube-api-access-nvp7g\") pod \"openshift-config-operator-7777fb866f-4sb6f\" (UID: \"b9509647-5e74-4452-a998-f0f699160a70\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4sb6f" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.056051 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xjvd\" (UniqueName: \"kubernetes.io/projected/ee42da42-30b7-40ef-b00c-4d61e25502e0-kube-api-access-4xjvd\") pod \"dns-operator-744455d44c-r42hh\" (UID: \"ee42da42-30b7-40ef-b00c-4d61e25502e0\") " pod="openshift-dns-operator/dns-operator-744455d44c-r42hh" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.056068 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-xqr9t\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.056082 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/11d24c84-34b9-46a0-9d24-65ca291b4ac6-etcd-client\") pod \"apiserver-7bbb656c7d-cfwrp\" (UID: \"11d24c84-34b9-46a0-9d24-65ca291b4ac6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cfwrp" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.056096 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83b6450c-e497-4d31-9c9a-e36fbe675a2e-config\") pod \"service-ca-operator-777779d784-dpqgc\" 
(UID: \"83b6450c-e497-4d31-9c9a-e36fbe675a2e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dpqgc" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.056095 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f87709bf-590d-4318-abdb-7ecb8a86e303-oauth-serving-cert\") pod \"console-f9d7485db-krhf5\" (UID: \"f87709bf-590d-4318-abdb-7ecb8a86e303\") " pod="openshift-console/console-f9d7485db-krhf5" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.056111 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-xqr9t\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.056143 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36103c10-f353-4b13-8c7a-eeee5e3b4f44-config\") pod \"console-operator-58897d9998-gltzh\" (UID: \"36103c10-f353-4b13-8c7a-eeee5e3b4f44\") " pod="openshift-console-operator/console-operator-58897d9998-gltzh" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.056180 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tq25h\" (UniqueName: \"kubernetes.io/projected/83b6450c-e497-4d31-9c9a-e36fbe675a2e-kube-api-access-tq25h\") pod \"service-ca-operator-777779d784-dpqgc\" (UID: \"83b6450c-e497-4d31-9c9a-e36fbe675a2e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dpqgc" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.056201 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2be1529f-3b01-4174-b029-7312871f5b97-config\") pod \"route-controller-manager-6576b87f9c-w2stl\" (UID: \"2be1529f-3b01-4174-b029-7312871f5b97\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w2stl" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.056218 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4297bd58-caf6-4962-b775-7f454787fa91-serving-cert\") pod \"controller-manager-879f6c89f-6wrt7\" (UID: \"4297bd58-caf6-4962-b775-7f454787fa91\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6wrt7" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.056234 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-xqr9t\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.056256 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/caa8272b-7968-4b09-b3a7-4aa6d4f12f2b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-l6hrl\" (UID: \"caa8272b-7968-4b09-b3a7-4aa6d4f12f2b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l6hrl" Dec 02 
09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.056279 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f87709bf-590d-4318-abdb-7ecb8a86e303-console-oauth-config\") pod \"console-f9d7485db-krhf5\" (UID: \"f87709bf-590d-4318-abdb-7ecb8a86e303\") " pod="openshift-console/console-f9d7485db-krhf5" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.056316 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8a173fa2-3cb7-4f70-a04f-50a3131cb1ca-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-4d2fb\" (UID: \"8a173fa2-3cb7-4f70-a04f-50a3131cb1ca\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4d2fb" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.056338 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22pgh\" (UniqueName: \"kubernetes.io/projected/cef8c4dd-6e0f-44c9-8926-72b0d821f823-kube-api-access-22pgh\") pod \"marketplace-operator-79b997595-x4lq4\" (UID: \"cef8c4dd-6e0f-44c9-8926-72b0d821f823\") " pod="openshift-marketplace/marketplace-operator-79b997595-x4lq4" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.056362 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99jq5\" (UniqueName: \"kubernetes.io/projected/36103c10-f353-4b13-8c7a-eeee5e3b4f44-kube-api-access-99jq5\") pod \"console-operator-58897d9998-gltzh\" (UID: \"36103c10-f353-4b13-8c7a-eeee5e3b4f44\") " pod="openshift-console-operator/console-operator-58897d9998-gltzh" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.056380 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b9509647-5e74-4452-a998-f0f699160a70-available-featuregates\") pod \"openshift-config-operator-7777fb866f-4sb6f\" (UID: \"b9509647-5e74-4452-a998-f0f699160a70\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4sb6f" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.056398 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f87709bf-590d-4318-abdb-7ecb8a86e303-console-serving-cert\") pod \"console-f9d7485db-krhf5\" (UID: \"f87709bf-590d-4318-abdb-7ecb8a86e303\") " pod="openshift-console/console-f9d7485db-krhf5" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.056413 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f87709bf-590d-4318-abdb-7ecb8a86e303-service-ca\") pod \"console-f9d7485db-krhf5\" (UID: \"f87709bf-590d-4318-abdb-7ecb8a86e303\") " pod="openshift-console/console-f9d7485db-krhf5" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.056427 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ee42da42-30b7-40ef-b00c-4d61e25502e0-metrics-tls\") pod \"dns-operator-744455d44c-r42hh\" (UID: \"ee42da42-30b7-40ef-b00c-4d61e25502e0\") " pod="openshift-dns-operator/dns-operator-744455d44c-r42hh" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.056443 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/be701e11-18c0-4541-9685-48bd65c661f2-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-p8ljp\" (UID: \"be701e11-18c0-4541-9685-48bd65c661f2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p8ljp" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.056461 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-xqr9t\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.056483 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qntp6\" (UniqueName: \"kubernetes.io/projected/caa8272b-7968-4b09-b3a7-4aa6d4f12f2b-kube-api-access-qntp6\") pod \"machine-config-controller-84d6567774-l6hrl\" (UID: \"caa8272b-7968-4b09-b3a7-4aa6d4f12f2b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l6hrl" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.056507 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25hkf\" (UniqueName: \"kubernetes.io/projected/b43f11b4-7580-463e-9543-2d38de346fe8-kube-api-access-25hkf\") pod \"downloads-7954f5f757-gpq68\" (UID: \"b43f11b4-7580-463e-9543-2d38de346fe8\") " pod="openshift-console/downloads-7954f5f757-gpq68" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.056530 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7b47\" (UniqueName: \"kubernetes.io/projected/a635da40-fbc4-4225-a755-dcab98f66a76-kube-api-access-m7b47\") pod \"machine-approver-56656f9798-7n69p\" (UID: \"a635da40-fbc4-4225-a755-dcab98f66a76\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7n69p" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.056552 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/51a8404c-874a-43e4-917a-604110280bd7-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-68pb7\" (UID: \"51a8404c-874a-43e4-917a-604110280bd7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-68pb7" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.056574 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvnlc\" (UniqueName: \"kubernetes.io/projected/f87709bf-590d-4318-abdb-7ecb8a86e303-kube-api-access-hvnlc\") pod \"console-f9d7485db-krhf5\" (UID: \"f87709bf-590d-4318-abdb-7ecb8a86e303\") " pod="openshift-console/console-f9d7485db-krhf5" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.056595 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn2p7\" (UniqueName: \"kubernetes.io/projected/be701e11-18c0-4541-9685-48bd65c661f2-kube-api-access-xn2p7\") pod \"kube-storage-version-migrator-operator-b67b599dd-p8ljp\" (UID: \"be701e11-18c0-4541-9685-48bd65c661f2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p8ljp" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.056622 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/04a6248c-9bb7-4204-a19a-1041d4d06f3e-config\") pod \"apiserver-76f77b778f-9k9bd\" (UID: \"04a6248c-9bb7-4204-a19a-1041d4d06f3e\") " pod="openshift-apiserver/apiserver-76f77b778f-9k9bd" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.056644 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4297bd58-caf6-4962-b775-7f454787fa91-config\") pod \"controller-manager-879f6c89f-6wrt7\" (UID: \"4297bd58-caf6-4962-b775-7f454787fa91\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6wrt7" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.056667 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w99w4\" (UniqueName: \"kubernetes.io/projected/04a6248c-9bb7-4204-a19a-1041d4d06f3e-kube-api-access-w99w4\") pod \"apiserver-76f77b778f-9k9bd\" (UID: \"04a6248c-9bb7-4204-a19a-1041d4d06f3e\") " pod="openshift-apiserver/apiserver-76f77b778f-9k9bd" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.056688 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnw8t\" (UniqueName: \"kubernetes.io/projected/4297bd58-caf6-4962-b775-7f454787fa91-kube-api-access-xnw8t\") pod \"controller-manager-879f6c89f-6wrt7\" (UID: \"4297bd58-caf6-4962-b775-7f454787fa91\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6wrt7" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.056711 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km99d\" (UniqueName: \"kubernetes.io/projected/69ddef2d-4afd-4668-b95f-29137b133855-kube-api-access-km99d\") pod \"migrator-59844c95c7-wvc5b\" (UID: \"69ddef2d-4afd-4668-b95f-29137b133855\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wvc5b" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.056733 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/11d24c84-34b9-46a0-9d24-65ca291b4ac6-encryption-config\") pod \"apiserver-7bbb656c7d-cfwrp\" (UID: \"11d24c84-34b9-46a0-9d24-65ca291b4ac6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cfwrp" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.056753 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/71324519-4199-4389-87b9-705916c2c5c4-auth-proxy-config\") pod \"machine-config-operator-74547568cd-t6t99\" (UID: \"71324519-4199-4389-87b9-705916c2c5c4\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t6t99" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.056731 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-audit-policies\") pod \"oauth-openshift-558db77b4-xqr9t\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.057551 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-xqr9t\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.057650 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/36103c10-f353-4b13-8c7a-eeee5e3b4f44-trusted-ca\") pod \"console-operator-58897d9998-gltzh\" (UID: \"36103c10-f353-4b13-8c7a-eeee5e3b4f44\") " pod="openshift-console-operator/console-operator-58897d9998-gltzh" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.057746 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2be1529f-3b01-4174-b029-7312871f5b97-config\") pod \"route-controller-manager-6576b87f9c-w2stl\" (UID: \"2be1529f-3b01-4174-b029-7312871f5b97\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w2stl" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.057889 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2be1529f-3b01-4174-b029-7312871f5b97-serving-cert\") pod \"route-controller-manager-6576b87f9c-w2stl\" (UID: \"2be1529f-3b01-4174-b029-7312871f5b97\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w2stl" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.058332 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/04a6248c-9bb7-4204-a19a-1041d4d06f3e-audit\") pod \"apiserver-76f77b778f-9k9bd\" (UID: \"04a6248c-9bb7-4204-a19a-1041d4d06f3e\") " pod="openshift-apiserver/apiserver-76f77b778f-9k9bd" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.058731 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-xqr9t\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.058736 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-xqr9t\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.059564 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-xqr9t\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.059874 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f87709bf-590d-4318-abdb-7ecb8a86e303-service-ca\") pod \"console-f9d7485db-krhf5\" (UID: \"f87709bf-590d-4318-abdb-7ecb8a86e303\") " pod="openshift-console/console-f9d7485db-krhf5" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.060217 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/04a6248c-9bb7-4204-a19a-1041d4d06f3e-etcd-serving-ca\") pod \"apiserver-76f77b778f-9k9bd\" (UID: \"04a6248c-9bb7-4204-a19a-1041d4d06f3e\") " pod="openshift-apiserver/apiserver-76f77b778f-9k9bd" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.060316 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/04a6248c-9bb7-4204-a19a-1041d4d06f3e-image-import-ca\") pod \"apiserver-76f77b778f-9k9bd\" (UID: \"04a6248c-9bb7-4204-a19a-1041d4d06f3e\") " pod="openshift-apiserver/apiserver-76f77b778f-9k9bd" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.060365 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83b6450c-e497-4d31-9c9a-e36fbe675a2e-config\") pod \"service-ca-operator-777779d784-dpqgc\" (UID: \"83b6450c-e497-4d31-9c9a-e36fbe675a2e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dpqgc" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.060465 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/caa8272b-7968-4b09-b3a7-4aa6d4f12f2b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-l6hrl\" (UID: \"caa8272b-7968-4b09-b3a7-4aa6d4f12f2b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l6hrl" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.060501 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/51a8404c-874a-43e4-917a-604110280bd7-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-68pb7\" (UID: \"51a8404c-874a-43e4-917a-604110280bd7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-68pb7" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.060533 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-audit-dir\") pod \"oauth-openshift-558db77b4-xqr9t\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.060640 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2be1529f-3b01-4174-b029-7312871f5b97-client-ca\") pod \"route-controller-manager-6576b87f9c-w2stl\" (UID: \"2be1529f-3b01-4174-b029-7312871f5b97\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w2stl" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.060694 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04a6248c-9bb7-4204-a19a-1041d4d06f3e-config\") pod \"apiserver-76f77b778f-9k9bd\" (UID: \"04a6248c-9bb7-4204-a19a-1041d4d06f3e\") " pod="openshift-apiserver/apiserver-76f77b778f-9k9bd" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.060716 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83b6450c-e497-4d31-9c9a-e36fbe675a2e-serving-cert\") pod \"service-ca-operator-777779d784-dpqgc\" (UID: \"83b6450c-e497-4d31-9c9a-e36fbe675a2e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dpqgc" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.060786 4781 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b9509647-5e74-4452-a998-f0f699160a70-available-featuregates\") pod \"openshift-config-operator-7777fb866f-4sb6f\" (UID: \"b9509647-5e74-4452-a998-f0f699160a70\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4sb6f" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.060813 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/04a6248c-9bb7-4204-a19a-1041d4d06f3e-audit-dir\") pod \"apiserver-76f77b778f-9k9bd\" (UID: \"04a6248c-9bb7-4204-a19a-1041d4d06f3e\") " pod="openshift-apiserver/apiserver-76f77b778f-9k9bd" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.060845 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/11d24c84-34b9-46a0-9d24-65ca291b4ac6-audit-dir\") pod \"apiserver-7bbb656c7d-cfwrp\" (UID: \"11d24c84-34b9-46a0-9d24-65ca291b4ac6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cfwrp" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.061086 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/71324519-4199-4389-87b9-705916c2c5c4-images\") pod \"machine-config-operator-74547568cd-t6t99\" (UID: \"71324519-4199-4389-87b9-705916c2c5c4\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t6t99" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.061163 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-xqr9t\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.061303 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/04a6248c-9bb7-4204-a19a-1041d4d06f3e-node-pullsecrets\") pod \"apiserver-76f77b778f-9k9bd\" (UID: \"04a6248c-9bb7-4204-a19a-1041d4d06f3e\") " pod="openshift-apiserver/apiserver-76f77b778f-9k9bd" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.061306 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f87709bf-590d-4318-abdb-7ecb8a86e303-console-config\") pod \"console-f9d7485db-krhf5\" (UID: \"f87709bf-590d-4318-abdb-7ecb8a86e303\") " pod="openshift-console/console-f9d7485db-krhf5" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.061496 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-xqr9t\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.061733 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4297bd58-caf6-4962-b775-7f454787fa91-client-ca\") pod \"controller-manager-879f6c89f-6wrt7\" (UID: \"4297bd58-caf6-4962-b775-7f454787fa91\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-6wrt7" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.061812 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-xqr9t\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.061871 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4297bd58-caf6-4962-b775-7f454787fa91-config\") pod \"controller-manager-879f6c89f-6wrt7\" (UID: \"4297bd58-caf6-4962-b775-7f454787fa91\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6wrt7" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.062504 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04a6248c-9bb7-4204-a19a-1041d4d06f3e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-9k9bd\" (UID: \"04a6248c-9bb7-4204-a19a-1041d4d06f3e\") " pod="openshift-apiserver/apiserver-76f77b778f-9k9bd" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.062545 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f87709bf-590d-4318-abdb-7ecb8a86e303-trusted-ca-bundle\") pod \"console-f9d7485db-krhf5\" (UID: \"f87709bf-590d-4318-abdb-7ecb8a86e303\") " pod="openshift-console/console-f9d7485db-krhf5" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.062865 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36103c10-f353-4b13-8c7a-eeee5e3b4f44-serving-cert\") pod \"console-operator-58897d9998-gltzh\" (UID: \"36103c10-f353-4b13-8c7a-eeee5e3b4f44\") " pod="openshift-console-operator/console-operator-58897d9998-gltzh" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.063113 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.063228 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ee42da42-30b7-40ef-b00c-4d61e25502e0-metrics-tls\") pod \"dns-operator-744455d44c-r42hh\" (UID: \"ee42da42-30b7-40ef-b00c-4d61e25502e0\") " pod="openshift-dns-operator/dns-operator-744455d44c-r42hh" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.063408 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/51a8404c-874a-43e4-917a-604110280bd7-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-68pb7\" (UID: \"51a8404c-874a-43e4-917a-604110280bd7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-68pb7" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.063639 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/caa8272b-7968-4b09-b3a7-4aa6d4f12f2b-proxy-tls\") pod \"machine-config-controller-84d6567774-l6hrl\" (UID: \"caa8272b-7968-4b09-b3a7-4aa6d4f12f2b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l6hrl" Dec 02 
09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.063682 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4297bd58-caf6-4962-b775-7f454787fa91-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-6wrt7\" (UID: \"4297bd58-caf6-4962-b775-7f454787fa91\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6wrt7" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.064002 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-xqr9t\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.064102 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-xqr9t\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.064247 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-xqr9t\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.064747 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/04a6248c-9bb7-4204-a19a-1041d4d06f3e-etcd-client\") pod \"apiserver-76f77b778f-9k9bd\" (UID: \"04a6248c-9bb7-4204-a19a-1041d4d06f3e\") " pod="openshift-apiserver/apiserver-76f77b778f-9k9bd" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.064767 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/71324519-4199-4389-87b9-705916c2c5c4-proxy-tls\") pod \"machine-config-operator-74547568cd-t6t99\" (UID: \"71324519-4199-4389-87b9-705916c2c5c4\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t6t99" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.064793 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f87709bf-590d-4318-abdb-7ecb8a86e303-console-oauth-config\") pod \"console-f9d7485db-krhf5\" (UID: \"f87709bf-590d-4318-abdb-7ecb8a86e303\") " pod="openshift-console/console-f9d7485db-krhf5" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.064911 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f87709bf-590d-4318-abdb-7ecb8a86e303-console-serving-cert\") pod \"console-f9d7485db-krhf5\" (UID: \"f87709bf-590d-4318-abdb-7ecb8a86e303\") " pod="openshift-console/console-f9d7485db-krhf5" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.064979 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/8a173fa2-3cb7-4f70-a04f-50a3131cb1ca-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-4d2fb\" (UID: \"8a173fa2-3cb7-4f70-a04f-50a3131cb1ca\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4d2fb" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.065341 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04a6248c-9bb7-4204-a19a-1041d4d06f3e-serving-cert\") pod \"apiserver-76f77b778f-9k9bd\" (UID: \"04a6248c-9bb7-4204-a19a-1041d4d06f3e\") " pod="openshift-apiserver/apiserver-76f77b778f-9k9bd" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.066423 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4297bd58-caf6-4962-b775-7f454787fa91-serving-cert\") pod \"controller-manager-879f6c89f-6wrt7\" (UID: \"4297bd58-caf6-4962-b775-7f454787fa91\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6wrt7" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.066479 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9509647-5e74-4452-a998-f0f699160a70-serving-cert\") pod \"openshift-config-operator-7777fb866f-4sb6f\" (UID: \"b9509647-5e74-4452-a998-f0f699160a70\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4sb6f" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.067356 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-xqr9t\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.067586 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/04a6248c-9bb7-4204-a19a-1041d4d06f3e-encryption-config\") pod \"apiserver-76f77b778f-9k9bd\" (UID: \"04a6248c-9bb7-4204-a19a-1041d4d06f3e\") " pod="openshift-apiserver/apiserver-76f77b778f-9k9bd" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.082302 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.102925 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.109698 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c0570ff6-6102-40ae-a68a-b35b77756097-srv-cert\") pod \"olm-operator-6b444d44fb-8nxjr\" (UID: \"c0570ff6-6102-40ae-a68a-b35b77756097\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8nxjr" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.122086 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.128603 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c0570ff6-6102-40ae-a68a-b35b77756097-profile-collector-cert\") pod 
\"olm-operator-6b444d44fb-8nxjr\" (UID: \"c0570ff6-6102-40ae-a68a-b35b77756097\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8nxjr" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.142540 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.172383 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.181982 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.182302 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cef8c4dd-6e0f-44c9-8926-72b0d821f823-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-x4lq4\" (UID: \"cef8c4dd-6e0f-44c9-8926-72b0d821f823\") " pod="openshift-marketplace/marketplace-operator-79b997595-x4lq4" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.202401 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.222979 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.241804 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.250013 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cef8c4dd-6e0f-44c9-8926-72b0d821f823-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-x4lq4\" (UID: \"cef8c4dd-6e0f-44c9-8926-72b0d821f823\") " pod="openshift-marketplace/marketplace-operator-79b997595-x4lq4" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.262151 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.282303 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.303119 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.311176 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be701e11-18c0-4541-9685-48bd65c661f2-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-p8ljp\" (UID: \"be701e11-18c0-4541-9685-48bd65c661f2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p8ljp" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.322604 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.333425 4781 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be701e11-18c0-4541-9685-48bd65c661f2-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-p8ljp\" (UID: \"be701e11-18c0-4541-9685-48bd65c661f2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p8ljp" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.342979 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.363090 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.382334 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.391669 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/11d24c84-34b9-46a0-9d24-65ca291b4ac6-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-cfwrp\" (UID: \"11d24c84-34b9-46a0-9d24-65ca291b4ac6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cfwrp" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.402118 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.422244 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.442018 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.463538 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.482437 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.502696 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.522393 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.543182 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.562275 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.573512 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/11d24c84-34b9-46a0-9d24-65ca291b4ac6-etcd-client\") pod \"apiserver-7bbb656c7d-cfwrp\" (UID: \"11d24c84-34b9-46a0-9d24-65ca291b4ac6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cfwrp" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.582448 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.595670 4781 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11d24c84-34b9-46a0-9d24-65ca291b4ac6-serving-cert\") pod \"apiserver-7bbb656c7d-cfwrp\" (UID: \"11d24c84-34b9-46a0-9d24-65ca291b4ac6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cfwrp" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.602535 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.615508 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/11d24c84-34b9-46a0-9d24-65ca291b4ac6-encryption-config\") pod \"apiserver-7bbb656c7d-cfwrp\" (UID: \"11d24c84-34b9-46a0-9d24-65ca291b4ac6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cfwrp" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.622754 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.632163 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11d24c84-34b9-46a0-9d24-65ca291b4ac6-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-cfwrp\" (UID: \"11d24c84-34b9-46a0-9d24-65ca291b4ac6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cfwrp" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.642654 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.663195 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.682539 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.691608 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/11d24c84-34b9-46a0-9d24-65ca291b4ac6-audit-policies\") pod \"apiserver-7bbb656c7d-cfwrp\" (UID: \"11d24c84-34b9-46a0-9d24-65ca291b4ac6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cfwrp" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.722968 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.742813 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.762031 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.784533 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.802745 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.822720 4781 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.843754 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.863314 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.882458 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.903468 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.921011 4781 request.go:700] Waited for 1.001203505s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager-operator/configmaps?fieldSelector=metadata.name%3Dkube-controller-manager-operator-config&limit=500&resourceVersion=0 Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.922284 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.942548 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.963025 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 02 09:22:52 crc kubenswrapper[4781]: I1202 09:22:52.983258 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.009894 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.022635 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.043306 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.062545 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.082681 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.102723 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.123869 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.142344 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 02 09:22:53 crc 
kubenswrapper[4781]: I1202 09:22:53.163149 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.182579 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.203696 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.223404 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.242702 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.263055 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.282801 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.302338 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.323495 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.376966 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdjht\" (UniqueName: \"kubernetes.io/projected/88747619-18ec-4e3e-9a0d-1f00bc7c2038-kube-api-access-zdjht\") pod \"openshift-controller-manager-operator-756b6f6bc6-qq8rf\" (UID: \"88747619-18ec-4e3e-9a0d-1f00bc7c2038\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qq8rf" Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.397874 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc9vq\" (UniqueName: \"kubernetes.io/projected/d88ea951-0553-4510-b1cd-bd6196d1f973-kube-api-access-cc9vq\") pod \"authentication-operator-69f744f599-xwf7r\" (UID: \"d88ea951-0553-4510-b1cd-bd6196d1f973\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xwf7r" Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.415544 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xqkf\" (UniqueName: \"kubernetes.io/projected/d9c52f13-f9c6-419e-8f69-ee91e29f4629-kube-api-access-2xqkf\") pod \"machine-api-operator-5694c8668f-n9cmc\" (UID: \"d9c52f13-f9c6-419e-8f69-ee91e29f4629\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-n9cmc" Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.435604 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tkhs\" (UniqueName: \"kubernetes.io/projected/e1b68c39-405a-4419-af7e-4d9bea0189c3-kube-api-access-6tkhs\") pod \"openshift-apiserver-operator-796bbdcf4f-zkswz\" (UID: \"e1b68c39-405a-4419-af7e-4d9bea0189c3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zkswz" Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.443070 4781 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.461669 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.481506 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-n9cmc" Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.482958 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.503517 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.523273 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.543121 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.563357 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.583063 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.596966 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qq8rf" Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.603228 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.626658 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.640898 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zkswz" Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.642478 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.664346 4781 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.682958 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.697178 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-xwf7r" Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.702610 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.717182 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-n9cmc"] Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.723328 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.743108 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.762952 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.776766 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qq8rf"] Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.784879 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 02 09:22:53 crc kubenswrapper[4781]: W1202 09:22:53.790385 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88747619_18ec_4e3e_9a0d_1f00bc7c2038.slice/crio-9f3ef0ae03b97943dbd0102c9552554cd672954fdb39afb251c5682543a970f5 WatchSource:0}: Error finding container 9f3ef0ae03b97943dbd0102c9552554cd672954fdb39afb251c5682543a970f5: Status 404 returned error can't find the container with id 9f3ef0ae03b97943dbd0102c9552554cd672954fdb39afb251c5682543a970f5 Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.802395 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.820975 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zkswz"] Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.823784 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 02 09:22:53 crc kubenswrapper[4781]: W1202 09:22:53.829484 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1b68c39_405a_4419_af7e_4d9bea0189c3.slice/crio-b65834a866c1d177494d21cd3ce767981e74738294fe2ce1ff6cb2140313f7f9 WatchSource:0}: Error finding container b65834a866c1d177494d21cd3ce767981e74738294fe2ce1ff6cb2140313f7f9: Status 404 returned error can't find the container with id b65834a866c1d177494d21cd3ce767981e74738294fe2ce1ff6cb2140313f7f9 Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.843013 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.863831 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.872923 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-authentication-operator/authentication-operator-69f744f599-xwf7r"] Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.883315 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.920994 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djmn4\" (UniqueName: \"kubernetes.io/projected/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-kube-api-access-djmn4\") pod \"oauth-openshift-558db77b4-xqr9t\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.921090 4781 request.go:700] Waited for 1.864281661s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/serviceaccounts/olm-operator-serviceaccount/token Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.937688 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvn2c\" (UniqueName: \"kubernetes.io/projected/c0570ff6-6102-40ae-a68a-b35b77756097-kube-api-access-zvn2c\") pod \"olm-operator-6b444d44fb-8nxjr\" (UID: \"c0570ff6-6102-40ae-a68a-b35b77756097\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8nxjr" Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.957681 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq25h\" (UniqueName: \"kubernetes.io/projected/83b6450c-e497-4d31-9c9a-e36fbe675a2e-kube-api-access-tq25h\") pod \"service-ca-operator-777779d784-dpqgc\" (UID: \"83b6450c-e497-4d31-9c9a-e36fbe675a2e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dpqgc" Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.987225 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qntp6\" (UniqueName: \"kubernetes.io/projected/caa8272b-7968-4b09-b3a7-4aa6d4f12f2b-kube-api-access-qntp6\") pod \"machine-config-controller-84d6567774-l6hrl\" (UID: \"caa8272b-7968-4b09-b3a7-4aa6d4f12f2b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l6hrl" Dec 02 09:22:53 crc kubenswrapper[4781]: I1202 09:22:53.998642 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25hkf\" (UniqueName: \"kubernetes.io/projected/b43f11b4-7580-463e-9543-2d38de346fe8-kube-api-access-25hkf\") pod \"downloads-7954f5f757-gpq68\" (UID: \"b43f11b4-7580-463e-9543-2d38de346fe8\") " pod="openshift-console/downloads-7954f5f757-gpq68" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.016872 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7b47\" (UniqueName: \"kubernetes.io/projected/a635da40-fbc4-4225-a755-dcab98f66a76-kube-api-access-m7b47\") pod \"machine-approver-56656f9798-7n69p\" (UID: \"a635da40-fbc4-4225-a755-dcab98f66a76\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7n69p" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.040777 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22pgh\" (UniqueName: \"kubernetes.io/projected/cef8c4dd-6e0f-44c9-8926-72b0d821f823-kube-api-access-22pgh\") pod \"marketplace-operator-79b997595-x4lq4\" (UID: \"cef8c4dd-6e0f-44c9-8926-72b0d821f823\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-x4lq4" Dec 02 09:22:54 crc kubenswrapper[4781]: W1202 09:22:54.043769 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd88ea951_0553_4510_b1cd_bd6196d1f973.slice/crio-7df2e6ab101a31d29c8d5b89001b088d9bf100b79f70731700bf3229533ea10c WatchSource:0}: Error finding container 7df2e6ab101a31d29c8d5b89001b088d9bf100b79f70731700bf3229533ea10c: Status 404 returned error can't find the container with id 7df2e6ab101a31d29c8d5b89001b088d9bf100b79f70731700bf3229533ea10c Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.049772 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-gpq68" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.059878 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dvc9\" (UniqueName: \"kubernetes.io/projected/71324519-4199-4389-87b9-705916c2c5c4-kube-api-access-7dvc9\") pod \"machine-config-operator-74547568cd-t6t99\" (UID: \"71324519-4199-4389-87b9-705916c2c5c4\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t6t99" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.091199 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvp7g\" (UniqueName: \"kubernetes.io/projected/b9509647-5e74-4452-a998-f0f699160a70-kube-api-access-nvp7g\") pod \"openshift-config-operator-7777fb866f-4sb6f\" (UID: \"b9509647-5e74-4452-a998-f0f699160a70\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4sb6f" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.097664 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xjvd\" (UniqueName: \"kubernetes.io/projected/ee42da42-30b7-40ef-b00c-4d61e25502e0-kube-api-access-4xjvd\") pod \"dns-operator-744455d44c-r42hh\" (UID: \"ee42da42-30b7-40ef-b00c-4d61e25502e0\") " pod="openshift-dns-operator/dns-operator-744455d44c-r42hh" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.109118 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-xwf7r" event={"ID":"d88ea951-0553-4510-b1cd-bd6196d1f973","Type":"ContainerStarted","Data":"7df2e6ab101a31d29c8d5b89001b088d9bf100b79f70731700bf3229533ea10c"} Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.109830 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-n9cmc" event={"ID":"d9c52f13-f9c6-419e-8f69-ee91e29f4629","Type":"ContainerStarted","Data":"580e7ed990a4f6b960b8f3ed12119081e7469c99b7f5ba4ba78136794ffbd1e8"} Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.110601 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qq8rf" event={"ID":"88747619-18ec-4e3e-9a0d-1f00bc7c2038","Type":"ContainerStarted","Data":"9f3ef0ae03b97943dbd0102c9552554cd672954fdb39afb251c5682543a970f5"} Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.111419 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zkswz" event={"ID":"e1b68c39-405a-4419-af7e-4d9bea0189c3","Type":"ContainerStarted","Data":"b65834a866c1d177494d21cd3ce767981e74738294fe2ce1ff6cb2140313f7f9"} Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 
09:22:54.117076 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw9dr\" (UniqueName: \"kubernetes.io/projected/8a173fa2-3cb7-4f70-a04f-50a3131cb1ca-kube-api-access-rw9dr\") pod \"cluster-samples-operator-665b6dd947-4d2fb\" (UID: \"8a173fa2-3cb7-4f70-a04f-50a3131cb1ca\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4d2fb" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.131604 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4d2fb" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.137419 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l6hrl" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.141603 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w99w4\" (UniqueName: \"kubernetes.io/projected/04a6248c-9bb7-4204-a19a-1041d4d06f3e-kube-api-access-w99w4\") pod \"apiserver-76f77b778f-9k9bd\" (UID: \"04a6248c-9bb7-4204-a19a-1041d4d06f3e\") " pod="openshift-apiserver/apiserver-76f77b778f-9k9bd" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.143826 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4sb6f" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.158201 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dpqgc" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.173834 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvnlc\" (UniqueName: \"kubernetes.io/projected/f87709bf-590d-4318-abdb-7ecb8a86e303-kube-api-access-hvnlc\") pod \"console-f9d7485db-krhf5\" (UID: \"f87709bf-590d-4318-abdb-7ecb8a86e303\") " pod="openshift-console/console-f9d7485db-krhf5" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.191500 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnw8t\" (UniqueName: \"kubernetes.io/projected/4297bd58-caf6-4962-b775-7f454787fa91-kube-api-access-xnw8t\") pod \"controller-manager-879f6c89f-6wrt7\" (UID: \"4297bd58-caf6-4962-b775-7f454787fa91\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6wrt7" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.197144 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.204152 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t6t99" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.209487 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km99d\" (UniqueName: \"kubernetes.io/projected/69ddef2d-4afd-4668-b95f-29137b133855-kube-api-access-km99d\") pod \"migrator-59844c95c7-wvc5b\" (UID: \"69ddef2d-4afd-4668-b95f-29137b133855\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wvc5b" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.217249 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-krhf5" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.219944 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn2p7\" (UniqueName: \"kubernetes.io/projected/be701e11-18c0-4541-9685-48bd65c661f2-kube-api-access-xn2p7\") pod \"kube-storage-version-migrator-operator-b67b599dd-p8ljp\" (UID: \"be701e11-18c0-4541-9685-48bd65c661f2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p8ljp" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.221211 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7n69p" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.232362 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8nxjr" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.234701 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-x4lq4" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.238356 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99jq5\" (UniqueName: \"kubernetes.io/projected/36103c10-f353-4b13-8c7a-eeee5e3b4f44-kube-api-access-99jq5\") pod \"console-operator-58897d9998-gltzh\" (UID: \"36103c10-f353-4b13-8c7a-eeee5e3b4f44\") " pod="openshift-console-operator/console-operator-58897d9998-gltzh" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.241093 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p8ljp" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.259886 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/51a8404c-874a-43e4-917a-604110280bd7-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-68pb7\" (UID: \"51a8404c-874a-43e4-917a-604110280bd7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-68pb7" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.272552 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-9k9bd" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.286643 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g2wz\" (UniqueName: \"kubernetes.io/projected/2be1529f-3b01-4174-b029-7312871f5b97-kube-api-access-2g2wz\") pod \"route-controller-manager-6576b87f9c-w2stl\" (UID: \"2be1529f-3b01-4174-b029-7312871f5b97\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w2stl" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.296908 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z7p7\" (UniqueName: \"kubernetes.io/projected/51a8404c-874a-43e4-917a-604110280bd7-kube-api-access-6z7p7\") pod \"cluster-image-registry-operator-dc59b4c8b-68pb7\" (UID: \"51a8404c-874a-43e4-917a-604110280bd7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-68pb7" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.320204 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-r42hh" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.320439 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnxqs\" (UniqueName: \"kubernetes.io/projected/11d24c84-34b9-46a0-9d24-65ca291b4ac6-kube-api-access-tnxqs\") pod \"apiserver-7bbb656c7d-cfwrp\" (UID: \"11d24c84-34b9-46a0-9d24-65ca291b4ac6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cfwrp" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.340240 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-68pb7" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.341760 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4d2fb"] Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.385345 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-4sb6f"] Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.683667 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6wrt7" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.684554 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cfwrp" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.685758 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wvc5b" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.686493 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w2stl" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.686633 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-gltzh" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.692829 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-t6t99"] Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.693825 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-gpq68"] Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.696589 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ba8d87f-de60-4d9c-8f9a-c6a38cd73858-config\") pod \"kube-controller-manager-operator-78b949d7b-8k8z6\" (UID: \"5ba8d87f-de60-4d9c-8f9a-c6a38cd73858\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8k8z6" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.696827 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/da84b89e-515f-4595-badf-a13b1ce0342a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:54 crc kubenswrapper[4781]: W1202 09:22:54.699491 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb43f11b4_7580_463e_9543_2d38de346fe8.slice/crio-64c1f381116e4c140fc28a3cc684d09e105aaf1690b7c4609abfdd6056e21fac WatchSource:0}: Error finding container 64c1f381116e4c140fc28a3cc684d09e105aaf1690b7c4609abfdd6056e21fac: Status 404 returned error can't find the container with id 64c1f381116e4c140fc28a3cc684d09e105aaf1690b7c4609abfdd6056e21fac Dec 02 09:22:54 crc kubenswrapper[4781]: W1202 09:22:54.701019 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9509647_5e74_4452_a998_f0f699160a70.slice/crio-39c88b75644cc570c371c8fb7fcf477867dd867b43eefea762f9373d0fd036b5 WatchSource:0}: Error finding container 39c88b75644cc570c371c8fb7fcf477867dd867b43eefea762f9373d0fd036b5: Status 404 returned error can't find the container with id 39c88b75644cc570c371c8fb7fcf477867dd867b43eefea762f9373d0fd036b5 Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.702624 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsr9q\" (UniqueName: \"kubernetes.io/projected/2a4f5362-7618-4d15-8be9-465e83bdb0b9-kube-api-access-tsr9q\") pod \"etcd-operator-b45778765-tr4x7\" (UID: \"2a4f5362-7618-4d15-8be9-465e83bdb0b9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tr4x7" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.702748 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9c00ba40-b027-444f-aa39-f8f0cbbd13cd-trusted-ca\") pod \"ingress-operator-5b745b69d9-jnbdf\" (UID: \"9c00ba40-b027-444f-aa39-f8f0cbbd13cd\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jnbdf" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.702800 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ns44\" (UniqueName: 
\"kubernetes.io/projected/6e5395e3-0578-419d-8297-bdcf2cc6ae01-kube-api-access-7ns44\") pod \"catalog-operator-68c6474976-wjx4q\" (UID: \"6e5395e3-0578-419d-8297-bdcf2cc6ae01\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wjx4q" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.702862 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pc7m\" (UniqueName: \"kubernetes.io/projected/70c23dd9-b165-4f35-9230-da18a16f48be-kube-api-access-4pc7m\") pod \"collect-profiles-29411115-gzjfg\" (UID: \"70c23dd9-b165-4f35-9230-da18a16f48be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411115-gzjfg" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.702923 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lzst\" (UniqueName: \"kubernetes.io/projected/50d4d4c8-66e1-4d10-85cb-0fee6079d5fe-kube-api-access-9lzst\") pod \"control-plane-machine-set-operator-78cbb6b69f-nm2f4\" (UID: \"50d4d4c8-66e1-4d10-85cb-0fee6079d5fe\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nm2f4" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.706032 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/dc3224e4-715f-4f5d-99a0-5325d9b87fb8-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-lbdsz\" (UID: \"dc3224e4-715f-4f5d-99a0-5325d9b87fb8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lbdsz" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.706248 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/79f1ffd5-bf7e-4303-81c6-72cd48e9ae46-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lt8cm\" (UID: \"79f1ffd5-bf7e-4303-81c6-72cd48e9ae46\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lt8cm" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.706352 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/2a4f5362-7618-4d15-8be9-465e83bdb0b9-etcd-service-ca\") pod \"etcd-operator-b45778765-tr4x7\" (UID: \"2a4f5362-7618-4d15-8be9-465e83bdb0b9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tr4x7" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.706449 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9c00ba40-b027-444f-aa39-f8f0cbbd13cd-metrics-tls\") pod \"ingress-operator-5b745b69d9-jnbdf\" (UID: \"9c00ba40-b027-444f-aa39-f8f0cbbd13cd\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jnbdf" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.706662 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/da84b89e-515f-4595-badf-a13b1ce0342a-installation-pull-secrets\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.720580 4781 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/da84b89e-515f-4595-badf-a13b1ce0342a-bound-sa-token\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.720798 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/70c23dd9-b165-4f35-9230-da18a16f48be-secret-volume\") pod \"collect-profiles-29411115-gzjfg\" (UID: \"70c23dd9-b165-4f35-9230-da18a16f48be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411115-gzjfg" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.729543 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vz8g\" (UniqueName: \"kubernetes.io/projected/d478d486-3666-4aae-967f-36d60f4de521-kube-api-access-2vz8g\") pod \"multus-admission-controller-857f4d67dd-zmcr4\" (UID: \"d478d486-3666-4aae-967f-36d60f4de521\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zmcr4" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.730021 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/2a4f5362-7618-4d15-8be9-465e83bdb0b9-etcd-ca\") pod \"etcd-operator-b45778765-tr4x7\" (UID: \"2a4f5362-7618-4d15-8be9-465e83bdb0b9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tr4x7" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.730125 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/22ac0012-d7d1-4b53-a0e7-1ca1d09832b9-default-certificate\") pod \"router-default-5444994796-8ng22\" (UID: \"22ac0012-d7d1-4b53-a0e7-1ca1d09832b9\") " pod="openshift-ingress/router-default-5444994796-8ng22" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.730207 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ba8d87f-de60-4d9c-8f9a-c6a38cd73858-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-8k8z6\" (UID: \"5ba8d87f-de60-4d9c-8f9a-c6a38cd73858\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8k8z6" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.730272 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79f1ffd5-bf7e-4303-81c6-72cd48e9ae46-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lt8cm\" (UID: \"79f1ffd5-bf7e-4303-81c6-72cd48e9ae46\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lt8cm" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.730772 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/22ac0012-d7d1-4b53-a0e7-1ca1d09832b9-stats-auth\") pod \"router-default-5444994796-8ng22\" (UID: \"22ac0012-d7d1-4b53-a0e7-1ca1d09832b9\") " pod="openshift-ingress/router-default-5444994796-8ng22" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.730796 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d478d486-3666-4aae-967f-36d60f4de521-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-zmcr4\" (UID: \"d478d486-3666-4aae-967f-36d60f4de521\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zmcr4" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.730831 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/50d4d4c8-66e1-4d10-85cb-0fee6079d5fe-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-nm2f4\" (UID: \"50d4d4c8-66e1-4d10-85cb-0fee6079d5fe\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nm2f4" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.730953 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.731175 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/70c23dd9-b165-4f35-9230-da18a16f48be-config-volume\") pod \"collect-profiles-29411115-gzjfg\" (UID: \"70c23dd9-b165-4f35-9230-da18a16f48be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411115-gzjfg" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.731199 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9c00ba40-b027-444f-aa39-f8f0cbbd13cd-bound-sa-token\") pod \"ingress-operator-5b745b69d9-jnbdf\" (UID: \"9c00ba40-b027-444f-aa39-f8f0cbbd13cd\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jnbdf" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.731612 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2a4f5362-7618-4d15-8be9-465e83bdb0b9-serving-cert\") pod \"etcd-operator-b45778765-tr4x7\" (UID: \"2a4f5362-7618-4d15-8be9-465e83bdb0b9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tr4x7" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.731673 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/da84b89e-515f-4595-badf-a13b1ce0342a-trusted-ca\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.731736 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/da84b89e-515f-4595-badf-a13b1ce0342a-registry-tls\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.731843 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5ba8d87f-de60-4d9c-8f9a-c6a38cd73858-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-8k8z6\" (UID: \"5ba8d87f-de60-4d9c-8f9a-c6a38cd73858\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8k8z6" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.731869 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/22ac0012-d7d1-4b53-a0e7-1ca1d09832b9-metrics-certs\") pod \"router-default-5444994796-8ng22\" (UID: \"22ac0012-d7d1-4b53-a0e7-1ca1d09832b9\") " pod="openshift-ingress/router-default-5444994796-8ng22" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.731907 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h8fq\" (UniqueName: \"kubernetes.io/projected/22ac0012-d7d1-4b53-a0e7-1ca1d09832b9-kube-api-access-4h8fq\") pod \"router-default-5444994796-8ng22\" (UID: \"22ac0012-d7d1-4b53-a0e7-1ca1d09832b9\") " pod="openshift-ingress/router-default-5444994796-8ng22" Dec 02 09:22:54 crc kubenswrapper[4781]: E1202 09:22:54.734025 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:22:55.233870842 +0000 UTC m=+138.057744721 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.734265 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a4f5362-7618-4d15-8be9-465e83bdb0b9-config\") pod \"etcd-operator-b45778765-tr4x7\" (UID: \"2a4f5362-7618-4d15-8be9-465e83bdb0b9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tr4x7" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.734729 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6e5395e3-0578-419d-8297-bdcf2cc6ae01-srv-cert\") pod \"catalog-operator-68c6474976-wjx4q\" (UID: \"6e5395e3-0578-419d-8297-bdcf2cc6ae01\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wjx4q" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.739866 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79f1ffd5-bf7e-4303-81c6-72cd48e9ae46-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lt8cm\" (UID: \"79f1ffd5-bf7e-4303-81c6-72cd48e9ae46\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lt8cm" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.742232 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzvnd\" 
(UniqueName: \"kubernetes.io/projected/da84b89e-515f-4595-badf-a13b1ce0342a-kube-api-access-bzvnd\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.742261 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdcq7\" (UniqueName: \"kubernetes.io/projected/dc3224e4-715f-4f5d-99a0-5325d9b87fb8-kube-api-access-tdcq7\") pod \"package-server-manager-789f6589d5-lbdsz\" (UID: \"dc3224e4-715f-4f5d-99a0-5325d9b87fb8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lbdsz" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.743020 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2a4f5362-7618-4d15-8be9-465e83bdb0b9-etcd-client\") pod \"etcd-operator-b45778765-tr4x7\" (UID: \"2a4f5362-7618-4d15-8be9-465e83bdb0b9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tr4x7" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.743061 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/da84b89e-515f-4595-badf-a13b1ce0342a-registry-certificates\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.743085 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6e5395e3-0578-419d-8297-bdcf2cc6ae01-profile-collector-cert\") pod \"catalog-operator-68c6474976-wjx4q\" (UID: \"6e5395e3-0578-419d-8297-bdcf2cc6ae01\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wjx4q" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.743106 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glwth\" (UniqueName: \"kubernetes.io/projected/9c00ba40-b027-444f-aa39-f8f0cbbd13cd-kube-api-access-glwth\") pod \"ingress-operator-5b745b69d9-jnbdf\" (UID: \"9c00ba40-b027-444f-aa39-f8f0cbbd13cd\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jnbdf" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.743429 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22ac0012-d7d1-4b53-a0e7-1ca1d09832b9-service-ca-bundle\") pod \"router-default-5444994796-8ng22\" (UID: \"22ac0012-d7d1-4b53-a0e7-1ca1d09832b9\") " pod="openshift-ingress/router-default-5444994796-8ng22" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.844507 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.844759 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/5ba8d87f-de60-4d9c-8f9a-c6a38cd73858-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-8k8z6\" (UID: \"5ba8d87f-de60-4d9c-8f9a-c6a38cd73858\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8k8z6" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.844791 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/22ac0012-d7d1-4b53-a0e7-1ca1d09832b9-metrics-certs\") pod \"router-default-5444994796-8ng22\" (UID: \"22ac0012-d7d1-4b53-a0e7-1ca1d09832b9\") " pod="openshift-ingress/router-default-5444994796-8ng22" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.844810 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4h8fq\" (UniqueName: \"kubernetes.io/projected/22ac0012-d7d1-4b53-a0e7-1ca1d09832b9-kube-api-access-4h8fq\") pod \"router-default-5444994796-8ng22\" (UID: \"22ac0012-d7d1-4b53-a0e7-1ca1d09832b9\") " pod="openshift-ingress/router-default-5444994796-8ng22" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.844834 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/c3b35e49-827a-4445-b46f-ca74f65450c5-node-bootstrap-token\") pod \"machine-config-server-vvqhq\" (UID: \"c3b35e49-827a-4445-b46f-ca74f65450c5\") " pod="openshift-machine-config-operator/machine-config-server-vvqhq" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.844855 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmk8r\" (UniqueName: \"kubernetes.io/projected/1357c532-9360-4644-969d-9cc0407a1569-kube-api-access-bmk8r\") pod \"service-ca-9c57cc56f-sjskm\" (UID: \"1357c532-9360-4644-969d-9cc0407a1569\") " pod="openshift-service-ca/service-ca-9c57cc56f-sjskm" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.844901 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d20e2eeb-837d-4299-b1ce-2044781226f0-registration-dir\") pod \"csi-hostpathplugin-6c7fp\" (UID: \"d20e2eeb-837d-4299-b1ce-2044781226f0\") " pod="hostpath-provisioner/csi-hostpathplugin-6c7fp" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.844978 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ff4d4965-cf17-4d57-8820-fec4107ba62d-apiservice-cert\") pod \"packageserver-d55dfcdfc-zlpfg\" (UID: \"ff4d4965-cf17-4d57-8820-fec4107ba62d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zlpfg" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.845007 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a4f5362-7618-4d15-8be9-465e83bdb0b9-config\") pod \"etcd-operator-b45778765-tr4x7\" (UID: \"2a4f5362-7618-4d15-8be9-465e83bdb0b9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tr4x7" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.845029 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d20e2eeb-837d-4299-b1ce-2044781226f0-socket-dir\") pod \"csi-hostpathplugin-6c7fp\" (UID: 
\"d20e2eeb-837d-4299-b1ce-2044781226f0\") " pod="hostpath-provisioner/csi-hostpathplugin-6c7fp" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.845050 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/822c4f3b-992d-4ac5-aeb5-cc70cce2b305-metrics-tls\") pod \"dns-default-lhfvc\" (UID: \"822c4f3b-992d-4ac5-aeb5-cc70cce2b305\") " pod="openshift-dns/dns-default-lhfvc" Dec 02 09:22:54 crc kubenswrapper[4781]: E1202 09:22:54.845469 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:22:55.345449097 +0000 UTC m=+138.169322976 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.845071 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6e5395e3-0578-419d-8297-bdcf2cc6ae01-srv-cert\") pod \"catalog-operator-68c6474976-wjx4q\" (UID: \"6e5395e3-0578-419d-8297-bdcf2cc6ae01\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wjx4q" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.846122 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79f1ffd5-bf7e-4303-81c6-72cd48e9ae46-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lt8cm\" (UID: \"79f1ffd5-bf7e-4303-81c6-72cd48e9ae46\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lt8cm" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.846165 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzvnd\" (UniqueName: \"kubernetes.io/projected/da84b89e-515f-4595-badf-a13b1ce0342a-kube-api-access-bzvnd\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.846189 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdcq7\" (UniqueName: \"kubernetes.io/projected/dc3224e4-715f-4f5d-99a0-5325d9b87fb8-kube-api-access-tdcq7\") pod \"package-server-manager-789f6589d5-lbdsz\" (UID: \"dc3224e4-715f-4f5d-99a0-5325d9b87fb8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lbdsz" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.846217 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v49v\" (UniqueName: \"kubernetes.io/projected/2f1a331e-d7ef-4ab6-82e3-a5579353a37d-kube-api-access-6v49v\") pod \"ingress-canary-wvgpb\" (UID: \"2f1a331e-d7ef-4ab6-82e3-a5579353a37d\") " pod="openshift-ingress-canary/ingress-canary-wvgpb" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.846248 4781 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2a4f5362-7618-4d15-8be9-465e83bdb0b9-etcd-client\") pod \"etcd-operator-b45778765-tr4x7\" (UID: \"2a4f5362-7618-4d15-8be9-465e83bdb0b9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tr4x7" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.846271 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/c3b35e49-827a-4445-b46f-ca74f65450c5-certs\") pod \"machine-config-server-vvqhq\" (UID: \"c3b35e49-827a-4445-b46f-ca74f65450c5\") " pod="openshift-machine-config-operator/machine-config-server-vvqhq" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.846307 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/da84b89e-515f-4595-badf-a13b1ce0342a-registry-certificates\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.846334 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6e5395e3-0578-419d-8297-bdcf2cc6ae01-profile-collector-cert\") pod \"catalog-operator-68c6474976-wjx4q\" (UID: \"6e5395e3-0578-419d-8297-bdcf2cc6ae01\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wjx4q" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.846358 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glwth\" (UniqueName: \"kubernetes.io/projected/9c00ba40-b027-444f-aa39-f8f0cbbd13cd-kube-api-access-glwth\") pod \"ingress-operator-5b745b69d9-jnbdf\" (UID: \"9c00ba40-b027-444f-aa39-f8f0cbbd13cd\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jnbdf" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.846382 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb7l5\" (UniqueName: \"kubernetes.io/projected/c3b35e49-827a-4445-b46f-ca74f65450c5-kube-api-access-wb7l5\") pod \"machine-config-server-vvqhq\" (UID: \"c3b35e49-827a-4445-b46f-ca74f65450c5\") " pod="openshift-machine-config-operator/machine-config-server-vvqhq" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.846436 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ca4125e-8294-4a1c-9685-23022db2999b-config\") pod \"kube-apiserver-operator-766d6c64bb-n6m8q\" (UID: \"5ca4125e-8294-4a1c-9685-23022db2999b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-n6m8q" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.846460 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22ac0012-d7d1-4b53-a0e7-1ca1d09832b9-service-ca-bundle\") pod \"router-default-5444994796-8ng22\" (UID: \"22ac0012-d7d1-4b53-a0e7-1ca1d09832b9\") " pod="openshift-ingress/router-default-5444994796-8ng22" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.846496 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/1357c532-9360-4644-969d-9cc0407a1569-signing-cabundle\") pod \"service-ca-9c57cc56f-sjskm\" (UID: \"1357c532-9360-4644-969d-9cc0407a1569\") " pod="openshift-service-ca/service-ca-9c57cc56f-sjskm" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.846518 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ca4125e-8294-4a1c-9685-23022db2999b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-n6m8q\" (UID: \"5ca4125e-8294-4a1c-9685-23022db2999b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-n6m8q" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.846597 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ba8d87f-de60-4d9c-8f9a-c6a38cd73858-config\") pod \"kube-controller-manager-operator-78b949d7b-8k8z6\" (UID: \"5ba8d87f-de60-4d9c-8f9a-c6a38cd73858\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8k8z6" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.846654 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/da84b89e-515f-4595-badf-a13b1ce0342a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.846678 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/d20e2eeb-837d-4299-b1ce-2044781226f0-plugins-dir\") pod \"csi-hostpathplugin-6c7fp\" (UID: \"d20e2eeb-837d-4299-b1ce-2044781226f0\") " pod="hostpath-provisioner/csi-hostpathplugin-6c7fp" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.846714 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsr9q\" (UniqueName: \"kubernetes.io/projected/2a4f5362-7618-4d15-8be9-465e83bdb0b9-kube-api-access-tsr9q\") pod \"etcd-operator-b45778765-tr4x7\" (UID: \"2a4f5362-7618-4d15-8be9-465e83bdb0b9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tr4x7" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.846760 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b488n\" (UniqueName: \"kubernetes.io/projected/822c4f3b-992d-4ac5-aeb5-cc70cce2b305-kube-api-access-b488n\") pod \"dns-default-lhfvc\" (UID: \"822c4f3b-992d-4ac5-aeb5-cc70cce2b305\") " pod="openshift-dns/dns-default-lhfvc" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.846810 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9c00ba40-b027-444f-aa39-f8f0cbbd13cd-trusted-ca\") pod \"ingress-operator-5b745b69d9-jnbdf\" (UID: \"9c00ba40-b027-444f-aa39-f8f0cbbd13cd\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jnbdf" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.846835 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pc7m\" (UniqueName: \"kubernetes.io/projected/70c23dd9-b165-4f35-9230-da18a16f48be-kube-api-access-4pc7m\") pod \"collect-profiles-29411115-gzjfg\" (UID: \"70c23dd9-b165-4f35-9230-da18a16f48be\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29411115-gzjfg" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.846857 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ns44\" (UniqueName: \"kubernetes.io/projected/6e5395e3-0578-419d-8297-bdcf2cc6ae01-kube-api-access-7ns44\") pod \"catalog-operator-68c6474976-wjx4q\" (UID: \"6e5395e3-0578-419d-8297-bdcf2cc6ae01\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wjx4q" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.846883 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lzst\" (UniqueName: \"kubernetes.io/projected/50d4d4c8-66e1-4d10-85cb-0fee6079d5fe-kube-api-access-9lzst\") pod \"control-plane-machine-set-operator-78cbb6b69f-nm2f4\" (UID: \"50d4d4c8-66e1-4d10-85cb-0fee6079d5fe\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nm2f4" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.846910 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84cgn\" (UniqueName: \"kubernetes.io/projected/ff4d4965-cf17-4d57-8820-fec4107ba62d-kube-api-access-84cgn\") pod \"packageserver-d55dfcdfc-zlpfg\" (UID: \"ff4d4965-cf17-4d57-8820-fec4107ba62d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zlpfg" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.846955 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwspl\" (UniqueName: \"kubernetes.io/projected/d20e2eeb-837d-4299-b1ce-2044781226f0-kube-api-access-nwspl\") pod \"csi-hostpathplugin-6c7fp\" (UID: \"d20e2eeb-837d-4299-b1ce-2044781226f0\") " pod="hostpath-provisioner/csi-hostpathplugin-6c7fp" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.846988 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/dc3224e4-715f-4f5d-99a0-5325d9b87fb8-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-lbdsz\" (UID: \"dc3224e4-715f-4f5d-99a0-5325d9b87fb8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lbdsz" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.847013 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/79f1ffd5-bf7e-4303-81c6-72cd48e9ae46-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lt8cm\" (UID: \"79f1ffd5-bf7e-4303-81c6-72cd48e9ae46\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lt8cm" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.847099 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/2a4f5362-7618-4d15-8be9-465e83bdb0b9-etcd-service-ca\") pod \"etcd-operator-b45778765-tr4x7\" (UID: \"2a4f5362-7618-4d15-8be9-465e83bdb0b9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tr4x7" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.847186 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2f1a331e-d7ef-4ab6-82e3-a5579353a37d-cert\") pod \"ingress-canary-wvgpb\" (UID: \"2f1a331e-d7ef-4ab6-82e3-a5579353a37d\") " 
pod="openshift-ingress-canary/ingress-canary-wvgpb" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.847224 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/d20e2eeb-837d-4299-b1ce-2044781226f0-csi-data-dir\") pod \"csi-hostpathplugin-6c7fp\" (UID: \"d20e2eeb-837d-4299-b1ce-2044781226f0\") " pod="hostpath-provisioner/csi-hostpathplugin-6c7fp" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.847250 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9c00ba40-b027-444f-aa39-f8f0cbbd13cd-metrics-tls\") pod \"ingress-operator-5b745b69d9-jnbdf\" (UID: \"9c00ba40-b027-444f-aa39-f8f0cbbd13cd\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jnbdf" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.847303 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/822c4f3b-992d-4ac5-aeb5-cc70cce2b305-config-volume\") pod \"dns-default-lhfvc\" (UID: \"822c4f3b-992d-4ac5-aeb5-cc70cce2b305\") " pod="openshift-dns/dns-default-lhfvc" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.847353 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/da84b89e-515f-4595-badf-a13b1ce0342a-installation-pull-secrets\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.847362 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/da84b89e-515f-4595-badf-a13b1ce0342a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.847377 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/da84b89e-515f-4595-badf-a13b1ce0342a-bound-sa-token\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.847398 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/ff4d4965-cf17-4d57-8820-fec4107ba62d-tmpfs\") pod \"packageserver-d55dfcdfc-zlpfg\" (UID: \"ff4d4965-cf17-4d57-8820-fec4107ba62d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zlpfg" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.847427 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/70c23dd9-b165-4f35-9230-da18a16f48be-secret-volume\") pod \"collect-profiles-29411115-gzjfg\" (UID: \"70c23dd9-b165-4f35-9230-da18a16f48be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411115-gzjfg" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.847448 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/1357c532-9360-4644-969d-9cc0407a1569-signing-key\") pod \"service-ca-9c57cc56f-sjskm\" (UID: \"1357c532-9360-4644-969d-9cc0407a1569\") " pod="openshift-service-ca/service-ca-9c57cc56f-sjskm" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.847492 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vz8g\" (UniqueName: \"kubernetes.io/projected/d478d486-3666-4aae-967f-36d60f4de521-kube-api-access-2vz8g\") pod \"multus-admission-controller-857f4d67dd-zmcr4\" (UID: \"d478d486-3666-4aae-967f-36d60f4de521\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zmcr4" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.847514 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5ca4125e-8294-4a1c-9685-23022db2999b-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-n6m8q\" (UID: \"5ca4125e-8294-4a1c-9685-23022db2999b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-n6m8q" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.847551 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/2a4f5362-7618-4d15-8be9-465e83bdb0b9-etcd-ca\") pod \"etcd-operator-b45778765-tr4x7\" (UID: \"2a4f5362-7618-4d15-8be9-465e83bdb0b9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tr4x7" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.847577 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/22ac0012-d7d1-4b53-a0e7-1ca1d09832b9-default-certificate\") pod \"router-default-5444994796-8ng22\" (UID: \"22ac0012-d7d1-4b53-a0e7-1ca1d09832b9\") " pod="openshift-ingress/router-default-5444994796-8ng22" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.847613 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ba8d87f-de60-4d9c-8f9a-c6a38cd73858-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-8k8z6\" (UID: \"5ba8d87f-de60-4d9c-8f9a-c6a38cd73858\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8k8z6" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.847639 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79f1ffd5-bf7e-4303-81c6-72cd48e9ae46-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lt8cm\" (UID: \"79f1ffd5-bf7e-4303-81c6-72cd48e9ae46\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lt8cm" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.847661 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/22ac0012-d7d1-4b53-a0e7-1ca1d09832b9-stats-auth\") pod \"router-default-5444994796-8ng22\" (UID: \"22ac0012-d7d1-4b53-a0e7-1ca1d09832b9\") " pod="openshift-ingress/router-default-5444994796-8ng22" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.847681 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d478d486-3666-4aae-967f-36d60f4de521-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-zmcr4\" (UID: 
\"d478d486-3666-4aae-967f-36d60f4de521\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zmcr4" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.847720 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/50d4d4c8-66e1-4d10-85cb-0fee6079d5fe-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-nm2f4\" (UID: \"50d4d4c8-66e1-4d10-85cb-0fee6079d5fe\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nm2f4" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.847750 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.847771 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/d20e2eeb-837d-4299-b1ce-2044781226f0-mountpoint-dir\") pod \"csi-hostpathplugin-6c7fp\" (UID: \"d20e2eeb-837d-4299-b1ce-2044781226f0\") " pod="hostpath-provisioner/csi-hostpathplugin-6c7fp" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.847806 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/70c23dd9-b165-4f35-9230-da18a16f48be-config-volume\") pod \"collect-profiles-29411115-gzjfg\" (UID: \"70c23dd9-b165-4f35-9230-da18a16f48be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411115-gzjfg" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.847826 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9c00ba40-b027-444f-aa39-f8f0cbbd13cd-bound-sa-token\") pod \"ingress-operator-5b745b69d9-jnbdf\" (UID: \"9c00ba40-b027-444f-aa39-f8f0cbbd13cd\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jnbdf" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.847863 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2a4f5362-7618-4d15-8be9-465e83bdb0b9-serving-cert\") pod \"etcd-operator-b45778765-tr4x7\" (UID: \"2a4f5362-7618-4d15-8be9-465e83bdb0b9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tr4x7" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.847884 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ff4d4965-cf17-4d57-8820-fec4107ba62d-webhook-cert\") pod \"packageserver-d55dfcdfc-zlpfg\" (UID: \"ff4d4965-cf17-4d57-8820-fec4107ba62d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zlpfg" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.847906 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/da84b89e-515f-4595-badf-a13b1ce0342a-trusted-ca\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:54 crc 
kubenswrapper[4781]: I1202 09:22:54.847961 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/da84b89e-515f-4595-badf-a13b1ce0342a-registry-tls\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.848543 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a4f5362-7618-4d15-8be9-465e83bdb0b9-config\") pod \"etcd-operator-b45778765-tr4x7\" (UID: \"2a4f5362-7618-4d15-8be9-465e83bdb0b9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tr4x7" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.849051 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22ac0012-d7d1-4b53-a0e7-1ca1d09832b9-service-ca-bundle\") pod \"router-default-5444994796-8ng22\" (UID: \"22ac0012-d7d1-4b53-a0e7-1ca1d09832b9\") " pod="openshift-ingress/router-default-5444994796-8ng22" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.850122 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ba8d87f-de60-4d9c-8f9a-c6a38cd73858-config\") pod \"kube-controller-manager-operator-78b949d7b-8k8z6\" (UID: \"5ba8d87f-de60-4d9c-8f9a-c6a38cd73858\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8k8z6" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.851782 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/70c23dd9-b165-4f35-9230-da18a16f48be-config-volume\") pod \"collect-profiles-29411115-gzjfg\" (UID: \"70c23dd9-b165-4f35-9230-da18a16f48be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411115-gzjfg" Dec 02 09:22:54 crc kubenswrapper[4781]: E1202 09:22:54.852640 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:22:55.352624418 +0000 UTC m=+138.176498377 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.853510 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79f1ffd5-bf7e-4303-81c6-72cd48e9ae46-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lt8cm\" (UID: \"79f1ffd5-bf7e-4303-81c6-72cd48e9ae46\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lt8cm" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.853584 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/da84b89e-515f-4595-badf-a13b1ce0342a-trusted-ca\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.854024 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/2a4f5362-7618-4d15-8be9-465e83bdb0b9-etcd-service-ca\") pod \"etcd-operator-b45778765-tr4x7\" (UID: \"2a4f5362-7618-4d15-8be9-465e83bdb0b9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tr4x7" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.855517 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9c00ba40-b027-444f-aa39-f8f0cbbd13cd-trusted-ca\") pod \"ingress-operator-5b745b69d9-jnbdf\" (UID: \"9c00ba40-b027-444f-aa39-f8f0cbbd13cd\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jnbdf" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.863793 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/da84b89e-515f-4595-badf-a13b1ce0342a-registry-tls\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.873509 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/70c23dd9-b165-4f35-9230-da18a16f48be-secret-volume\") pod \"collect-profiles-29411115-gzjfg\" (UID: \"70c23dd9-b165-4f35-9230-da18a16f48be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411115-gzjfg" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.874033 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/2a4f5362-7618-4d15-8be9-465e83bdb0b9-etcd-ca\") pod \"etcd-operator-b45778765-tr4x7\" (UID: \"2a4f5362-7618-4d15-8be9-465e83bdb0b9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tr4x7" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.874906 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9c00ba40-b027-444f-aa39-f8f0cbbd13cd-metrics-tls\") pod \"ingress-operator-5b745b69d9-jnbdf\" 
(UID: \"9c00ba40-b027-444f-aa39-f8f0cbbd13cd\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jnbdf" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.875371 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/22ac0012-d7d1-4b53-a0e7-1ca1d09832b9-default-certificate\") pod \"router-default-5444994796-8ng22\" (UID: \"22ac0012-d7d1-4b53-a0e7-1ca1d09832b9\") " pod="openshift-ingress/router-default-5444994796-8ng22" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.875471 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/dc3224e4-715f-4f5d-99a0-5325d9b87fb8-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-lbdsz\" (UID: \"dc3224e4-715f-4f5d-99a0-5325d9b87fb8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lbdsz" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.876384 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/da84b89e-515f-4595-badf-a13b1ce0342a-registry-certificates\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.880849 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/da84b89e-515f-4595-badf-a13b1ce0342a-installation-pull-secrets\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.880972 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/22ac0012-d7d1-4b53-a0e7-1ca1d09832b9-metrics-certs\") pod \"router-default-5444994796-8ng22\" (UID: \"22ac0012-d7d1-4b53-a0e7-1ca1d09832b9\") " pod="openshift-ingress/router-default-5444994796-8ng22" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.881078 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glwth\" (UniqueName: \"kubernetes.io/projected/9c00ba40-b027-444f-aa39-f8f0cbbd13cd-kube-api-access-glwth\") pod \"ingress-operator-5b745b69d9-jnbdf\" (UID: \"9c00ba40-b027-444f-aa39-f8f0cbbd13cd\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jnbdf" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.881446 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6e5395e3-0578-419d-8297-bdcf2cc6ae01-srv-cert\") pod \"catalog-operator-68c6474976-wjx4q\" (UID: \"6e5395e3-0578-419d-8297-bdcf2cc6ae01\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wjx4q" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.881779 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2a4f5362-7618-4d15-8be9-465e83bdb0b9-serving-cert\") pod \"etcd-operator-b45778765-tr4x7\" (UID: \"2a4f5362-7618-4d15-8be9-465e83bdb0b9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tr4x7" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.881910 4781 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79f1ffd5-bf7e-4303-81c6-72cd48e9ae46-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lt8cm\" (UID: \"79f1ffd5-bf7e-4303-81c6-72cd48e9ae46\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lt8cm" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.882131 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d478d486-3666-4aae-967f-36d60f4de521-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-zmcr4\" (UID: \"d478d486-3666-4aae-967f-36d60f4de521\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zmcr4" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.883569 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6e5395e3-0578-419d-8297-bdcf2cc6ae01-profile-collector-cert\") pod \"catalog-operator-68c6474976-wjx4q\" (UID: \"6e5395e3-0578-419d-8297-bdcf2cc6ae01\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wjx4q" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.883709 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pc7m\" (UniqueName: \"kubernetes.io/projected/70c23dd9-b165-4f35-9230-da18a16f48be-kube-api-access-4pc7m\") pod \"collect-profiles-29411115-gzjfg\" (UID: \"70c23dd9-b165-4f35-9230-da18a16f48be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411115-gzjfg" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.884500 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/22ac0012-d7d1-4b53-a0e7-1ca1d09832b9-stats-auth\") pod \"router-default-5444994796-8ng22\" (UID: \"22ac0012-d7d1-4b53-a0e7-1ca1d09832b9\") " pod="openshift-ingress/router-default-5444994796-8ng22" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.884804 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2a4f5362-7618-4d15-8be9-465e83bdb0b9-etcd-client\") pod \"etcd-operator-b45778765-tr4x7\" (UID: \"2a4f5362-7618-4d15-8be9-465e83bdb0b9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tr4x7" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.884990 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/50d4d4c8-66e1-4d10-85cb-0fee6079d5fe-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-nm2f4\" (UID: \"50d4d4c8-66e1-4d10-85cb-0fee6079d5fe\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nm2f4" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.885566 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ns44\" (UniqueName: \"kubernetes.io/projected/6e5395e3-0578-419d-8297-bdcf2cc6ae01-kube-api-access-7ns44\") pod \"catalog-operator-68c6474976-wjx4q\" (UID: \"6e5395e3-0578-419d-8297-bdcf2cc6ae01\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wjx4q" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.886181 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsr9q\" (UniqueName: \"kubernetes.io/projected/2a4f5362-7618-4d15-8be9-465e83bdb0b9-kube-api-access-tsr9q\") pod 
\"etcd-operator-b45778765-tr4x7\" (UID: \"2a4f5362-7618-4d15-8be9-465e83bdb0b9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tr4x7" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.888485 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5ba8d87f-de60-4d9c-8f9a-c6a38cd73858-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-8k8z6\" (UID: \"5ba8d87f-de60-4d9c-8f9a-c6a38cd73858\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8k8z6" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.890060 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ba8d87f-de60-4d9c-8f9a-c6a38cd73858-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-8k8z6\" (UID: \"5ba8d87f-de60-4d9c-8f9a-c6a38cd73858\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8k8z6" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.896683 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8k8z6" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.903229 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h8fq\" (UniqueName: \"kubernetes.io/projected/22ac0012-d7d1-4b53-a0e7-1ca1d09832b9-kube-api-access-4h8fq\") pod \"router-default-5444994796-8ng22\" (UID: \"22ac0012-d7d1-4b53-a0e7-1ca1d09832b9\") " pod="openshift-ingress/router-default-5444994796-8ng22" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.903304 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lzst\" (UniqueName: \"kubernetes.io/projected/50d4d4c8-66e1-4d10-85cb-0fee6079d5fe-kube-api-access-9lzst\") pod \"control-plane-machine-set-operator-78cbb6b69f-nm2f4\" (UID: \"50d4d4c8-66e1-4d10-85cb-0fee6079d5fe\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nm2f4" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.912539 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wjx4q" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.925574 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-8ng22" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.928334 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/79f1ffd5-bf7e-4303-81c6-72cd48e9ae46-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lt8cm\" (UID: \"79f1ffd5-bf7e-4303-81c6-72cd48e9ae46\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lt8cm" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.934580 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lt8cm" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.952893 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.953104 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b488n\" (UniqueName: \"kubernetes.io/projected/822c4f3b-992d-4ac5-aeb5-cc70cce2b305-kube-api-access-b488n\") pod \"dns-default-lhfvc\" (UID: \"822c4f3b-992d-4ac5-aeb5-cc70cce2b305\") " pod="openshift-dns/dns-default-lhfvc" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.953139 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84cgn\" (UniqueName: \"kubernetes.io/projected/ff4d4965-cf17-4d57-8820-fec4107ba62d-kube-api-access-84cgn\") pod \"packageserver-d55dfcdfc-zlpfg\" (UID: \"ff4d4965-cf17-4d57-8820-fec4107ba62d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zlpfg" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.953164 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwspl\" (UniqueName: \"kubernetes.io/projected/d20e2eeb-837d-4299-b1ce-2044781226f0-kube-api-access-nwspl\") pod \"csi-hostpathplugin-6c7fp\" (UID: \"d20e2eeb-837d-4299-b1ce-2044781226f0\") " pod="hostpath-provisioner/csi-hostpathplugin-6c7fp" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.953197 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2f1a331e-d7ef-4ab6-82e3-a5579353a37d-cert\") pod \"ingress-canary-wvgpb\" (UID: \"2f1a331e-d7ef-4ab6-82e3-a5579353a37d\") " pod="openshift-ingress-canary/ingress-canary-wvgpb" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.953228 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/d20e2eeb-837d-4299-b1ce-2044781226f0-csi-data-dir\") pod \"csi-hostpathplugin-6c7fp\" (UID: \"d20e2eeb-837d-4299-b1ce-2044781226f0\") " pod="hostpath-provisioner/csi-hostpathplugin-6c7fp" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.953254 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/822c4f3b-992d-4ac5-aeb5-cc70cce2b305-config-volume\") pod \"dns-default-lhfvc\" (UID: \"822c4f3b-992d-4ac5-aeb5-cc70cce2b305\") " pod="openshift-dns/dns-default-lhfvc" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.953288 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/ff4d4965-cf17-4d57-8820-fec4107ba62d-tmpfs\") pod \"packageserver-d55dfcdfc-zlpfg\" (UID: \"ff4d4965-cf17-4d57-8820-fec4107ba62d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zlpfg" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.953315 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1357c532-9360-4644-969d-9cc0407a1569-signing-key\") pod \"service-ca-9c57cc56f-sjskm\" (UID: 
\"1357c532-9360-4644-969d-9cc0407a1569\") " pod="openshift-service-ca/service-ca-9c57cc56f-sjskm" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.953352 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5ca4125e-8294-4a1c-9685-23022db2999b-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-n6m8q\" (UID: \"5ca4125e-8294-4a1c-9685-23022db2999b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-n6m8q" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.953389 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/d20e2eeb-837d-4299-b1ce-2044781226f0-mountpoint-dir\") pod \"csi-hostpathplugin-6c7fp\" (UID: \"d20e2eeb-837d-4299-b1ce-2044781226f0\") " pod="hostpath-provisioner/csi-hostpathplugin-6c7fp" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.953432 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ff4d4965-cf17-4d57-8820-fec4107ba62d-webhook-cert\") pod \"packageserver-d55dfcdfc-zlpfg\" (UID: \"ff4d4965-cf17-4d57-8820-fec4107ba62d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zlpfg" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.953490 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/c3b35e49-827a-4445-b46f-ca74f65450c5-node-bootstrap-token\") pod \"machine-config-server-vvqhq\" (UID: \"c3b35e49-827a-4445-b46f-ca74f65450c5\") " pod="openshift-machine-config-operator/machine-config-server-vvqhq" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.953515 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmk8r\" (UniqueName: \"kubernetes.io/projected/1357c532-9360-4644-969d-9cc0407a1569-kube-api-access-bmk8r\") pod \"service-ca-9c57cc56f-sjskm\" (UID: \"1357c532-9360-4644-969d-9cc0407a1569\") " pod="openshift-service-ca/service-ca-9c57cc56f-sjskm" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.953552 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ff4d4965-cf17-4d57-8820-fec4107ba62d-apiservice-cert\") pod \"packageserver-d55dfcdfc-zlpfg\" (UID: \"ff4d4965-cf17-4d57-8820-fec4107ba62d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zlpfg" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.953572 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d20e2eeb-837d-4299-b1ce-2044781226f0-registration-dir\") pod \"csi-hostpathplugin-6c7fp\" (UID: \"d20e2eeb-837d-4299-b1ce-2044781226f0\") " pod="hostpath-provisioner/csi-hostpathplugin-6c7fp" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.953596 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d20e2eeb-837d-4299-b1ce-2044781226f0-socket-dir\") pod \"csi-hostpathplugin-6c7fp\" (UID: \"d20e2eeb-837d-4299-b1ce-2044781226f0\") " pod="hostpath-provisioner/csi-hostpathplugin-6c7fp" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.953619 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/822c4f3b-992d-4ac5-aeb5-cc70cce2b305-metrics-tls\") pod \"dns-default-lhfvc\" (UID: \"822c4f3b-992d-4ac5-aeb5-cc70cce2b305\") " pod="openshift-dns/dns-default-lhfvc" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.953663 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v49v\" (UniqueName: \"kubernetes.io/projected/2f1a331e-d7ef-4ab6-82e3-a5579353a37d-kube-api-access-6v49v\") pod \"ingress-canary-wvgpb\" (UID: \"2f1a331e-d7ef-4ab6-82e3-a5579353a37d\") " pod="openshift-ingress-canary/ingress-canary-wvgpb" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.953688 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/c3b35e49-827a-4445-b46f-ca74f65450c5-certs\") pod \"machine-config-server-vvqhq\" (UID: \"c3b35e49-827a-4445-b46f-ca74f65450c5\") " pod="openshift-machine-config-operator/machine-config-server-vvqhq" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.953711 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wb7l5\" (UniqueName: \"kubernetes.io/projected/c3b35e49-827a-4445-b46f-ca74f65450c5-kube-api-access-wb7l5\") pod \"machine-config-server-vvqhq\" (UID: \"c3b35e49-827a-4445-b46f-ca74f65450c5\") " pod="openshift-machine-config-operator/machine-config-server-vvqhq" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.953748 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ca4125e-8294-4a1c-9685-23022db2999b-config\") pod \"kube-apiserver-operator-766d6c64bb-n6m8q\" (UID: \"5ca4125e-8294-4a1c-9685-23022db2999b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-n6m8q" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.953775 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1357c532-9360-4644-969d-9cc0407a1569-signing-cabundle\") pod \"service-ca-9c57cc56f-sjskm\" (UID: \"1357c532-9360-4644-969d-9cc0407a1569\") " pod="openshift-service-ca/service-ca-9c57cc56f-sjskm" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.953795 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ca4125e-8294-4a1c-9685-23022db2999b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-n6m8q\" (UID: \"5ca4125e-8294-4a1c-9685-23022db2999b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-n6m8q" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.953830 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/d20e2eeb-837d-4299-b1ce-2044781226f0-plugins-dir\") pod \"csi-hostpathplugin-6c7fp\" (UID: \"d20e2eeb-837d-4299-b1ce-2044781226f0\") " pod="hostpath-provisioner/csi-hostpathplugin-6c7fp" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.953988 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9c00ba40-b027-444f-aa39-f8f0cbbd13cd-bound-sa-token\") pod \"ingress-operator-5b745b69d9-jnbdf\" (UID: \"9c00ba40-b027-444f-aa39-f8f0cbbd13cd\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jnbdf" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.954172 4781 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/d20e2eeb-837d-4299-b1ce-2044781226f0-plugins-dir\") pod \"csi-hostpathplugin-6c7fp\" (UID: \"d20e2eeb-837d-4299-b1ce-2044781226f0\") " pod="hostpath-provisioner/csi-hostpathplugin-6c7fp" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.954844 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d20e2eeb-837d-4299-b1ce-2044781226f0-registration-dir\") pod \"csi-hostpathplugin-6c7fp\" (UID: \"d20e2eeb-837d-4299-b1ce-2044781226f0\") " pod="hostpath-provisioner/csi-hostpathplugin-6c7fp" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.954865 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d20e2eeb-837d-4299-b1ce-2044781226f0-socket-dir\") pod \"csi-hostpathplugin-6c7fp\" (UID: \"d20e2eeb-837d-4299-b1ce-2044781226f0\") " pod="hostpath-provisioner/csi-hostpathplugin-6c7fp" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.955960 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ca4125e-8294-4a1c-9685-23022db2999b-config\") pod \"kube-apiserver-operator-766d6c64bb-n6m8q\" (UID: \"5ca4125e-8294-4a1c-9685-23022db2999b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-n6m8q" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.956912 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1357c532-9360-4644-969d-9cc0407a1569-signing-cabundle\") pod \"service-ca-9c57cc56f-sjskm\" (UID: \"1357c532-9360-4644-969d-9cc0407a1569\") " pod="openshift-service-ca/service-ca-9c57cc56f-sjskm" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.957432 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/d20e2eeb-837d-4299-b1ce-2044781226f0-csi-data-dir\") pod \"csi-hostpathplugin-6c7fp\" (UID: \"d20e2eeb-837d-4299-b1ce-2044781226f0\") " pod="hostpath-provisioner/csi-hostpathplugin-6c7fp" Dec 02 09:22:54 crc kubenswrapper[4781]: E1202 09:22:54.957521 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:22:55.457503296 +0000 UTC m=+138.281377245 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.960543 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ff4d4965-cf17-4d57-8820-fec4107ba62d-webhook-cert\") pod \"packageserver-d55dfcdfc-zlpfg\" (UID: \"ff4d4965-cf17-4d57-8820-fec4107ba62d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zlpfg" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.960762 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/d20e2eeb-837d-4299-b1ce-2044781226f0-mountpoint-dir\") pod \"csi-hostpathplugin-6c7fp\" (UID: \"d20e2eeb-837d-4299-b1ce-2044781226f0\") " pod="hostpath-provisioner/csi-hostpathplugin-6c7fp" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.962813 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/822c4f3b-992d-4ac5-aeb5-cc70cce2b305-config-volume\") pod \"dns-default-lhfvc\" (UID: \"822c4f3b-992d-4ac5-aeb5-cc70cce2b305\") " pod="openshift-dns/dns-default-lhfvc" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.964056 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/ff4d4965-cf17-4d57-8820-fec4107ba62d-tmpfs\") pod \"packageserver-d55dfcdfc-zlpfg\" (UID: \"ff4d4965-cf17-4d57-8820-fec4107ba62d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zlpfg" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.967202 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzvnd\" (UniqueName: \"kubernetes.io/projected/da84b89e-515f-4595-badf-a13b1ce0342a-kube-api-access-bzvnd\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.969039 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2f1a331e-d7ef-4ab6-82e3-a5579353a37d-cert\") pod \"ingress-canary-wvgpb\" (UID: \"2f1a331e-d7ef-4ab6-82e3-a5579353a37d\") " pod="openshift-ingress-canary/ingress-canary-wvgpb" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.970374 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/822c4f3b-992d-4ac5-aeb5-cc70cce2b305-metrics-tls\") pod \"dns-default-lhfvc\" (UID: \"822c4f3b-992d-4ac5-aeb5-cc70cce2b305\") " pod="openshift-dns/dns-default-lhfvc" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.972504 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/c3b35e49-827a-4445-b46f-ca74f65450c5-certs\") pod \"machine-config-server-vvqhq\" (UID: \"c3b35e49-827a-4445-b46f-ca74f65450c5\") " pod="openshift-machine-config-operator/machine-config-server-vvqhq" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 
09:22:54.973188 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ca4125e-8294-4a1c-9685-23022db2999b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-n6m8q\" (UID: \"5ca4125e-8294-4a1c-9685-23022db2999b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-n6m8q" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.974113 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ff4d4965-cf17-4d57-8820-fec4107ba62d-apiservice-cert\") pod \"packageserver-d55dfcdfc-zlpfg\" (UID: \"ff4d4965-cf17-4d57-8820-fec4107ba62d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zlpfg" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.975036 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/c3b35e49-827a-4445-b46f-ca74f65450c5-node-bootstrap-token\") pod \"machine-config-server-vvqhq\" (UID: \"c3b35e49-827a-4445-b46f-ca74f65450c5\") " pod="openshift-machine-config-operator/machine-config-server-vvqhq" Dec 02 09:22:54 crc kubenswrapper[4781]: I1202 09:22:54.978965 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1357c532-9360-4644-969d-9cc0407a1569-signing-key\") pod \"service-ca-9c57cc56f-sjskm\" (UID: \"1357c532-9360-4644-969d-9cc0407a1569\") " pod="openshift-service-ca/service-ca-9c57cc56f-sjskm" Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.014711 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdcq7\" (UniqueName: \"kubernetes.io/projected/dc3224e4-715f-4f5d-99a0-5325d9b87fb8-kube-api-access-tdcq7\") pod \"package-server-manager-789f6589d5-lbdsz\" (UID: \"dc3224e4-715f-4f5d-99a0-5325d9b87fb8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lbdsz" Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.020638 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/da84b89e-515f-4595-badf-a13b1ce0342a-bound-sa-token\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.043782 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vz8g\" (UniqueName: \"kubernetes.io/projected/d478d486-3666-4aae-967f-36d60f4de521-kube-api-access-2vz8g\") pod \"multus-admission-controller-857f4d67dd-zmcr4\" (UID: \"d478d486-3666-4aae-967f-36d60f4de521\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zmcr4" Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.046242 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-l6hrl"] Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.054866 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:55 crc kubenswrapper[4781]: 
E1202 09:22:55.055238 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:22:55.555224773 +0000 UTC m=+138.379098652 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.063514 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v49v\" (UniqueName: \"kubernetes.io/projected/2f1a331e-d7ef-4ab6-82e3-a5579353a37d-kube-api-access-6v49v\") pod \"ingress-canary-wvgpb\" (UID: \"2f1a331e-d7ef-4ab6-82e3-a5579353a37d\") " pod="openshift-ingress-canary/ingress-canary-wvgpb" Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.094854 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-krhf5"] Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.117500 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmk8r\" (UniqueName: \"kubernetes.io/projected/1357c532-9360-4644-969d-9cc0407a1569-kube-api-access-bmk8r\") pod \"service-ca-9c57cc56f-sjskm\" (UID: \"1357c532-9360-4644-969d-9cc0407a1569\") " pod="openshift-service-ca/service-ca-9c57cc56f-sjskm" Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.127526 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wb7l5\" (UniqueName: \"kubernetes.io/projected/c3b35e49-827a-4445-b46f-ca74f65450c5-kube-api-access-wb7l5\") pod \"machine-config-server-vvqhq\" (UID: \"c3b35e49-827a-4445-b46f-ca74f65450c5\") " pod="openshift-machine-config-operator/machine-config-server-vvqhq" Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.127542 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b488n\" (UniqueName: \"kubernetes.io/projected/822c4f3b-992d-4ac5-aeb5-cc70cce2b305-kube-api-access-b488n\") pod \"dns-default-lhfvc\" (UID: \"822c4f3b-992d-4ac5-aeb5-cc70cce2b305\") " pod="openshift-dns/dns-default-lhfvc" Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.141026 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-dpqgc"] Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.149533 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-tr4x7" Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.152387 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xqr9t"] Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.155851 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:22:55 crc kubenswrapper[4781]: E1202 09:22:55.156484 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:22:55.656466799 +0000 UTC m=+138.480340678 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.160434 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-zmcr4" Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.164991 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t6t99" event={"ID":"71324519-4199-4389-87b9-705916c2c5c4","Type":"ContainerStarted","Data":"8df2231b321a978dabbda75b42631b63c4bd581f06e06a5257d68826a651f4d0"} Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.168768 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-gpq68" event={"ID":"b43f11b4-7580-463e-9543-2d38de346fe8","Type":"ContainerStarted","Data":"64c1f381116e4c140fc28a3cc684d09e105aaf1690b7c4609abfdd6056e21fac"} Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.171165 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qq8rf" event={"ID":"88747619-18ec-4e3e-9a0d-1f00bc7c2038","Type":"ContainerStarted","Data":"712ff94323b2b922cc24113c87f9048df1f0826580bf0b5da5426e1762d62041"} Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.173699 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-krhf5" event={"ID":"f87709bf-590d-4318-abdb-7ecb8a86e303","Type":"ContainerStarted","Data":"2f59cb79d7444fe7bdcf62475d54e7a5fec4e3054a42f8373566ab4764a318ce"} Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.174044 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84cgn\" (UniqueName: \"kubernetes.io/projected/ff4d4965-cf17-4d57-8820-fec4107ba62d-kube-api-access-84cgn\") pod \"packageserver-d55dfcdfc-zlpfg\" (UID: \"ff4d4965-cf17-4d57-8820-fec4107ba62d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zlpfg" Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 
09:22:55.175234 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nm2f4" Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.180666 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-8ng22" event={"ID":"22ac0012-d7d1-4b53-a0e7-1ca1d09832b9","Type":"ContainerStarted","Data":"83e38d359c8c754d43f02a53ea248b0459cbaad2ccc329939848e52a2b3ae635"} Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.184967 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411115-gzjfg" Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.186475 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwspl\" (UniqueName: \"kubernetes.io/projected/d20e2eeb-837d-4299-b1ce-2044781226f0-kube-api-access-nwspl\") pod \"csi-hostpathplugin-6c7fp\" (UID: \"d20e2eeb-837d-4299-b1ce-2044781226f0\") " pod="hostpath-provisioner/csi-hostpathplugin-6c7fp" Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.188725 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5ca4125e-8294-4a1c-9685-23022db2999b-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-n6m8q\" (UID: \"5ca4125e-8294-4a1c-9685-23022db2999b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-n6m8q" Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.190394 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-n9cmc" event={"ID":"d9c52f13-f9c6-419e-8f69-ee91e29f4629","Type":"ContainerStarted","Data":"f87ed35b253419f0a5d185149ec4244ea4067a4f59cfd5404f6a633616c85005"} Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.191709 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lbdsz" Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.194890 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-xwf7r" event={"ID":"d88ea951-0553-4510-b1cd-bd6196d1f973","Type":"ContainerStarted","Data":"fc16cc58580d12011f361c8c16f73159da7b74ac7b7263336839f61b609ef7fa"} Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.195755 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4sb6f" event={"ID":"b9509647-5e74-4452-a998-f0f699160a70","Type":"ContainerStarted","Data":"39c88b75644cc570c371c8fb7fcf477867dd867b43eefea762f9373d0fd036b5"} Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.197613 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7n69p" event={"ID":"a635da40-fbc4-4225-a755-dcab98f66a76","Type":"ContainerStarted","Data":"ae55a80a3ebfb8db59ea426da4197295c43c5776965e0e6e2d4c33bc8c610eb9"} Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.199647 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zkswz" event={"ID":"e1b68c39-405a-4419-af7e-4d9bea0189c3","Type":"ContainerStarted","Data":"78278dda3deda9c53147090f3f0ed7b5983a42439a420486689b65a83fb14b39"} Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.200995 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l6hrl" event={"ID":"caa8272b-7968-4b09-b3a7-4aa6d4f12f2b","Type":"ContainerStarted","Data":"5fd0d81f847c6e09911299ba1c4b5d76bfeb205ec77518029e6d1ef75f7c7364"} Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.204005 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jnbdf" Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.241058 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-sjskm" Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.249274 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-n6m8q" Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.260190 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zlpfg" Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.260397 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:55 crc kubenswrapper[4781]: E1202 09:22:55.261293 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:22:55.761276856 +0000 UTC m=+138.585150815 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.277096 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-6c7fp" Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.287023 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wvgpb" Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.299260 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lhfvc" Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.315851 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-vvqhq" Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.361593 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:22:55 crc kubenswrapper[4781]: E1202 09:22:55.362759 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:22:55.862742007 +0000 UTC m=+138.686615876 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.399046 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8nxjr"] Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.409063 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-9k9bd"] Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.411389 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6wrt7"] Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.417495 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p8ljp"] Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.464326 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:55 crc kubenswrapper[4781]: E1202 09:22:55.464685 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:22:55.964667602 +0000 UTC m=+138.788541481 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.564937 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:22:55 crc kubenswrapper[4781]: E1202 09:22:55.565271 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:22:56.0652295 +0000 UTC m=+138.889103409 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.565472 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:55 crc kubenswrapper[4781]: E1202 09:22:55.565877 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:22:56.065859957 +0000 UTC m=+138.889733836 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.590536 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wvc5b"] Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.614676 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-gltzh"] Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.626812 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-x4lq4"] Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.667207 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:22:55 crc kubenswrapper[4781]: E1202 09:22:55.668247 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:22:56.168225124 +0000 UTC m=+138.992099003 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.696697 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lt8cm"] Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.709577 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8k8z6"] Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.727317 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wjx4q"] Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.740086 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-68pb7"] Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.744967 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-r42hh"] Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.779006 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:55 crc kubenswrapper[4781]: E1202 09:22:55.779561 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:22:56.279544053 +0000 UTC m=+139.103417932 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.780819 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-cfwrp"] Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.786874 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-w2stl"] Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.880986 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:22:55 crc kubenswrapper[4781]: E1202 09:22:55.881184 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:22:56.381156459 +0000 UTC m=+139.205030328 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.881344 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:55 crc kubenswrapper[4781]: E1202 09:22:55.881751 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:22:56.381734735 +0000 UTC m=+139.205608634 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.916565 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qq8rf" podStartSLOduration=119.916542901 podStartE2EDuration="1m59.916542901s" podCreationTimestamp="2025-12-02 09:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:22:55.914440561 +0000 UTC m=+138.738314430" watchObservedRunningTime="2025-12-02 09:22:55.916542901 +0000 UTC m=+138.740416780" Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.982867 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:22:55 crc kubenswrapper[4781]: E1202 09:22:55.983094 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:22:56.483066143 +0000 UTC m=+139.306940032 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:55 crc kubenswrapper[4781]: I1202 09:22:55.983233 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:55 crc kubenswrapper[4781]: E1202 09:22:55.983648 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:22:56.483632179 +0000 UTC m=+139.307506058 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:56 crc kubenswrapper[4781]: I1202 09:22:56.084319 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:22:56 crc kubenswrapper[4781]: E1202 09:22:56.084475 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:22:56.584424192 +0000 UTC m=+139.408298091 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:56 crc kubenswrapper[4781]: I1202 09:22:56.084720 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:56 crc kubenswrapper[4781]: E1202 09:22:56.085159 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:22:56.585134943 +0000 UTC m=+139.409008822 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:56 crc kubenswrapper[4781]: I1202 09:22:56.186825 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:22:56 crc kubenswrapper[4781]: E1202 09:22:56.187004 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:22:56.686984076 +0000 UTC m=+139.510857955 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:56 crc kubenswrapper[4781]: I1202 09:22:56.187370 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:56 crc kubenswrapper[4781]: E1202 09:22:56.187696 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:22:56.687689295 +0000 UTC m=+139.511563174 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:56 crc kubenswrapper[4781]: I1202 09:22:56.198042 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zkswz" podStartSLOduration=120.198024485 podStartE2EDuration="2m0.198024485s" podCreationTimestamp="2025-12-02 09:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:22:56.196344577 +0000 UTC m=+139.020218466" watchObservedRunningTime="2025-12-02 09:22:56.198024485 +0000 UTC m=+139.021898364" Dec 02 09:22:56 crc kubenswrapper[4781]: I1202 09:22:56.208553 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6wrt7" event={"ID":"4297bd58-caf6-4962-b775-7f454787fa91","Type":"ContainerStarted","Data":"8110030450aa8b62bc6b932412e5a5f6c1844c01337d710f47b7983a0bbf3ddf"} Dec 02 09:22:56 crc kubenswrapper[4781]: I1202 09:22:56.209615 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-x4lq4" event={"ID":"cef8c4dd-6e0f-44c9-8926-72b0d821f823","Type":"ContainerStarted","Data":"6f1dc40bcf8b212a603ef52c7b16bbcd516a6d70c3c596eda85949d8d4401539"} Dec 02 09:22:56 crc kubenswrapper[4781]: I1202 09:22:56.210560 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-gltzh" event={"ID":"36103c10-f353-4b13-8c7a-eeee5e3b4f44","Type":"ContainerStarted","Data":"98f08141268290752a9d53411584e493906129a15312ea2f0c708fa74a58c7c2"} Dec 02 09:22:56 crc kubenswrapper[4781]: I1202 09:22:56.211419 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-9k9bd" event={"ID":"04a6248c-9bb7-4204-a19a-1041d4d06f3e","Type":"ContainerStarted","Data":"cbb73eb94ebb1d2690cb048252735563bba8622df9f80f7990d6274f4f216e2b"} Dec 02 09:22:56 crc kubenswrapper[4781]: I1202 09:22:56.212244 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dpqgc" event={"ID":"83b6450c-e497-4d31-9c9a-e36fbe675a2e","Type":"ContainerStarted","Data":"3e74a956cb186645d47a0a09b832662a2a4291802a8e194c21fba9f556082bc6"} Dec 02 09:22:56 crc kubenswrapper[4781]: I1202 09:22:56.213151 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" event={"ID":"b7602882-7ecf-4ce4-9cfa-3a691b3f9270","Type":"ContainerStarted","Data":"6cd9150045b3a149957ee05f6251338340bbdeff9bb96634f5abadbd8f1bc942"} Dec 02 09:22:56 crc kubenswrapper[4781]: I1202 09:22:56.213964 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wvc5b" event={"ID":"69ddef2d-4afd-4668-b95f-29137b133855","Type":"ContainerStarted","Data":"d91237c88495e0aba6dfa6bd2386f5640095821362be29fa321e00933b0b9df6"} Dec 02 09:22:56 crc kubenswrapper[4781]: I1202 09:22:56.214743 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8nxjr" event={"ID":"c0570ff6-6102-40ae-a68a-b35b77756097","Type":"ContainerStarted","Data":"018bb247284cfd4aa039f8bf3901092031cf2dbf9f30975c35056273833d0424"} Dec 02 09:22:56 crc kubenswrapper[4781]: I1202 09:22:56.216680 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p8ljp" event={"ID":"be701e11-18c0-4541-9685-48bd65c661f2","Type":"ContainerStarted","Data":"5b8c4e21cf2a33a874e8f420791c7c447343f68de392b35e9cd1d5c8efe46478"} Dec 02 09:22:56 crc kubenswrapper[4781]: I1202 09:22:56.288996 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:22:56 crc kubenswrapper[4781]: E1202 09:22:56.289402 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:22:56.789385074 +0000 UTC m=+139.613258953 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:56 crc kubenswrapper[4781]: I1202 09:22:56.304283 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-zmcr4"] Dec 02 09:22:56 crc kubenswrapper[4781]: I1202 09:22:56.308990 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lhfvc"] Dec 02 09:22:56 crc kubenswrapper[4781]: I1202 09:22:56.309167 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-tr4x7"] Dec 02 09:22:56 crc kubenswrapper[4781]: I1202 09:22:56.322348 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-6c7fp"] Dec 02 09:22:56 crc kubenswrapper[4781]: I1202 09:22:56.390076 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:56 crc kubenswrapper[4781]: E1202 09:22:56.392634 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:22:56.892615436 +0000 UTC m=+139.716489375 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:56 crc kubenswrapper[4781]: W1202 09:22:56.395600 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ba8d87f_de60_4d9c_8f9a_c6a38cd73858.slice/crio-ff7774304031cc10ad95428ae476c2843173b379a860110fa56c561df04131ca WatchSource:0}: Error finding container ff7774304031cc10ad95428ae476c2843173b379a860110fa56c561df04131ca: Status 404 returned error can't find the container with id ff7774304031cc10ad95428ae476c2843173b379a860110fa56c561df04131ca Dec 02 09:22:56 crc kubenswrapper[4781]: W1202 09:22:56.398995 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51a8404c_874a_43e4_917a_604110280bd7.slice/crio-94424085ecc775d8596c4a5557126eafa0a0a5bae9d39f53a1624590b3ab55f3 WatchSource:0}: Error finding container 94424085ecc775d8596c4a5557126eafa0a0a5bae9d39f53a1624590b3ab55f3: Status 404 returned error can't find the container with id 94424085ecc775d8596c4a5557126eafa0a0a5bae9d39f53a1624590b3ab55f3 Dec 02 09:22:56 crc kubenswrapper[4781]: W1202 09:22:56.446473 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd478d486_3666_4aae_967f_36d60f4de521.slice/crio-fa54592a5884d115701e5d5d04a84322283cc114010dbe8fe57ea04347e3cb55 WatchSource:0}: Error finding container fa54592a5884d115701e5d5d04a84322283cc114010dbe8fe57ea04347e3cb55: Status 404 returned error can't find the container with id fa54592a5884d115701e5d5d04a84322283cc114010dbe8fe57ea04347e3cb55 Dec 02 09:22:56 crc kubenswrapper[4781]: I1202 09:22:56.495011 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:22:56 crc kubenswrapper[4781]: E1202 09:22:56.495697 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:22:56.995674672 +0000 UTC m=+139.819548551 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:56 crc kubenswrapper[4781]: I1202 09:22:56.599346 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:56 crc kubenswrapper[4781]: E1202 09:22:56.599888 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:22:57.099845861 +0000 UTC m=+139.923719740 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:56 crc kubenswrapper[4781]: I1202 09:22:56.652457 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-jnbdf"] Dec 02 09:22:56 crc kubenswrapper[4781]: I1202 09:22:56.696902 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lbdsz"] Dec 02 09:22:56 crc kubenswrapper[4781]: I1202 09:22:56.700441 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:22:56 crc kubenswrapper[4781]: E1202 09:22:56.700743 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:22:57.200725217 +0000 UTC m=+140.024599096 (durationBeforeRetry 500ms). 
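
Every MountDevice and TearDownAt attempt in this stretch dies at the same gate: the kubelet cannot construct a CSI client because kubevirt.io.hostpath-provisioner is not yet in its in-memory list of node-registered drivers. A minimal sketch of that lookup, assuming a simple map-backed registry (driverRegistry, the client method, and the socket path are illustrative stand-ins, not the kubelet's real types):

    package main

    import (
    	"fmt"
    	"sync"
    )

    // driverRegistry is an illustrative stand-in for the kubelet's in-memory
    // list of registered CSI drivers: driver name -> local socket endpoint,
    // populated only when a node plugin registers itself with the kubelet.
    type driverRegistry struct {
    	mu sync.RWMutex
    	m  map[string]string
    }

    // client fails exactly the way the log does until registration happens.
    func (r *driverRegistry) client(name string) (string, error) {
    	r.mu.RLock()
    	defer r.mu.RUnlock()
    	ep, ok := r.m[name]
    	if !ok {
    		return "", fmt.Errorf("driver name %s not found in the list of registered CSI drivers", name)
    	}
    	return ep, nil
    }

    func main() {
    	reg := &driverRegistry{m: map[string]string{}}
    	if _, err := reg.client("kubevirt.io.hostpath-provisioner"); err != nil {
    		fmt.Println("Error:", err) // same failure as MountDevice/TearDownAt
    	}
    	// Once the hostpath plugin registers (socket path is illustrative):
    	reg.mu.Lock()
    	reg.m["kubevirt.io.hostpath-provisioner"] = "/var/lib/kubelet/plugins/csi-hostpath/csi.sock"
    	reg.mu.Unlock()
    	ep, _ := reg.client("kubevirt.io.hostpath-provisioner")
    	fmt.Println("endpoint:", ep)
    }

Both the mount path (attacher.MountDevice) and the unmount path (Unmounter.TearDownAt) go through this same lookup, which is why two otherwise independent operations fail with an identical message.
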
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:56 crc kubenswrapper[4781]: I1202 09:22:56.730254 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zlpfg"] Dec 02 09:22:56 crc kubenswrapper[4781]: I1202 09:22:56.777142 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-sjskm"] Dec 02 09:22:56 crc kubenswrapper[4781]: W1202 09:22:56.789180 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc3224e4_715f_4f5d_99a0_5325d9b87fb8.slice/crio-b96ee3ae40c320b76ccf0bc0e5239f26e79abdec8b1f704284226a1423923da7 WatchSource:0}: Error finding container b96ee3ae40c320b76ccf0bc0e5239f26e79abdec8b1f704284226a1423923da7: Status 404 returned error can't find the container with id b96ee3ae40c320b76ccf0bc0e5239f26e79abdec8b1f704284226a1423923da7 Dec 02 09:22:56 crc kubenswrapper[4781]: I1202 09:22:56.804497 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:56 crc kubenswrapper[4781]: E1202 09:22:56.804989 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:22:57.304974406 +0000 UTC m=+140.128848295 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:56 crc kubenswrapper[4781]: I1202 09:22:56.905793 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:22:56 crc kubenswrapper[4781]: E1202 09:22:56.906245 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:22:57.406230763 +0000 UTC m=+140.230104642 (durationBeforeRetry 500ms). 
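
The manager.go:1169 warnings are a different, benign symptom of mass pod startup: a cgroup watch event arrives for a crio-<id> slice, but by the time the container is looked up it is not (or no longer) known, so the query returns 404 and the event is dropped. A sketch of tolerating that race, with hypothetical inspect/processWatchEvent helpers:

    package main

    import (
    	"errors"
    	"fmt"
    )

    var errNotFound = errors.New("can't find the container with id")

    // inspect stands in for the runtime query that returned "Status 404":
    // the container the cgroup event referred to is not (or no longer) known.
    func inspect(id string, live map[string]bool) error {
    	if !live[id] {
    		return fmt.Errorf("%w %s", errNotFound, id)
    	}
    	return nil
    }

    // processWatchEvent drops not-found errors instead of failing: cgroup
    // creation events race with the container's actual lifetime, so a miss
    // here is expected noise during mass pod startup.
    func processWatchEvent(id string, live map[string]bool) {
    	err := inspect(id, live)
    	if errors.Is(err, errNotFound) {
    		fmt.Println("W Failed to process watch event:", err)
    		return
    	}
    	fmt.Println("I now watching container", id)
    }

    func main() {
    	live := map[string]bool{"abc123": true}
    	processWatchEvent("ff7774304031cc10", live) // warns, as in the log
    	processWatchEvent("abc123", live)           // proceeds normally
    }
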
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:56 crc kubenswrapper[4781]: I1202 09:22:56.966071 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411115-gzjfg"] Dec 02 09:22:56 crc kubenswrapper[4781]: I1202 09:22:56.971433 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wvgpb"] Dec 02 09:22:56 crc kubenswrapper[4781]: I1202 09:22:56.973800 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nm2f4"] Dec 02 09:22:57 crc kubenswrapper[4781]: I1202 09:22:57.011711 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:57 crc kubenswrapper[4781]: E1202 09:22:57.012113 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:22:57.512096838 +0000 UTC m=+140.335970717 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:57 crc kubenswrapper[4781]: W1202 09:22:57.105090 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70c23dd9_b165_4f35_9230_da18a16f48be.slice/crio-014e3df0363556fbd084f4379078397d1a82139dbb2d47f739c459be813cf33a WatchSource:0}: Error finding container 014e3df0363556fbd084f4379078397d1a82139dbb2d47f739c459be813cf33a: Status 404 returned error can't find the container with id 014e3df0363556fbd084f4379078397d1a82139dbb2d47f739c459be813cf33a Dec 02 09:22:57 crc kubenswrapper[4781]: W1202 09:22:57.108510 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50d4d4c8_66e1_4d10_85cb_0fee6079d5fe.slice/crio-5b4b3f46414a7342b875294d27056a6f113651823e5e7348b0a05974b342d3f9 WatchSource:0}: Error finding container 5b4b3f46414a7342b875294d27056a6f113651823e5e7348b0a05974b342d3f9: Status 404 returned error can't find the container with id 5b4b3f46414a7342b875294d27056a6f113651823e5e7348b0a05974b342d3f9 Dec 02 09:22:57 crc kubenswrapper[4781]: I1202 09:22:57.112274 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:22:57 crc kubenswrapper[4781]: E1202 09:22:57.112392 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:22:57.612371088 +0000 UTC m=+140.436244967 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:57 crc kubenswrapper[4781]: I1202 09:22:57.112545 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:57 crc kubenswrapper[4781]: E1202 09:22:57.112867 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-02 09:22:57.612855211 +0000 UTC m=+140.436729090 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:57 crc kubenswrapper[4781]: W1202 09:22:57.114503 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f1a331e_d7ef_4ab6_82e3_a5579353a37d.slice/crio-61cc0502bef655499ce4fb9e8222cc689fe663dd2bdf913e50f3bbeb9792c2ea WatchSource:0}: Error finding container 61cc0502bef655499ce4fb9e8222cc689fe663dd2bdf913e50f3bbeb9792c2ea: Status 404 returned error can't find the container with id 61cc0502bef655499ce4fb9e8222cc689fe663dd2bdf913e50f3bbeb9792c2ea Dec 02 09:22:57 crc kubenswrapper[4781]: I1202 09:22:57.140765 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-n6m8q"] Dec 02 09:22:57 crc kubenswrapper[4781]: I1202 09:22:57.213574 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:22:57 crc kubenswrapper[4781]: E1202 09:22:57.213744 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:22:57.713715756 +0000 UTC m=+140.537589635 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:57 crc kubenswrapper[4781]: I1202 09:22:57.213892 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:57 crc kubenswrapper[4781]: E1202 09:22:57.214243 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:22:57.71423189 +0000 UTC m=+140.538105769 (durationBeforeRetry 500ms). 
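
Each nestedpendingoperations.go:348 line prints the retry gate for a failed volume operation: no new attempt is permitted until the previous error time plus durationBeforeRetry, a fixed 500ms on every cycle shown here. Since the reconciler wakes roughly every 100ms, several wake-ups are rejected before each retry lands. A small Go model of just that gate (not kubelet's full backoff policy):

    package main

    import (
    	"fmt"
    	"time"
    )

    // backoffGate models only the check reported at nestedpendingoperations.go:348:
    // a volume operation that just failed may not run again until the previous
    // error time plus durationBeforeRetry has passed.
    type backoffGate struct {
    	lastErrorTime       time.Time
    	durationBeforeRetry time.Duration
    }

    func (g backoffGate) try(now time.Time) error {
    	deadline := g.lastErrorTime.Add(g.durationBeforeRetry)
    	if now.Before(deadline) {
    		return fmt.Errorf("no retries permitted until %s (durationBeforeRetry %s)",
    			deadline.Format(time.RFC3339Nano), g.durationBeforeRetry)
    	}
    	return nil
    }

    func main() {
    	failedAt := time.Date(2025, 12, 2, 9, 22, 56, 289402000, time.UTC)
    	g := backoffGate{lastErrorTime: failedAt, durationBeforeRetry: 500 * time.Millisecond}
    	// The reconciler wakes roughly every 100ms, so several wake-ups are
    	// rejected before the 500ms window elapses and a retry is allowed.
    	for i := 0; i <= 5; i++ {
    		now := failedAt.Add(time.Duration(i*100) * time.Millisecond)
    		if err := g.try(now); err != nil {
    			fmt.Println(now.Format("15:04:05.000"), err)
    		} else {
    			fmt.Println(now.Format("15:04:05.000"), "retry allowed")
    		}
    	}
    }
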
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:57 crc kubenswrapper[4781]: I1202 09:22:57.222728 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nm2f4" event={"ID":"50d4d4c8-66e1-4d10-85cb-0fee6079d5fe","Type":"ContainerStarted","Data":"5b4b3f46414a7342b875294d27056a6f113651823e5e7348b0a05974b342d3f9"} Dec 02 09:22:57 crc kubenswrapper[4781]: I1202 09:22:57.224286 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-vvqhq" event={"ID":"c3b35e49-827a-4445-b46f-ca74f65450c5","Type":"ContainerStarted","Data":"8b7430f325e92d6a156d20658a0f18d0b224a174d9946c813fa5a344a61c7218"} Dec 02 09:22:57 crc kubenswrapper[4781]: I1202 09:22:57.226305 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t6t99" event={"ID":"71324519-4199-4389-87b9-705916c2c5c4","Type":"ContainerStarted","Data":"ecbca1e6237a7dcb3bb34375f25e6108a5d2fd328b14eb1e8cfcb6c0892d7ec3"} Dec 02 09:22:57 crc kubenswrapper[4781]: I1202 09:22:57.228379 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lbdsz" event={"ID":"dc3224e4-715f-4f5d-99a0-5325d9b87fb8","Type":"ContainerStarted","Data":"b96ee3ae40c320b76ccf0bc0e5239f26e79abdec8b1f704284226a1423923da7"} Dec 02 09:22:57 crc kubenswrapper[4781]: I1202 09:22:57.314736 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:22:57 crc kubenswrapper[4781]: E1202 09:22:57.314886 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:22:57.814863429 +0000 UTC m=+140.638737318 (durationBeforeRetry 500ms). 
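
The "SyncLoop (PLEG): event for pod" entries interleaved here come from the Pod Lifecycle Event Generator: it periodically relists container runtime state, diffs it against the previous snapshot, and emits typed events (here ContainerStarted carrying the new container ID) that wake the sync loop for the affected pod; the "SyncLoop UPDATE" source="api" lines are the equivalent nudges from the API server watch. A simplified sketch of that relist-and-diff plumbing (the types and relist function are illustrative, not the kubelet's real PLEG):

    package main

    import "fmt"

    // PodLifecycleEvent mirrors the shape printed by the sync loop:
    // a pod UID, an event type, and the container ID as data.
    type PodLifecycleEvent struct {
    	ID   string // pod UID
    	Type string // e.g. "ContainerStarted"
    	Data string // container ID
    }

    // relist diffs the previous and current container sets and emits one
    // ContainerStarted event per newly observed container.
    func relist(prev, cur map[string]string, ch chan<- PodLifecycleEvent) {
    	for id, podUID := range cur {
    		if _, seen := prev[id]; !seen {
    			ch <- PodLifecycleEvent{ID: podUID, Type: "ContainerStarted", Data: id}
    		}
    	}
    	close(ch)
    }

    func main() {
    	prev := map[string]string{}
    	cur := map[string]string{ // container ID -> pod UID
    		"5b4b3f46414a7342": "50d4d4c8-66e1-4d10-85cb-0fee6079d5fe",
    	}
    	ch := make(chan PodLifecycleEvent, 4)
    	go relist(prev, cur, ch)
    	for ev := range ch {
    		// The kubelet sync loop consumes these and triggers a pod sync.
    		fmt.Printf("SyncLoop (PLEG): event for pod %s: %+v\n", ev.ID, ev)
    	}
    }
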
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:57 crc kubenswrapper[4781]: I1202 09:22:57.314944 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:57 crc kubenswrapper[4781]: E1202 09:22:57.315151 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:22:57.815142127 +0000 UTC m=+140.639016006 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:57 crc kubenswrapper[4781]: I1202 09:22:57.415742 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:22:57 crc kubenswrapper[4781]: E1202 09:22:57.415912 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:22:57.915888149 +0000 UTC m=+140.739762028 (durationBeforeRetry 500ms). 
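
From here to roughly 09:23:00 the log is one tight loop: the volume manager wants the PVC torn down for the terminating pod 8f668bae-... (reconciler_common.go:159) and simultaneously mounted for the replacement image-registry pod da84b89e-... (reconciler_common.go:218), and both operations fail on the same unregistered driver, so the pair of errors repeats on every pass. A compressed sketch of that desired-versus-actual reconciliation (volumeState and reconcile are illustrative, not the kubelet's real reconciler):

    package main

    import (
    	"errors"
    	"fmt"
    )

    var errDriverNotRegistered = errors.New(
    	"driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers")

    // volumeState is an illustrative compression of the volume manager's
    // desired-state/actual-state pair for one PVC.
    type volumeState struct {
    	mountedForPod string // UID currently holding the mount (terminating pod)
    	desiredPod    string // UID that should hold it (replacement pod)
    	registered    bool   // whether the CSI driver has registered yet
    }

    // reconcile is one simplified pass: tear down mounts that are no longer
    // desired, then set up mounts that are desired but absent. Both legs hit
    // the same driver-lookup gate, which is why the errors come in pairs.
    func reconcile(v *volumeState) {
    	if v.mountedForPod != "" && v.mountedForPod != v.desiredPod {
    		if !v.registered {
    			fmt.Println("UnmountVolume.TearDown failed:", errDriverNotRegistered)
    		} else {
    			v.mountedForPod = "" // teardown succeeds
    		}
    	}
    	if v.desiredPod != "" && v.mountedForPod != v.desiredPod {
    		if !v.registered {
    			fmt.Println("MountVolume.MountDevice failed:", errDriverNotRegistered)
    		} else if v.mountedForPod == "" {
    			v.mountedForPod = v.desiredPod // mount succeeds
    		}
    	}
    }

    func main() {
    	v := &volumeState{
    		mountedForPod: "8f668bae-612b-4b75-9490-919e737c6a3b", // terminating pod
    		desiredPod:    "da84b89e-515f-4595-badf-a13b1ce0342a", // image-registry pod
    	}
    	reconcile(v) // driver unregistered: both operations fail, as in the log
    	v.registered = true
    	reconcile(v) // teardown then mount both go through
    	fmt.Println("now mounted for pod:", v.mountedForPod)
    }

The 500ms backoff on each leg keeps the pair of messages arriving at a steady cadence rather than saturating the reconciler loop.
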
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:57 crc kubenswrapper[4781]: I1202 09:22:57.517010 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:57 crc kubenswrapper[4781]: E1202 09:22:57.517399 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:22:58.017386593 +0000 UTC m=+140.841260472 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:57 crc kubenswrapper[4781]: I1202 09:22:57.617776 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:22:57 crc kubenswrapper[4781]: E1202 09:22:57.618108 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:22:58.118094234 +0000 UTC m=+140.941968113 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:57 crc kubenswrapper[4781]: I1202 09:22:57.718869 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:57 crc kubenswrapper[4781]: E1202 09:22:57.719415 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:22:58.219396121 +0000 UTC m=+141.043270070 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:57 crc kubenswrapper[4781]: I1202 09:22:57.820001 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:22:57 crc kubenswrapper[4781]: E1202 09:22:57.820405 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:22:58.32039202 +0000 UTC m=+141.144265899 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:57 crc kubenswrapper[4781]: E1202 09:22:57.922687 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:22:58.422667075 +0000 UTC m=+141.246540994 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:57 crc kubenswrapper[4781]: I1202 09:22:57.922254 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:58 crc kubenswrapper[4781]: I1202 09:22:58.024509 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:22:58 crc kubenswrapper[4781]: E1202 09:22:58.024709 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:22:58.524672512 +0000 UTC m=+141.348546431 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:58 crc kubenswrapper[4781]: I1202 09:22:58.024766 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:58 crc kubenswrapper[4781]: E1202 09:22:58.025382 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:22:58.525359232 +0000 UTC m=+141.349233151 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:58 crc kubenswrapper[4781]: I1202 09:22:58.125444 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:22:58 crc kubenswrapper[4781]: E1202 09:22:58.125631 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:22:58.625602349 +0000 UTC m=+141.449476238 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:58 crc kubenswrapper[4781]: I1202 09:22:58.125834 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:58 crc kubenswrapper[4781]: E1202 09:22:58.126150 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:22:58.626138905 +0000 UTC m=+141.450012784 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:58 crc kubenswrapper[4781]: I1202 09:22:58.227128 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:22:58 crc kubenswrapper[4781]: E1202 09:22:58.227395 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:22:58.72736071 +0000 UTC m=+141.551234629 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:58 crc kubenswrapper[4781]: I1202 09:22:58.227501 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:58 crc kubenswrapper[4781]: E1202 09:22:58.227971 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:22:58.727952236 +0000 UTC m=+141.551826145 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:58 crc kubenswrapper[4781]: I1202 09:22:58.328275 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:22:58 crc kubenswrapper[4781]: E1202 09:22:58.328507 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:22:58.828479392 +0000 UTC m=+141.652353311 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:58 crc kubenswrapper[4781]: I1202 09:22:58.430418 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:58 crc kubenswrapper[4781]: E1202 09:22:58.430914 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:22:58.930895711 +0000 UTC m=+141.754769620 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:58 crc kubenswrapper[4781]: I1202 09:22:58.531141 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:22:58 crc kubenswrapper[4781]: E1202 09:22:58.531509 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:22:59.031492769 +0000 UTC m=+141.855366648 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:58 crc kubenswrapper[4781]: I1202 09:22:58.632818 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:58 crc kubenswrapper[4781]: E1202 09:22:58.633361 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:22:59.133330702 +0000 UTC m=+141.957204621 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:58 crc kubenswrapper[4781]: I1202 09:22:58.733420 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:22:58 crc kubenswrapper[4781]: E1202 09:22:58.734009 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:22:59.233981062 +0000 UTC m=+142.057854981 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:58 crc kubenswrapper[4781]: I1202 09:22:58.835064 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:58 crc kubenswrapper[4781]: E1202 09:22:58.835949 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:22:59.335904587 +0000 UTC m=+142.159778466 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:58 crc kubenswrapper[4781]: I1202 09:22:58.936791 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:22:58 crc kubenswrapper[4781]: E1202 09:22:58.937042 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:22:59.437010929 +0000 UTC m=+142.260884808 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:58 crc kubenswrapper[4781]: I1202 09:22:58.937108 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:58 crc kubenswrapper[4781]: E1202 09:22:58.937661 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:22:59.437653016 +0000 UTC m=+142.261526895 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:59 crc kubenswrapper[4781]: I1202 09:22:59.038734 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:22:59 crc kubenswrapper[4781]: E1202 09:22:59.039087 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:22:59.539049827 +0000 UTC m=+142.362923746 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:59 crc kubenswrapper[4781]: I1202 09:22:59.039415 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:59 crc kubenswrapper[4781]: E1202 09:22:59.039902 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:22:59.53988488 +0000 UTC m=+142.363758799 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:59 crc kubenswrapper[4781]: I1202 09:22:59.140293 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:22:59 crc kubenswrapper[4781]: E1202 09:22:59.140637 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:22:59.640619212 +0000 UTC m=+142.464493111 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:59 crc kubenswrapper[4781]: I1202 09:22:59.241684 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:59 crc kubenswrapper[4781]: E1202 09:22:59.242181 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:22:59.742161797 +0000 UTC m=+142.566035686 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:59 crc kubenswrapper[4781]: I1202 09:22:59.342974 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:22:59 crc kubenswrapper[4781]: E1202 09:22:59.343608 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:22:59.843583497 +0000 UTC m=+142.667457406 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:59 crc kubenswrapper[4781]: I1202 09:22:59.444652 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:59 crc kubenswrapper[4781]: E1202 09:22:59.445019 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:22:59.945007349 +0000 UTC m=+142.768881228 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:59 crc kubenswrapper[4781]: I1202 09:22:59.546405 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:22:59 crc kubenswrapper[4781]: E1202 09:22:59.546594 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:00.046559663 +0000 UTC m=+142.870433582 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:59 crc kubenswrapper[4781]: I1202 09:22:59.546833 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:59 crc kubenswrapper[4781]: E1202 09:22:59.547351 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:00.047335265 +0000 UTC m=+142.871209184 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:59 crc kubenswrapper[4781]: I1202 09:22:59.579496 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-68pb7" event={"ID":"51a8404c-874a-43e4-917a-604110280bd7","Type":"ContainerStarted","Data":"94424085ecc775d8596c4a5557126eafa0a0a5bae9d39f53a1624590b3ab55f3"} Dec 02 09:22:59 crc kubenswrapper[4781]: W1202 09:22:59.580530 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ca4125e_8294_4a1c_9685_23022db2999b.slice/crio-da628cce1aef918713ddb93b64998d8cb049fc609029cc0573ad662fc8b752e1 WatchSource:0}: Error finding container da628cce1aef918713ddb93b64998d8cb049fc609029cc0573ad662fc8b752e1: Status 404 returned error can't find the container with id da628cce1aef918713ddb93b64998d8cb049fc609029cc0573ad662fc8b752e1 Dec 02 09:22:59 crc kubenswrapper[4781]: I1202 09:22:59.580609 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411115-gzjfg" event={"ID":"70c23dd9-b165-4f35-9230-da18a16f48be","Type":"ContainerStarted","Data":"014e3df0363556fbd084f4379078397d1a82139dbb2d47f739c459be813cf33a"} Dec 02 09:22:59 crc kubenswrapper[4781]: I1202 09:22:59.597894 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-gpq68" event={"ID":"b43f11b4-7580-463e-9543-2d38de346fe8","Type":"ContainerStarted","Data":"ce8d05447a08db8ceeec4218528737a40871467479ddffde29d610431c11a38c"} Dec 02 09:22:59 crc kubenswrapper[4781]: I1202 09:22:59.599518 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-tr4x7" event={"ID":"2a4f5362-7618-4d15-8be9-465e83bdb0b9","Type":"ContainerStarted","Data":"8655694f35f7aaffd80b2f9a2d213de228cbc9a61f2a7fcffd4175f5a2154d4c"} Dec 02 09:22:59 crc kubenswrapper[4781]: I1202 09:22:59.603669 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jnbdf" event={"ID":"9c00ba40-b027-444f-aa39-f8f0cbbd13cd","Type":"ContainerStarted","Data":"e3cf642bdd565854e79d15f6486ae38b1136c06ebf43517d34a62d045952a696"} Dec 02 09:22:59 crc kubenswrapper[4781]: I1202 09:22:59.614783 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-krhf5" event={"ID":"f87709bf-590d-4318-abdb-7ecb8a86e303","Type":"ContainerStarted","Data":"89cd9afe5ba4030ad382157d1ed2db54bcc27a0cc186aadbef63b7f471db47b1"} Dec 02 09:22:59 crc kubenswrapper[4781]: I1202 09:22:59.621850 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-sjskm" event={"ID":"1357c532-9360-4644-969d-9cc0407a1569","Type":"ContainerStarted","Data":"dfafcea2c9031f50b86b48e53aaf7b435159b5ce0561fa771cea0f8a8313f74d"} Dec 02 09:22:59 crc kubenswrapper[4781]: I1202 09:22:59.639803 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-6c7fp" 
event={"ID":"d20e2eeb-837d-4299-b1ce-2044781226f0","Type":"ContainerStarted","Data":"6a54200b36b2a555392cc0c44b9cb4b87e87c7d7bcf589f38cf0380104be4e9d"} Dec 02 09:22:59 crc kubenswrapper[4781]: I1202 09:22:59.641813 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cfwrp" event={"ID":"11d24c84-34b9-46a0-9d24-65ca291b4ac6","Type":"ContainerStarted","Data":"3a54b6cbb36e3b114fb007d68eddf0ad7ce9790553af73598a2376894c65a3b6"} Dec 02 09:22:59 crc kubenswrapper[4781]: I1202 09:22:59.648543 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:22:59 crc kubenswrapper[4781]: E1202 09:22:59.648743 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:00.148706364 +0000 UTC m=+142.972580273 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:59 crc kubenswrapper[4781]: I1202 09:22:59.648881 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:59 crc kubenswrapper[4781]: E1202 09:22:59.649687 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:00.149667721 +0000 UTC m=+142.973541640 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:59 crc kubenswrapper[4781]: I1202 09:22:59.651134 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lhfvc" event={"ID":"822c4f3b-992d-4ac5-aeb5-cc70cce2b305","Type":"ContainerStarted","Data":"99a0f71b9f995a35ed3e36b22cbb46676a38b11cde2d44d58b489baeb6637557"} Dec 02 09:22:59 crc kubenswrapper[4781]: I1202 09:22:59.658875 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w2stl" event={"ID":"2be1529f-3b01-4174-b029-7312871f5b97","Type":"ContainerStarted","Data":"0236e756ac76500dfc2bad37fbc0485919411dbf55d389469b8ecbd72ea56aa1"} Dec 02 09:22:59 crc kubenswrapper[4781]: I1202 09:22:59.668746 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-r42hh" event={"ID":"ee42da42-30b7-40ef-b00c-4d61e25502e0","Type":"ContainerStarted","Data":"df0074471220cc5231e236b55fae154010eb4dfa61bf7c9cdb7b791bfcfe749f"} Dec 02 09:22:59 crc kubenswrapper[4781]: I1202 09:22:59.673141 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8k8z6" event={"ID":"5ba8d87f-de60-4d9c-8f9a-c6a38cd73858","Type":"ContainerStarted","Data":"ff7774304031cc10ad95428ae476c2843173b379a860110fa56c561df04131ca"} Dec 02 09:22:59 crc kubenswrapper[4781]: I1202 09:22:59.675688 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lt8cm" event={"ID":"79f1ffd5-bf7e-4303-81c6-72cd48e9ae46","Type":"ContainerStarted","Data":"fb2e460ed6b474ef2545c715f34abe94dc9be3fd33bdc4d5675313bca9a1e8ee"} Dec 02 09:22:59 crc kubenswrapper[4781]: I1202 09:22:59.677077 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4d2fb" event={"ID":"8a173fa2-3cb7-4f70-a04f-50a3131cb1ca","Type":"ContainerStarted","Data":"7b75e26ea5c56dc009502db35f94bb106977d901a18e307e613f285439f92d04"} Dec 02 09:22:59 crc kubenswrapper[4781]: I1202 09:22:59.686495 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zlpfg" event={"ID":"ff4d4965-cf17-4d57-8820-fec4107ba62d","Type":"ContainerStarted","Data":"f652a6e6aa835f7ff64f31394ccb9cb675a39ba5dcc101259088bed668df9fe1"} Dec 02 09:22:59 crc kubenswrapper[4781]: I1202 09:22:59.688999 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l6hrl" event={"ID":"caa8272b-7968-4b09-b3a7-4aa6d4f12f2b","Type":"ContainerStarted","Data":"2d156742b7b32bfc739feed6fd2dbe476ea5459905f088e814569a1876c133da"} Dec 02 09:22:59 crc kubenswrapper[4781]: I1202 09:22:59.694355 4781 generic.go:334] "Generic (PLEG): container finished" podID="b9509647-5e74-4452-a998-f0f699160a70" containerID="16043723d883dd57ac925aa3107685d0eca48e25e0a9c42181138c498f574a92" exitCode=0 Dec 02 09:22:59 crc kubenswrapper[4781]: I1202 09:22:59.694515 4781 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4sb6f" event={"ID":"b9509647-5e74-4452-a998-f0f699160a70","Type":"ContainerDied","Data":"16043723d883dd57ac925aa3107685d0eca48e25e0a9c42181138c498f574a92"} Dec 02 09:22:59 crc kubenswrapper[4781]: I1202 09:22:59.697794 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-8ng22" event={"ID":"22ac0012-d7d1-4b53-a0e7-1ca1d09832b9","Type":"ContainerStarted","Data":"5e52f931dfce9c0378ced45af9fe135cc2d598b86775b4449e548a4a9c45974d"} Dec 02 09:22:59 crc kubenswrapper[4781]: I1202 09:22:59.718773 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7n69p" event={"ID":"a635da40-fbc4-4225-a755-dcab98f66a76","Type":"ContainerStarted","Data":"2fa6c131d633c537ab93b32332cc070db4bc64b285bf4991c66309b33e349699"} Dec 02 09:22:59 crc kubenswrapper[4781]: I1202 09:22:59.722620 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wjx4q" event={"ID":"6e5395e3-0578-419d-8297-bdcf2cc6ae01","Type":"ContainerStarted","Data":"b53cdd17d8efa6afd6303ce2580b3f13759441d55e154b3cde4181157fbeaab1"} Dec 02 09:22:59 crc kubenswrapper[4781]: I1202 09:22:59.746366 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-zmcr4" event={"ID":"d478d486-3666-4aae-967f-36d60f4de521","Type":"ContainerStarted","Data":"fa54592a5884d115701e5d5d04a84322283cc114010dbe8fe57ea04347e3cb55"} Dec 02 09:22:59 crc kubenswrapper[4781]: I1202 09:22:59.747573 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wvgpb" event={"ID":"2f1a331e-d7ef-4ab6-82e3-a5579353a37d","Type":"ContainerStarted","Data":"61cc0502bef655499ce4fb9e8222cc689fe663dd2bdf913e50f3bbeb9792c2ea"} Dec 02 09:22:59 crc kubenswrapper[4781]: I1202 09:22:59.750654 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:22:59 crc kubenswrapper[4781]: E1202 09:22:59.750786 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:00.250767473 +0000 UTC m=+143.074641352 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:59 crc kubenswrapper[4781]: I1202 09:22:59.751456 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:59 crc kubenswrapper[4781]: E1202 09:22:59.751890 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:00.251872874 +0000 UTC m=+143.075746753 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:59 crc kubenswrapper[4781]: I1202 09:22:59.762081 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-xwf7r" podStartSLOduration=123.76206146 podStartE2EDuration="2m3.76206146s" podCreationTimestamp="2025-12-02 09:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:22:59.761228386 +0000 UTC m=+142.585102255" watchObservedRunningTime="2025-12-02 09:22:59.76206146 +0000 UTC m=+142.585935339" Dec 02 09:22:59 crc kubenswrapper[4781]: I1202 09:22:59.852181 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:22:59 crc kubenswrapper[4781]: E1202 09:22:59.853289 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:00.353274004 +0000 UTC m=+143.177147883 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:22:59 crc kubenswrapper[4781]: I1202 09:22:59.953879 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:22:59 crc kubenswrapper[4781]: E1202 09:22:59.954763 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:00.454737587 +0000 UTC m=+143.278611466 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:00 crc kubenswrapper[4781]: I1202 09:23:00.055615 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:23:00 crc kubenswrapper[4781]: E1202 09:23:00.056078 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:00.556060015 +0000 UTC m=+143.379933894 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:00 crc kubenswrapper[4781]: I1202 09:23:00.156958 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:23:00 crc kubenswrapper[4781]: E1202 09:23:00.157360 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:00.657347152 +0000 UTC m=+143.481221031 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:00 crc kubenswrapper[4781]: I1202 09:23:00.259168 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:23:00 crc kubenswrapper[4781]: E1202 09:23:00.259343 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:00.759323579 +0000 UTC m=+143.583197458 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:00 crc kubenswrapper[4781]: I1202 09:23:00.259437 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:23:00 crc kubenswrapper[4781]: E1202 09:23:00.259804 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:00.759791152 +0000 UTC m=+143.583665031 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:00 crc kubenswrapper[4781]: I1202 09:23:00.360879 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:23:00 crc kubenswrapper[4781]: E1202 09:23:00.361054 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:00.861024208 +0000 UTC m=+143.684898087 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:00 crc kubenswrapper[4781]: I1202 09:23:00.361181 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:23:00 crc kubenswrapper[4781]: E1202 09:23:00.361527 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:00.861513111 +0000 UTC m=+143.685386990 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:00 crc kubenswrapper[4781]: I1202 09:23:00.412023 4781 patch_prober.go:28] interesting pod/machine-config-daemon-pzntm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 09:23:00 crc kubenswrapper[4781]: I1202 09:23:00.412130 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 09:23:00 crc kubenswrapper[4781]: I1202 09:23:00.462470 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:23:00 crc kubenswrapper[4781]: E1202 09:23:00.462803 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:00.962787668 +0000 UTC m=+143.786661547 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:00 crc kubenswrapper[4781]: I1202 09:23:00.563560 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:23:00 crc kubenswrapper[4781]: E1202 09:23:00.563961 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:01.063944032 +0000 UTC m=+143.887817911 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:00 crc kubenswrapper[4781]: I1202 09:23:00.664291 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:23:00 crc kubenswrapper[4781]: E1202 09:23:00.664442 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:01.164425016 +0000 UTC m=+143.988298895 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:00 crc kubenswrapper[4781]: I1202 09:23:00.664544 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:23:00 crc kubenswrapper[4781]: E1202 09:23:00.665146 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:01.165129316 +0000 UTC m=+143.989003185 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:00 crc kubenswrapper[4781]: I1202 09:23:00.755416 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-n6m8q" event={"ID":"5ca4125e-8294-4a1c-9685-23022db2999b","Type":"ContainerStarted","Data":"da628cce1aef918713ddb93b64998d8cb049fc609029cc0573ad662fc8b752e1"} Dec 02 09:23:00 crc kubenswrapper[4781]: I1202 09:23:00.766131 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:23:00 crc kubenswrapper[4781]: E1202 09:23:00.766454 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:01.266386992 +0000 UTC m=+144.090260871 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:00 crc kubenswrapper[4781]: I1202 09:23:00.766721 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:23:00 crc kubenswrapper[4781]: E1202 09:23:00.767236 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:01.267224566 +0000 UTC m=+144.091098655 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:00 crc kubenswrapper[4781]: I1202 09:23:00.788178 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-8ng22" podStartSLOduration=124.788146752 podStartE2EDuration="2m4.788146752s" podCreationTimestamp="2025-12-02 09:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:23:00.783315596 +0000 UTC m=+143.607189485" watchObservedRunningTime="2025-12-02 09:23:00.788146752 +0000 UTC m=+143.612020631" Dec 02 09:23:00 crc kubenswrapper[4781]: I1202 09:23:00.817617 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-krhf5" podStartSLOduration=124.817585527 podStartE2EDuration="2m4.817585527s" podCreationTimestamp="2025-12-02 09:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:23:00.810304263 +0000 UTC m=+143.634178182" watchObservedRunningTime="2025-12-02 09:23:00.817585527 +0000 UTC m=+143.641459486" Dec 02 09:23:00 crc kubenswrapper[4781]: I1202 09:23:00.867640 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:23:00 crc kubenswrapper[4781]: E1202 09:23:00.868234 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2025-12-02 09:23:01.368204104 +0000 UTC m=+144.192077983 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:00 crc kubenswrapper[4781]: I1202 09:23:00.872725 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:23:00 crc kubenswrapper[4781]: E1202 09:23:00.873191 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:01.373172404 +0000 UTC m=+144.197046313 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:00 crc kubenswrapper[4781]: I1202 09:23:00.926081 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-8ng22" Dec 02 09:23:00 crc kubenswrapper[4781]: I1202 09:23:00.929052 4781 patch_prober.go:28] interesting pod/router-default-5444994796-8ng22 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Dec 02 09:23:00 crc kubenswrapper[4781]: I1202 09:23:00.929131 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8ng22" podUID="22ac0012-d7d1-4b53-a0e7-1ca1d09832b9" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Dec 02 09:23:00 crc kubenswrapper[4781]: I1202 09:23:00.973431 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:23:00 crc kubenswrapper[4781]: E1202 09:23:00.973574 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:01.473545445 +0000 UTC m=+144.297419344 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:00 crc kubenswrapper[4781]: I1202 09:23:00.974016 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:23:00 crc kubenswrapper[4781]: E1202 09:23:00.974585 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:01.474574024 +0000 UTC m=+144.298447923 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.075404 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:23:01 crc kubenswrapper[4781]: E1202 09:23:01.075846 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:01.575792129 +0000 UTC m=+144.399666018 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.177364 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:23:01 crc kubenswrapper[4781]: E1202 09:23:01.178011 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:01.677977911 +0000 UTC m=+144.501851980 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.278404 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:23:01 crc kubenswrapper[4781]: E1202 09:23:01.278583 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:01.778550979 +0000 UTC m=+144.602424858 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.278680 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:23:01 crc kubenswrapper[4781]: E1202 09:23:01.278991 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:01.778979711 +0000 UTC m=+144.602853590 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.379561 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:23:01 crc kubenswrapper[4781]: E1202 09:23:01.379836 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:01.879763114 +0000 UTC m=+144.703637003 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.380291 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:23:01 crc kubenswrapper[4781]: E1202 09:23:01.380740 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:01.880729701 +0000 UTC m=+144.704603650 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.480873 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:23:01 crc kubenswrapper[4781]: E1202 09:23:01.481718 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:01.981699789 +0000 UTC m=+144.805573668 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.582512 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:23:01 crc kubenswrapper[4781]: E1202 09:23:01.582810 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:02.082799131 +0000 UTC m=+144.906673010 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.683424 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:23:01 crc kubenswrapper[4781]: E1202 09:23:01.683568 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:02.183542243 +0000 UTC m=+145.007416122 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.683661 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:23:01 crc kubenswrapper[4781]: E1202 09:23:01.683947 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:02.183938875 +0000 UTC m=+145.007812744 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.761958 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-sjskm" event={"ID":"1357c532-9360-4644-969d-9cc0407a1569","Type":"ContainerStarted","Data":"eba616c5476c1a09a31e17c3e51970bdfaddc7a22853d3cb0bfa1cdf991ce246"} Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.763395 4781 generic.go:334] "Generic (PLEG): container finished" podID="04a6248c-9bb7-4204-a19a-1041d4d06f3e" containerID="5bf9a893223f67d64bf47b7d269e9bfc4cff08274f9960786165bceabb963cd0" exitCode=0 Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.763453 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-9k9bd" event={"ID":"04a6248c-9bb7-4204-a19a-1041d4d06f3e","Type":"ContainerDied","Data":"5bf9a893223f67d64bf47b7d269e9bfc4cff08274f9960786165bceabb963cd0"} Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.765743 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-gltzh" event={"ID":"36103c10-f353-4b13-8c7a-eeee5e3b4f44","Type":"ContainerStarted","Data":"c746d00b8962bd399dc43f73f07039d035ce1c3ca2325317d44d4d8373306885"} Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.765970 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-gltzh" Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.767348 4781 patch_prober.go:28] interesting pod/console-operator-58897d9998-gltzh container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 
09:23:01.767401 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-gltzh" podUID="36103c10-f353-4b13-8c7a-eeee5e3b4f44" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.768173 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nm2f4" event={"ID":"50d4d4c8-66e1-4d10-85cb-0fee6079d5fe","Type":"ContainerStarted","Data":"f65d6ba37ffd0316761d6625dbb2be9a5dc4b94c2fc248ebe314887dba163418"} Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.769724 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lt8cm" event={"ID":"79f1ffd5-bf7e-4303-81c6-72cd48e9ae46","Type":"ContainerStarted","Data":"45a9feb35fccfbf4224b90b24525ff3fae67f0da7ab7c11f2acffeba7a597304"} Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.771113 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-tr4x7" event={"ID":"2a4f5362-7618-4d15-8be9-465e83bdb0b9","Type":"ContainerStarted","Data":"34806d104c62bc4e7e86e63c8d01a64f6fb0423a1535cfe23662bf6c1772f5fd"} Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.772239 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-vvqhq" event={"ID":"c3b35e49-827a-4445-b46f-ca74f65450c5","Type":"ContainerStarted","Data":"0f9bea75e673ca6310bea37e30308a32d1c3f09a1a9c8eb51c27a5fff1a1e598"} Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.773308 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lhfvc" event={"ID":"822c4f3b-992d-4ac5-aeb5-cc70cce2b305","Type":"ContainerStarted","Data":"a9e4e1f131c3365affa8f58181ae15106bffa35920b7ebd5b3efb4f2236f2491"} Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.775393 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4sb6f" event={"ID":"b9509647-5e74-4452-a998-f0f699160a70","Type":"ContainerStarted","Data":"dfb3debbe1167b1b35ae42acf79189f6106b8f78d8692e1a869c9d49887792ea"} Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.777021 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dpqgc" event={"ID":"83b6450c-e497-4d31-9c9a-e36fbe675a2e","Type":"ContainerStarted","Data":"1998a8fab42bcdfdab818defb49b48aec0d5bbe1841bff62376069bf0b0e3ae3"} Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.778211 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4d2fb" event={"ID":"8a173fa2-3cb7-4f70-a04f-50a3131cb1ca","Type":"ContainerStarted","Data":"70053aac09576f2e3e03bdd98b6ef0f222b9daa951421150978d5f1b1ef869cb"} Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.779500 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lbdsz" event={"ID":"dc3224e4-715f-4f5d-99a0-5325d9b87fb8","Type":"ContainerStarted","Data":"b1b5647102999724e94b1c1e3206aabe054d9729ee9634fba36663166eb4cfa6"} Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.780560 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29411115-gzjfg" event={"ID":"70c23dd9-b165-4f35-9230-da18a16f48be","Type":"ContainerStarted","Data":"ce3239c217f4a3357f252e1204730b8c33889a9b2c8970a6d82b0804b93be9de"} Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.781765 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wvc5b" event={"ID":"69ddef2d-4afd-4668-b95f-29137b133855","Type":"ContainerStarted","Data":"daee0b3fcf3000bff56471975b5fbe5d51f7033d818548362cf85b6a3b102878"} Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.782852 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wvgpb" event={"ID":"2f1a331e-d7ef-4ab6-82e3-a5579353a37d","Type":"ContainerStarted","Data":"5191868e3e010fcd6a77cd3751169efc10113ee384462b703e5fff990c0be733"} Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.784154 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-68pb7" event={"ID":"51a8404c-874a-43e4-917a-604110280bd7","Type":"ContainerStarted","Data":"1976ef7ad9b8c56bf176286962bf159d3b372dace4003fa538010a387046dab7"} Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.784175 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:23:01 crc kubenswrapper[4781]: E1202 09:23:01.784229 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:02.284215453 +0000 UTC m=+145.108089332 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.784660 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:23:01 crc kubenswrapper[4781]: E1202 09:23:01.785033 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:02.285024146 +0000 UTC m=+145.108898025 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.785390 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jnbdf" event={"ID":"9c00ba40-b027-444f-aa39-f8f0cbbd13cd","Type":"ContainerStarted","Data":"bfdbd59e67c82197f41abb2c799423c29b6006a7fd722ffda49e7046ff3ac7fa"} Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.786623 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-r42hh" event={"ID":"ee42da42-30b7-40ef-b00c-4d61e25502e0","Type":"ContainerStarted","Data":"115c7e24ba72df39fe4525a8f454c781f4833d26de83cc09e6063b8721620efe"} Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.788156 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-n9cmc" event={"ID":"d9c52f13-f9c6-419e-8f69-ee91e29f4629","Type":"ContainerStarted","Data":"9d58ef2fb6b4d88eb31b97dab2c2a5ee91face9162406c746a2b5a3e8bee2a11"} Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.789341 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" event={"ID":"b7602882-7ecf-4ce4-9cfa-3a691b3f9270","Type":"ContainerStarted","Data":"0485e7643026ac1eff407e1fb336969ac13eb338614c1badd2cf6f018962ece0"} Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.790714 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-x4lq4" event={"ID":"cef8c4dd-6e0f-44c9-8926-72b0d821f823","Type":"ContainerStarted","Data":"4dfee10dbde5c137a7daf25b50dba3c4683df2fbb5072195fc1215648984567e"} Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.794360 4781 generic.go:334] "Generic (PLEG): container finished" podID="11d24c84-34b9-46a0-9d24-65ca291b4ac6" containerID="01eef15b02c0d9cc13e7badb698beb6947b7f5e55429f2a7a84eeeadb7918ab9" exitCode=0 Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.794462 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cfwrp" event={"ID":"11d24c84-34b9-46a0-9d24-65ca291b4ac6","Type":"ContainerDied","Data":"01eef15b02c0d9cc13e7badb698beb6947b7f5e55429f2a7a84eeeadb7918ab9"} Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.796273 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t6t99" event={"ID":"71324519-4199-4389-87b9-705916c2c5c4","Type":"ContainerStarted","Data":"3f7e6ce42d76a12887c3820ae808e762acca18ac3a13f3811cf72b1c7f8d75d3"} Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.798535 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8nxjr" event={"ID":"c0570ff6-6102-40ae-a68a-b35b77756097","Type":"ContainerStarted","Data":"ae8fdd4dda91d7f4e0c02456eaeaa45e134256c14e623879549c9fd7227f38e2"} Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.798736 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8nxjr" Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.800403 4781 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-8nxjr container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.800443 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8nxjr" podUID="c0570ff6-6102-40ae-a68a-b35b77756097" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.801128 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wjx4q" event={"ID":"6e5395e3-0578-419d-8297-bdcf2cc6ae01","Type":"ContainerStarted","Data":"6e6c5ddafd7b5fd985395f52e6d388ac429d81cc45c502747637831d6168f513"} Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.802496 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6wrt7" event={"ID":"4297bd58-caf6-4962-b775-7f454787fa91","Type":"ContainerStarted","Data":"7f69e33611d273b03b3a67cee2000c7947df603600fd959b892c32e21bdcd19b"} Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.802692 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-6wrt7" Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.803721 4781 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-6wrt7 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.803796 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-6wrt7" podUID="4297bd58-caf6-4962-b775-7f454787fa91" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.804153 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-zmcr4" event={"ID":"d478d486-3666-4aae-967f-36d60f4de521","Type":"ContainerStarted","Data":"2dd62111e1591b9174efac1bbbc3d774ea36197aa7e9d011ce906f4c1189fee4"} Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.807548 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p8ljp" event={"ID":"be701e11-18c0-4541-9685-48bd65c661f2","Type":"ContainerStarted","Data":"a0f68c1c9bd2813afb4461762b9982886dee5f91227fae8a7cff8bd3c3d4acd2"} Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.819512 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w2stl" event={"ID":"2be1529f-3b01-4174-b029-7312871f5b97","Type":"ContainerStarted","Data":"7b71c0945ab9a655dd3c4bd20c949888c381ff7c91c43ee5673af236b0846e66"} Dec 02 
09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.821995 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zlpfg" event={"ID":"ff4d4965-cf17-4d57-8820-fec4107ba62d","Type":"ContainerStarted","Data":"2fc7de2d0f4c56604a59b5d7431952ed27b6020249a03dc6d9eeeecac75e1d06"} Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.824442 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l6hrl" event={"ID":"caa8272b-7968-4b09-b3a7-4aa6d4f12f2b","Type":"ContainerStarted","Data":"28dc8cb6bdeb091626ff82bd3bbb4d5ae89c08a4c348e72e07880ad58ecf18f4"} Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.825967 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-n6m8q" event={"ID":"5ca4125e-8294-4a1c-9685-23022db2999b","Type":"ContainerStarted","Data":"48d5714284ba6c86cae31c5033e44e969226991c9b00cb73002abe13955f9d59"} Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.827018 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8k8z6" event={"ID":"5ba8d87f-de60-4d9c-8f9a-c6a38cd73858","Type":"ContainerStarted","Data":"02bb51f4d53d53676f18a58f4dc91e941c96370f36112b797048370519d008a9"} Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.829175 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-tr4x7" podStartSLOduration=125.829156082 podStartE2EDuration="2m5.829156082s" podCreationTimestamp="2025-12-02 09:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:23:01.819007508 +0000 UTC m=+144.642881387" watchObservedRunningTime="2025-12-02 09:23:01.829156082 +0000 UTC m=+144.653029961" Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.830075 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7n69p" event={"ID":"a635da40-fbc4-4225-a755-dcab98f66a76","Type":"ContainerStarted","Data":"dcee86e473d08260aeb2bfaff4c35204ef40e283e6fa8f299277cf82dce08d2f"} Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.830682 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-gpq68" Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.832436 4781 patch_prober.go:28] interesting pod/downloads-7954f5f757-gpq68 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.832499 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-gpq68" podUID="b43f11b4-7580-463e-9543-2d38de346fe8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.846707 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-68pb7" podStartSLOduration=125.846690214 podStartE2EDuration="2m5.846690214s" podCreationTimestamp="2025-12-02 09:20:56 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:23:01.846299593 +0000 UTC m=+144.670173482" watchObservedRunningTime="2025-12-02 09:23:01.846690214 +0000 UTC m=+144.670564093" Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.886232 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:23:01 crc kubenswrapper[4781]: E1202 09:23:01.886401 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:02.386375625 +0000 UTC m=+145.210249504 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.886655 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:23:01 crc kubenswrapper[4781]: E1202 09:23:01.888210 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:02.388196197 +0000 UTC m=+145.212070146 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.898442 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-gltzh" podStartSLOduration=125.898425543 podStartE2EDuration="2m5.898425543s" podCreationTimestamp="2025-12-02 09:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:23:01.875170061 +0000 UTC m=+144.699043940" watchObservedRunningTime="2025-12-02 09:23:01.898425543 +0000 UTC m=+144.722299412" Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.899876 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8nxjr" podStartSLOduration=125.899871753 podStartE2EDuration="2m5.899871753s" podCreationTimestamp="2025-12-02 09:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:23:01.897490046 +0000 UTC m=+144.721363925" watchObservedRunningTime="2025-12-02 09:23:01.899871753 +0000 UTC m=+144.723745632" Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.927047 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-gpq68" podStartSLOduration=125.927030194 podStartE2EDuration="2m5.927030194s" podCreationTimestamp="2025-12-02 09:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:23:01.925756688 +0000 UTC m=+144.749630567" watchObservedRunningTime="2025-12-02 09:23:01.927030194 +0000 UTC m=+144.750904073" Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.928369 4781 patch_prober.go:28] interesting pod/router-default-5444994796-8ng22 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.928433 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8ng22" podUID="22ac0012-d7d1-4b53-a0e7-1ca1d09832b9" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.966569 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-6wrt7" podStartSLOduration=125.966551541 podStartE2EDuration="2m5.966551541s" podCreationTimestamp="2025-12-02 09:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:23:01.964576326 +0000 UTC m=+144.788450215" watchObservedRunningTime="2025-12-02 09:23:01.966551541 +0000 UTC m=+144.790425420" Dec 02 09:23:01 crc 
Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.988397 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 09:23:01 crc kubenswrapper[4781]: E1202 09:23:01.988606 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:02.488575658 +0000 UTC m=+145.312449537 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:01 crc kubenswrapper[4781]: I1202 09:23:01.989170 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx"
Dec 02 09:23:01 crc kubenswrapper[4781]: E1202 09:23:01.989641 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:02.489622517 +0000 UTC m=+145.313496396 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:02 crc kubenswrapper[4781]: I1202 09:23:02.090060 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 09:23:02 crc kubenswrapper[4781]: E1202 09:23:02.090604 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:02.590584295 +0000 UTC m=+145.414458174 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:02 crc kubenswrapper[4781]: I1202 09:23:02.191504 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx"
Dec 02 09:23:02 crc kubenswrapper[4781]: E1202 09:23:02.191858 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:02.691846551 +0000 UTC m=+145.515720430 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:02 crc kubenswrapper[4781]: I1202 09:23:02.293047 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 09:23:02 crc kubenswrapper[4781]: E1202 09:23:02.293198 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:02.793178091 +0000 UTC m=+145.617051970 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:02 crc kubenswrapper[4781]: I1202 09:23:02.293703 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx"
Dec 02 09:23:02 crc kubenswrapper[4781]: E1202 09:23:02.294023 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:02.794009293 +0000 UTC m=+145.617883172 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:02 crc kubenswrapper[4781]: I1202 09:23:02.394945 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 09:23:02 crc kubenswrapper[4781]: E1202 09:23:02.395186 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:02.895163337 +0000 UTC m=+145.719037206 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:02 crc kubenswrapper[4781]: I1202 09:23:02.497162 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx"
Dec 02 09:23:02 crc kubenswrapper[4781]: E1202 09:23:02.497501 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:02.997489044 +0000 UTC m=+145.821362923 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:02 crc kubenswrapper[4781]: I1202 09:23:02.598761 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 09:23:02 crc kubenswrapper[4781]: E1202 09:23:02.598992 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:03.098963486 +0000 UTC m=+145.922837375 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:02 crc kubenswrapper[4781]: I1202 09:23:02.599285 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx"
Dec 02 09:23:02 crc kubenswrapper[4781]: E1202 09:23:02.599740 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:03.099720107 +0000 UTC m=+145.923593996 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:02 crc kubenswrapper[4781]: I1202 09:23:02.700384 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 09:23:02 crc kubenswrapper[4781]: E1202 09:23:02.700594 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:03.200563952 +0000 UTC m=+146.024437821 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:02 crc kubenswrapper[4781]: I1202 09:23:02.802184 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx"
Dec 02 09:23:02 crc kubenswrapper[4781]: E1202 09:23:02.802547 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:03.302532539 +0000 UTC m=+146.126406418 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:02 crc kubenswrapper[4781]: I1202 09:23:02.835099 4781 patch_prober.go:28] interesting pod/downloads-7954f5f757-gpq68 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body=
Dec 02 09:23:02 crc kubenswrapper[4781]: I1202 09:23:02.835133 4781 patch_prober.go:28] interesting pod/console-operator-58897d9998-gltzh container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body=
Dec 02 09:23:02 crc kubenswrapper[4781]: I1202 09:23:02.835158 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-gpq68" podUID="b43f11b4-7580-463e-9543-2d38de346fe8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused"
Dec 02 09:23:02 crc kubenswrapper[4781]: I1202 09:23:02.835176 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-gltzh" podUID="36103c10-f353-4b13-8c7a-eeee5e3b4f44" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused"
Dec 02 09:23:02 crc kubenswrapper[4781]: I1202 09:23:02.854609 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-vvqhq" podStartSLOduration=11.854587127 podStartE2EDuration="11.854587127s" podCreationTimestamp="2025-12-02 09:22:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:23:02.85291661 +0000 UTC m=+145.676790499" watchObservedRunningTime="2025-12-02 09:23:02.854587127 +0000 UTC m=+145.678461016"
Dec 02 09:23:02 crc kubenswrapper[4781]: I1202 09:23:02.875290 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-6wrt7"
Dec 02 09:23:02 crc kubenswrapper[4781]: I1202 09:23:02.904313 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 09:23:02 crc kubenswrapper[4781]: E1202 09:23:02.904460 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:03.404435562 +0000 UTC m=+146.228309441 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:02 crc kubenswrapper[4781]: I1202 09:23:02.905048 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx"
Dec 02 09:23:02 crc kubenswrapper[4781]: E1202 09:23:02.905404 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:03.405388819 +0000 UTC m=+146.229262708 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:02 crc kubenswrapper[4781]: I1202 09:23:02.917812 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w2stl" podStartSLOduration=126.917789067 podStartE2EDuration="2m6.917789067s" podCreationTimestamp="2025-12-02 09:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:23:02.889069232 +0000 UTC m=+145.712943111" watchObservedRunningTime="2025-12-02 09:23:02.917789067 +0000 UTC m=+145.741662956"
Dec 02 09:23:02 crc kubenswrapper[4781]: I1202 09:23:02.923286 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8nxjr"
Dec 02 09:23:02 crc kubenswrapper[4781]: I1202 09:23:02.932157 4781 patch_prober.go:28] interesting pod/router-default-5444994796-8ng22 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 02 09:23:02 crc kubenswrapper[4781]: [-]has-synced failed: reason withheld
Dec 02 09:23:02 crc kubenswrapper[4781]: [+]process-running ok
Dec 02 09:23:02 crc kubenswrapper[4781]: healthz check failed
Dec 02 09:23:02 crc kubenswrapper[4781]: I1202 09:23:02.932221 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8ng22" podUID="22ac0012-d7d1-4b53-a0e7-1ca1d09832b9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 02 09:23:03 crc kubenswrapper[4781]: I1202 09:23:03.008453 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 09:23:03 crc kubenswrapper[4781]: E1202 09:23:03.009817 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:03.509792714 +0000 UTC m=+146.333666593 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:03 crc kubenswrapper[4781]: I1202 09:23:03.110581 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx"
Dec 02 09:23:03 crc kubenswrapper[4781]: E1202 09:23:03.111072 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:03.611056621 +0000 UTC m=+146.434930500 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:03 crc kubenswrapper[4781]: I1202 09:23:03.211698 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 09:23:03 crc kubenswrapper[4781]: E1202 09:23:03.216850 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:03.716826023 +0000 UTC m=+146.540699902 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:03 crc kubenswrapper[4781]: I1202 09:23:03.224917 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx"
Dec 02 09:23:03 crc kubenswrapper[4781]: E1202 09:23:03.225267 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:03.725252349 +0000 UTC m=+146.549126228 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:03 crc kubenswrapper[4781]: I1202 09:23:03.329375 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 09:23:03 crc kubenswrapper[4781]: E1202 09:23:03.330183 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:03.830156458 +0000 UTC m=+146.654030337 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:03 crc kubenswrapper[4781]: I1202 09:23:03.433603 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx"
Dec 02 09:23:03 crc kubenswrapper[4781]: E1202 09:23:03.434006 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:03.933993157 +0000 UTC m=+146.757867036 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:03 crc kubenswrapper[4781]: I1202 09:23:03.542406 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 09:23:03 crc kubenswrapper[4781]: E1202 09:23:03.542509 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:04.042494386 +0000 UTC m=+146.866368265 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:03 crc kubenswrapper[4781]: I1202 09:23:03.542761 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 02 09:23:03 crc kubenswrapper[4781]: I1202 09:23:03.542795 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 02 09:23:03 crc kubenswrapper[4781]: I1202 09:23:03.543743 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 02 09:23:03 crc kubenswrapper[4781]: I1202 09:23:03.543779 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 02 09:23:03 crc kubenswrapper[4781]: I1202 09:23:03.543816 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx"
Dec 02 09:23:03 crc kubenswrapper[4781]: E1202 09:23:03.544063 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:04.044055329 +0000 UTC m=+146.867929198 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:03 crc kubenswrapper[4781]: I1202 09:23:03.550013 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 02 09:23:03 crc kubenswrapper[4781]: I1202 09:23:03.562041 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 02 09:23:03 crc kubenswrapper[4781]: I1202 09:23:03.562653 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 02 09:23:03 crc kubenswrapper[4781]: I1202 09:23:03.565621 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 02 09:23:03 crc kubenswrapper[4781]: I1202 09:23:03.610479 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 02 09:23:03 crc kubenswrapper[4781]: I1202 09:23:03.646390 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 09:23:03 crc kubenswrapper[4781]: E1202 09:23:03.646564 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:04.1465402 +0000 UTC m=+146.970414079 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:03 crc kubenswrapper[4781]: I1202 09:23:03.646860 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx"
Dec 02 09:23:03 crc kubenswrapper[4781]: E1202 09:23:03.647187 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:04.147179368 +0000 UTC m=+146.971053247 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:03 crc kubenswrapper[4781]: I1202 09:23:03.748517 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 09:23:03 crc kubenswrapper[4781]: E1202 09:23:03.749017 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:04.248998651 +0000 UTC m=+147.072872530 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:03 crc kubenswrapper[4781]: I1202 09:23:03.814626 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 09:23:03 crc kubenswrapper[4781]: I1202 09:23:03.843572 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ntjsd"] Dec 02 09:23:03 crc kubenswrapper[4781]: I1202 09:23:03.846237 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ntjsd" Dec 02 09:23:03 crc kubenswrapper[4781]: I1202 09:23:03.849822 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:23:03 crc kubenswrapper[4781]: E1202 09:23:03.850201 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:04.350187525 +0000 UTC m=+147.174061404 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:03 crc kubenswrapper[4781]: I1202 09:23:03.852877 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 02 09:23:03 crc kubenswrapper[4781]: I1202 09:23:03.877087 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ntjsd"] Dec 02 09:23:03 crc kubenswrapper[4781]: I1202 09:23:03.946624 4781 patch_prober.go:28] interesting pod/router-default-5444994796-8ng22 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 09:23:03 crc kubenswrapper[4781]: [-]has-synced failed: reason withheld Dec 02 09:23:03 crc kubenswrapper[4781]: [+]process-running ok Dec 02 09:23:03 crc kubenswrapper[4781]: healthz check failed Dec 02 09:23:03 crc kubenswrapper[4781]: I1202 09:23:03.946678 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8ng22" podUID="22ac0012-d7d1-4b53-a0e7-1ca1d09832b9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 09:23:03 crc kubenswrapper[4781]: I1202 09:23:03.954641 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lhfvc" event={"ID":"822c4f3b-992d-4ac5-aeb5-cc70cce2b305","Type":"ContainerStarted","Data":"f6625c6cb1beb7c1e5738535da7cd99cd0bd8bc17c2ef7d33659d2baf3c825ff"} Dec 02 09:23:03 crc kubenswrapper[4781]: I1202 09:23:03.955137 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:23:03 crc kubenswrapper[4781]: I1202 09:23:03.955760 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9170fc18-624b-4358-b931-7e889eee7317-utilities\") pod \"community-operators-ntjsd\" (UID: \"9170fc18-624b-4358-b931-7e889eee7317\") " pod="openshift-marketplace/community-operators-ntjsd" Dec 02 09:23:03 crc kubenswrapper[4781]: I1202 09:23:03.955824 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9170fc18-624b-4358-b931-7e889eee7317-catalog-content\") pod \"community-operators-ntjsd\" (UID: \"9170fc18-624b-4358-b931-7e889eee7317\") " pod="openshift-marketplace/community-operators-ntjsd" Dec 02 09:23:03 crc kubenswrapper[4781]: I1202 09:23:03.955880 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz7qg\" (UniqueName: \"kubernetes.io/projected/9170fc18-624b-4358-b931-7e889eee7317-kube-api-access-nz7qg\") pod \"community-operators-ntjsd\" (UID: \"9170fc18-624b-4358-b931-7e889eee7317\") " pod="openshift-marketplace/community-operators-ntjsd" Dec 02 09:23:03 crc kubenswrapper[4781]: E1202 09:23:03.956022 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:04.456004899 +0000 UTC m=+147.279878778 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:03 crc kubenswrapper[4781]: I1202 09:23:03.995899 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-zmcr4" event={"ID":"d478d486-3666-4aae-967f-36d60f4de521","Type":"ContainerStarted","Data":"264e417c54aaebbe319272fd940d8b3e040a9e01c4ec0c25d7b1900af90ab04c"} Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.066618 4781 patch_prober.go:28] interesting pod/downloads-7954f5f757-gpq68 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.066674 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-gpq68" podUID="b43f11b4-7580-463e-9543-2d38de346fe8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.067394 4781 patch_prober.go:28] interesting pod/downloads-7954f5f757-gpq68 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Dec 
02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.067431 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-gpq68" podUID="b43f11b4-7580-463e-9543-2d38de346fe8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.068136 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9170fc18-624b-4358-b931-7e889eee7317-catalog-content\") pod \"community-operators-ntjsd\" (UID: \"9170fc18-624b-4358-b931-7e889eee7317\") " pod="openshift-marketplace/community-operators-ntjsd" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.068197 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.068226 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nz7qg\" (UniqueName: \"kubernetes.io/projected/9170fc18-624b-4358-b931-7e889eee7317-kube-api-access-nz7qg\") pod \"community-operators-ntjsd\" (UID: \"9170fc18-624b-4358-b931-7e889eee7317\") " pod="openshift-marketplace/community-operators-ntjsd" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.068262 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9170fc18-624b-4358-b931-7e889eee7317-utilities\") pod \"community-operators-ntjsd\" (UID: \"9170fc18-624b-4358-b931-7e889eee7317\") " pod="openshift-marketplace/community-operators-ntjsd" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.068946 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9170fc18-624b-4358-b931-7e889eee7317-utilities\") pod \"community-operators-ntjsd\" (UID: \"9170fc18-624b-4358-b931-7e889eee7317\") " pod="openshift-marketplace/community-operators-ntjsd" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.070408 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9170fc18-624b-4358-b931-7e889eee7317-catalog-content\") pod \"community-operators-ntjsd\" (UID: \"9170fc18-624b-4358-b931-7e889eee7317\") " pod="openshift-marketplace/community-operators-ntjsd" Dec 02 09:23:04 crc kubenswrapper[4781]: E1202 09:23:04.070904 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:04.570892657 +0000 UTC m=+147.394766536 (durationBeforeRetry 500ms). 
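Each of these nestedpendingoperations lines records the same gating pattern: when a volume operation fails, the operation executor stamps the failure time and refuses to start another attempt for that volume until lastErrorTime + durationBeforeRetry, which is why every retry here is pushed exactly 500ms out. A minimal sketch of that pattern, with illustrative constants — the excerpt only shows the 500ms initial gate, not kubelet's full backoff tuning, though its backoff can grow on repeated failures:

package main

// Minimal sketch of the retry gate behind the "No retries permitted until ...
// (durationBeforeRetry 500ms)" lines above. Constants are illustrative.

import (
	"fmt"
	"time"
)

type backoff struct {
	lastErrorTime       time.Time
	durationBeforeRetry time.Duration
}

// update records a failure; on later failures the gate can grow, capped at maxGate.
func (b *backoff) update(now time.Time, maxGate time.Duration) {
	if b.durationBeforeRetry == 0 {
		b.durationBeforeRetry = 500 * time.Millisecond // initial gate seen in the log
	} else if b.durationBeforeRetry*2 <= maxGate {
		b.durationBeforeRetry *= 2
	}
	b.lastErrorTime = now
}

// safeToRetry reports whether the embargo has elapsed; until then, callers get
// an error shaped like the nestedpendingoperations messages above.
func (b *backoff) safeToRetry(now time.Time) error {
	deadline := b.lastErrorTime.Add(b.durationBeforeRetry)
	if now.Before(deadline) {
		return fmt.Errorf("no retries permitted until %s (durationBeforeRetry %s)",
			deadline.Format("2006-01-02 15:04:05.999999999 -0700 MST"), b.durationBeforeRetry)
	}
	return nil
}

func main() {
	var b backoff
	now := time.Now()
	b.update(now, 2*time.Minute)
	fmt.Println(b.safeToRetry(now))                  // still embargoed
	fmt.Println(b.safeToRetry(now.Add(time.Second))) // nil: gate elapsed
}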
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.077858 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-9k9bd" event={"ID":"04a6248c-9bb7-4204-a19a-1041d4d06f3e","Type":"ContainerStarted","Data":"52dad4b1b9be564b530aa003d60ffefb0b9b3108e764aaf43b192c31f6d274e6"} Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.127369 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz7qg\" (UniqueName: \"kubernetes.io/projected/9170fc18-624b-4358-b931-7e889eee7317-kube-api-access-nz7qg\") pod \"community-operators-ntjsd\" (UID: \"9170fc18-624b-4358-b931-7e889eee7317\") " pod="openshift-marketplace/community-operators-ntjsd" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.147290 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jnbdf" event={"ID":"9c00ba40-b027-444f-aa39-f8f0cbbd13cd","Type":"ContainerStarted","Data":"18ff9aee502d6fb646f985a121c56b3a1c7eac1a949c07dfa9d24ff1d2d962a3"} Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.171282 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:23:04 crc kubenswrapper[4781]: E1202 09:23:04.172278 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:04.672258677 +0000 UTC m=+147.496132556 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.173584 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4d2fb" event={"ID":"8a173fa2-3cb7-4f70-a04f-50a3131cb1ca","Type":"ContainerStarted","Data":"e72109e60317935c5627a65195606b1c6dc7699e58c5bd1707ab2a435ee177f6"} Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.214207 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ntjsd" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.214941 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wvc5b" event={"ID":"69ddef2d-4afd-4668-b95f-29137b133855","Type":"ContainerStarted","Data":"6edb296f84fcf906d4ce0a1004de6df5f9bb8d584475fec5a90770e99f32ac96"} Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.215232 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jnbdf" podStartSLOduration=128.21522093 podStartE2EDuration="2m8.21522093s" podCreationTimestamp="2025-12-02 09:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:23:04.213266915 +0000 UTC m=+147.037140794" watchObservedRunningTime="2025-12-02 09:23:04.21522093 +0000 UTC m=+147.039094829" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.216229 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-zmcr4" podStartSLOduration=128.216221057 podStartE2EDuration="2m8.216221057s" podCreationTimestamp="2025-12-02 09:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:23:04.077932664 +0000 UTC m=+146.901806543" watchObservedRunningTime="2025-12-02 09:23:04.216221057 +0000 UTC m=+147.040094936" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.218120 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-krhf5" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.218149 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-krhf5" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.228716 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5f85z"] Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.229993 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5f85z" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.236367 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5f85z"] Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.247472 4781 patch_prober.go:28] interesting pod/console-f9d7485db-krhf5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.6:8443/health\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.247516 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-krhf5" podUID="f87709bf-590d-4318-abdb-7ecb8a86e303" containerName="console" probeResult="failure" output="Get \"https://10.217.0.6:8443/health\": dial tcp 10.217.0.6:8443: connect: connection refused" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.270615 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4d2fb" podStartSLOduration=128.270599701 podStartE2EDuration="2m8.270599701s" podCreationTimestamp="2025-12-02 09:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:23:04.269904922 +0000 UTC m=+147.093778801" watchObservedRunningTime="2025-12-02 09:23:04.270599701 +0000 UTC m=+147.094473580" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.272304 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69ab05d8-7b1c-4314-aa8c-57d38ae2e885-catalog-content\") pod \"community-operators-5f85z\" (UID: \"69ab05d8-7b1c-4314-aa8c-57d38ae2e885\") " pod="openshift-marketplace/community-operators-5f85z" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.272350 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.272422 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pqjd\" (UniqueName: \"kubernetes.io/projected/69ab05d8-7b1c-4314-aa8c-57d38ae2e885-kube-api-access-5pqjd\") pod \"community-operators-5f85z\" (UID: \"69ab05d8-7b1c-4314-aa8c-57d38ae2e885\") " pod="openshift-marketplace/community-operators-5f85z" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.311393 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69ab05d8-7b1c-4314-aa8c-57d38ae2e885-utilities\") pod \"community-operators-5f85z\" (UID: \"69ab05d8-7b1c-4314-aa8c-57d38ae2e885\") " pod="openshift-marketplace/community-operators-5f85z" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.327694 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lbdsz" 
event={"ID":"dc3224e4-715f-4f5d-99a0-5325d9b87fb8","Type":"ContainerStarted","Data":"6933d9e5d1a938a355631f283f6ac576ddacd23bb6097e29648a2b643de3fe4e"} Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.328654 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lbdsz" Dec 02 09:23:04 crc kubenswrapper[4781]: E1202 09:23:04.338703 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:04.838681458 +0000 UTC m=+147.662555347 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.348798 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wvc5b" podStartSLOduration=128.348775251 podStartE2EDuration="2m8.348775251s" podCreationTimestamp="2025-12-02 09:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:23:04.339771338 +0000 UTC m=+147.163645237" watchObservedRunningTime="2025-12-02 09:23:04.348775251 +0000 UTC m=+147.172649130" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.352900 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-6c7fp" event={"ID":"d20e2eeb-837d-4299-b1ce-2044781226f0","Type":"ContainerStarted","Data":"4f61c83f1def9914ce9669e0c3b152af6cf1fd4f059819870da0f8993ef296c3"} Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.353900 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4sb6f" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.358408 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-x4lq4" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.359990 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zlpfg" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.364387 4781 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-x4lq4 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/healthz\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.364477 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-x4lq4" podUID="cef8c4dd-6e0f-44c9-8926-72b0d821f823" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.17:8080/healthz\": dial tcp 10.217.0.17:8080: connect: connection refused" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.412633 4781 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.413460 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69ab05d8-7b1c-4314-aa8c-57d38ae2e885-catalog-content\") pod \"community-operators-5f85z\" (UID: \"69ab05d8-7b1c-4314-aa8c-57d38ae2e885\") " pod="openshift-marketplace/community-operators-5f85z" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.413866 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pqjd\" (UniqueName: \"kubernetes.io/projected/69ab05d8-7b1c-4314-aa8c-57d38ae2e885-kube-api-access-5pqjd\") pod \"community-operators-5f85z\" (UID: \"69ab05d8-7b1c-4314-aa8c-57d38ae2e885\") " pod="openshift-marketplace/community-operators-5f85z" Dec 02 09:23:04 crc kubenswrapper[4781]: E1202 09:23:04.415463 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:04.915441149 +0000 UTC m=+147.739315028 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.423814 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69ab05d8-7b1c-4314-aa8c-57d38ae2e885-catalog-content\") pod \"community-operators-5f85z\" (UID: \"69ab05d8-7b1c-4314-aa8c-57d38ae2e885\") " pod="openshift-marketplace/community-operators-5f85z" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.441471 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69ab05d8-7b1c-4314-aa8c-57d38ae2e885-utilities\") pod \"community-operators-5f85z\" (UID: \"69ab05d8-7b1c-4314-aa8c-57d38ae2e885\") " pod="openshift-marketplace/community-operators-5f85z" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.452889 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-844r4"] Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.475628 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69ab05d8-7b1c-4314-aa8c-57d38ae2e885-utilities\") pod \"community-operators-5f85z\" (UID: \"69ab05d8-7b1c-4314-aa8c-57d38ae2e885\") " pod="openshift-marketplace/community-operators-5f85z" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.514176 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pqjd\" (UniqueName: 
\"kubernetes.io/projected/69ab05d8-7b1c-4314-aa8c-57d38ae2e885-kube-api-access-5pqjd\") pod \"community-operators-5f85z\" (UID: \"69ab05d8-7b1c-4314-aa8c-57d38ae2e885\") " pod="openshift-marketplace/community-operators-5f85z" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.515551 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-844r4" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.529462 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-844r4"] Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.537531 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.543758 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t6t99" podStartSLOduration=128.543736102 podStartE2EDuration="2m8.543736102s" podCreationTimestamp="2025-12-02 09:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:23:04.467478506 +0000 UTC m=+147.291352375" watchObservedRunningTime="2025-12-02 09:23:04.543736102 +0000 UTC m=+147.367609981" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.554611 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29411115-gzjfg" podStartSLOduration=128.554595836 podStartE2EDuration="2m8.554595836s" podCreationTimestamp="2025-12-02 09:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:23:04.548424173 +0000 UTC m=+147.372298052" watchObservedRunningTime="2025-12-02 09:23:04.554595836 +0000 UTC m=+147.378469715" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.560593 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:23:04 crc kubenswrapper[4781]: E1202 09:23:04.560945 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:05.060910763 +0000 UTC m=+147.884784642 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.564973 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4sb6f" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.609881 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p8ljp" podStartSLOduration=128.609858634 podStartE2EDuration="2m8.609858634s" podCreationTimestamp="2025-12-02 09:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:23:04.6072234 +0000 UTC m=+147.431097279" watchObservedRunningTime="2025-12-02 09:23:04.609858634 +0000 UTC m=+147.433732513" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.648379 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-t59n7"] Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.655686 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t59n7" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.661667 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.661886 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87j5w\" (UniqueName: \"kubernetes.io/projected/b28a19d5-e49e-46f0-942d-dc9f96777c2d-kube-api-access-87j5w\") pod \"certified-operators-844r4\" (UID: \"b28a19d5-e49e-46f0-942d-dc9f96777c2d\") " pod="openshift-marketplace/certified-operators-844r4" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.661988 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b28a19d5-e49e-46f0-942d-dc9f96777c2d-catalog-content\") pod \"certified-operators-844r4\" (UID: \"b28a19d5-e49e-46f0-942d-dc9f96777c2d\") " pod="openshift-marketplace/certified-operators-844r4" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.662023 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b28a19d5-e49e-46f0-942d-dc9f96777c2d-utilities\") pod \"certified-operators-844r4\" (UID: \"b28a19d5-e49e-46f0-942d-dc9f96777c2d\") " pod="openshift-marketplace/certified-operators-844r4" Dec 02 09:23:04 crc kubenswrapper[4781]: E1202 09:23:04.662227 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:05.162208281 +0000 UTC m=+147.986082160 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.663173 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t59n7"] Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.688550 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w2stl" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.723186 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5f85z" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.764671 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45gcq\" (UniqueName: \"kubernetes.io/projected/0a0d43df-4480-4b62-bd3d-d129fbdcd722-kube-api-access-45gcq\") pod \"certified-operators-t59n7\" (UID: \"0a0d43df-4480-4b62-bd3d-d129fbdcd722\") " pod="openshift-marketplace/certified-operators-t59n7" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.764718 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b28a19d5-e49e-46f0-942d-dc9f96777c2d-catalog-content\") pod \"certified-operators-844r4\" (UID: \"b28a19d5-e49e-46f0-942d-dc9f96777c2d\") " pod="openshift-marketplace/certified-operators-844r4" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.764759 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b28a19d5-e49e-46f0-942d-dc9f96777c2d-utilities\") pod \"certified-operators-844r4\" (UID: \"b28a19d5-e49e-46f0-942d-dc9f96777c2d\") " pod="openshift-marketplace/certified-operators-844r4" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.764785 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a0d43df-4480-4b62-bd3d-d129fbdcd722-utilities\") pod \"certified-operators-t59n7\" (UID: \"0a0d43df-4480-4b62-bd3d-d129fbdcd722\") " pod="openshift-marketplace/certified-operators-t59n7" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.764864 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.764908 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87j5w\" (UniqueName: \"kubernetes.io/projected/b28a19d5-e49e-46f0-942d-dc9f96777c2d-kube-api-access-87j5w\") pod 
\"certified-operators-844r4\" (UID: \"b28a19d5-e49e-46f0-942d-dc9f96777c2d\") " pod="openshift-marketplace/certified-operators-844r4" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.764950 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a0d43df-4480-4b62-bd3d-d129fbdcd722-catalog-content\") pod \"certified-operators-t59n7\" (UID: \"0a0d43df-4480-4b62-bd3d-d129fbdcd722\") " pod="openshift-marketplace/certified-operators-t59n7" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.766070 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b28a19d5-e49e-46f0-942d-dc9f96777c2d-catalog-content\") pod \"certified-operators-844r4\" (UID: \"b28a19d5-e49e-46f0-942d-dc9f96777c2d\") " pod="openshift-marketplace/certified-operators-844r4" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.766214 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b28a19d5-e49e-46f0-942d-dc9f96777c2d-utilities\") pod \"certified-operators-844r4\" (UID: \"b28a19d5-e49e-46f0-942d-dc9f96777c2d\") " pod="openshift-marketplace/certified-operators-844r4" Dec 02 09:23:04 crc kubenswrapper[4781]: E1202 09:23:04.766356 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:05.266341758 +0000 UTC m=+148.090215737 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.770486 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-x4lq4" podStartSLOduration=128.762741017 podStartE2EDuration="2m8.762741017s" podCreationTimestamp="2025-12-02 09:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:23:04.698734514 +0000 UTC m=+147.522608403" watchObservedRunningTime="2025-12-02 09:23:04.762741017 +0000 UTC m=+147.586614896" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.770698 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-n6m8q" podStartSLOduration=128.770679589 podStartE2EDuration="2m8.770679589s" podCreationTimestamp="2025-12-02 09:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:23:04.760328469 +0000 UTC m=+147.584202348" watchObservedRunningTime="2025-12-02 09:23:04.770679589 +0000 UTC m=+147.594553478" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.819335 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-gltzh" Dec 02 09:23:04 
crc kubenswrapper[4781]: I1202 09:23:04.845823 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87j5w\" (UniqueName: \"kubernetes.io/projected/b28a19d5-e49e-46f0-942d-dc9f96777c2d-kube-api-access-87j5w\") pod \"certified-operators-844r4\" (UID: \"b28a19d5-e49e-46f0-942d-dc9f96777c2d\") " pod="openshift-marketplace/certified-operators-844r4" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.856911 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-wvgpb" podStartSLOduration=13.856890644 podStartE2EDuration="13.856890644s" podCreationTimestamp="2025-12-02 09:22:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:23:04.817758988 +0000 UTC m=+147.641632867" watchObservedRunningTime="2025-12-02 09:23:04.856890644 +0000 UTC m=+147.680764523" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.867338 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.867746 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a0d43df-4480-4b62-bd3d-d129fbdcd722-catalog-content\") pod \"certified-operators-t59n7\" (UID: \"0a0d43df-4480-4b62-bd3d-d129fbdcd722\") " pod="openshift-marketplace/certified-operators-t59n7" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.867899 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45gcq\" (UniqueName: \"kubernetes.io/projected/0a0d43df-4480-4b62-bd3d-d129fbdcd722-kube-api-access-45gcq\") pod \"certified-operators-t59n7\" (UID: \"0a0d43df-4480-4b62-bd3d-d129fbdcd722\") " pod="openshift-marketplace/certified-operators-t59n7" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.868035 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a0d43df-4480-4b62-bd3d-d129fbdcd722-utilities\") pod \"certified-operators-t59n7\" (UID: \"0a0d43df-4480-4b62-bd3d-d129fbdcd722\") " pod="openshift-marketplace/certified-operators-t59n7" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.870817 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a0d43df-4480-4b62-bd3d-d129fbdcd722-catalog-content\") pod \"certified-operators-t59n7\" (UID: \"0a0d43df-4480-4b62-bd3d-d129fbdcd722\") " pod="openshift-marketplace/certified-operators-t59n7" Dec 02 09:23:04 crc kubenswrapper[4781]: E1202 09:23:04.870904 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:05.370882166 +0000 UTC m=+148.194756045 (durationBeforeRetry 500ms). 
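The router, console, and downloads probe failures in this window all surface the same aggregated healthz format: one [+]name ok or [-]name failed line per registered check, reasons withheld from the probe's response body, and an overall 500 plus "healthz check failed" when any check is down — which is why the kubelet's prober logs only statuscode 500 and the start of the body. A rough sketch of such an aggregated endpoint; the check names and the failing checks are made up to mirror the router's output, not taken from its actual implementation:

package main

// Rough sketch of the aggregated healthz format quoted in the probe failures
// ("[-]backend-http failed: reason withheld", "[+]process-running ok",
// "healthz check failed"). Check names here are illustrative.

import (
	"fmt"
	"net/http"
)

type check struct {
	name string
	run  func() error
}

func healthz(checks []check) http.HandlerFunc {
	return func(w http.ResponseWriter, r *http.Request) {
		body, failed := "", false
		for _, c := range checks {
			if err := c.run(); err != nil {
				failed = true
				// Reasons are withheld from the response body (and logged
				// server-side), which is why the kubelet only sees this line.
				body += fmt.Sprintf("[-]%s failed: reason withheld\n", c.name)
			} else {
				body += fmt.Sprintf("[+]%s ok\n", c.name)
			}
		}
		if failed {
			w.WriteHeader(http.StatusInternalServerError) // the statuscode 500 in the probe log
			body += "healthz check failed\n"
		}
		fmt.Fprint(w, body)
	}
}

func main() {
	http.HandleFunc("/healthz", healthz([]check{
		{"backend-http", func() error { return fmt.Errorf("backend not ready") }},
		{"has-synced", func() error { return fmt.Errorf("not synced") }},
		{"process-running", func() error { return nil }},
	}))
	http.ListenAndServe(":8080", nil)
}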
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.871954 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a0d43df-4480-4b62-bd3d-d129fbdcd722-utilities\") pod \"certified-operators-t59n7\" (UID: \"0a0d43df-4480-4b62-bd3d-d129fbdcd722\") " pod="openshift-marketplace/certified-operators-t59n7" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.916629 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45gcq\" (UniqueName: \"kubernetes.io/projected/0a0d43df-4480-4b62-bd3d-d129fbdcd722-kube-api-access-45gcq\") pod \"certified-operators-t59n7\" (UID: \"0a0d43df-4480-4b62-bd3d-d129fbdcd722\") " pod="openshift-marketplace/certified-operators-t59n7" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.917286 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wjx4q" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.927335 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-8ng22" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.930489 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lbdsz" podStartSLOduration=128.930470875 podStartE2EDuration="2m8.930470875s" podCreationTimestamp="2025-12-02 09:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:23:04.900890397 +0000 UTC m=+147.724764296" watchObservedRunningTime="2025-12-02 09:23:04.930470875 +0000 UTC m=+147.754344764" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.933687 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wjx4q" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.951256 4781 patch_prober.go:28] interesting pod/router-default-5444994796-8ng22 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 09:23:04 crc kubenswrapper[4781]: [-]has-synced failed: reason withheld Dec 02 09:23:04 crc kubenswrapper[4781]: [+]process-running ok Dec 02 09:23:04 crc kubenswrapper[4781]: healthz check failed Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.951308 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8ng22" podUID="22ac0012-d7d1-4b53-a0e7-1ca1d09832b9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.975765 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:23:04 crc kubenswrapper[4781]: E1202 09:23:04.976092 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:05.476081353 +0000 UTC m=+148.299955222 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:04 crc kubenswrapper[4781]: I1202 09:23:04.982770 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" podStartSLOduration=128.98275137 podStartE2EDuration="2m8.98275137s" podCreationTimestamp="2025-12-02 09:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:23:04.979577841 +0000 UTC m=+147.803451720" watchObservedRunningTime="2025-12-02 09:23:04.98275137 +0000 UTC m=+147.806625259" Dec 02 09:23:05 crc kubenswrapper[4781]: I1202 09:23:05.005611 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-844r4" Dec 02 09:23:05 crc kubenswrapper[4781]: I1202 09:23:05.032043 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t59n7" Dec 02 09:23:05 crc kubenswrapper[4781]: I1202 09:23:05.040545 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-n9cmc" podStartSLOduration=129.040529438 podStartE2EDuration="2m9.040529438s" podCreationTimestamp="2025-12-02 09:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:23:05.038889383 +0000 UTC m=+147.862763262" watchObservedRunningTime="2025-12-02 09:23:05.040529438 +0000 UTC m=+147.864403317" Dec 02 09:23:05 crc kubenswrapper[4781]: I1202 09:23:05.080129 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:23:05 crc kubenswrapper[4781]: E1202 09:23:05.080644 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:05.580621552 +0000 UTC m=+148.404495431 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:05 crc kubenswrapper[4781]: I1202 09:23:05.080744 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:23:05 crc kubenswrapper[4781]: E1202 09:23:05.081495 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:05.581483605 +0000 UTC m=+148.405357484 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:05 crc kubenswrapper[4781]: I1202 09:23:05.080157 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7n69p" podStartSLOduration=129.080136177 podStartE2EDuration="2m9.080136177s" podCreationTimestamp="2025-12-02 09:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:23:05.072579876 +0000 UTC m=+147.896453755" watchObservedRunningTime="2025-12-02 09:23:05.080136177 +0000 UTC m=+147.904010056" Dec 02 09:23:05 crc kubenswrapper[4781]: I1202 09:23:05.145686 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wjx4q" podStartSLOduration=129.145660103 podStartE2EDuration="2m9.145660103s" podCreationTimestamp="2025-12-02 09:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:23:05.13340906 +0000 UTC m=+147.957282939" watchObservedRunningTime="2025-12-02 09:23:05.145660103 +0000 UTC m=+147.969533982" Dec 02 09:23:05 crc kubenswrapper[4781]: I1202 09:23:05.193805 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:23:05 crc kubenswrapper[4781]: E1202 09:23:05.194545 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:05.694528102 +0000 UTC m=+148.518401981 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:05 crc kubenswrapper[4781]: I1202 09:23:05.229664 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-sjskm" podStartSLOduration=129.229643616 podStartE2EDuration="2m9.229643616s" podCreationTimestamp="2025-12-02 09:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:23:05.21874493 +0000 UTC m=+148.042618809" watchObservedRunningTime="2025-12-02 09:23:05.229643616 +0000 UTC m=+148.053517505" Dec 02 09:23:05 crc kubenswrapper[4781]: I1202 09:23:05.243780 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w2stl" Dec 02 09:23:05 crc kubenswrapper[4781]: I1202 09:23:05.266554 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dpqgc" podStartSLOduration=129.266534109 podStartE2EDuration="2m9.266534109s" podCreationTimestamp="2025-12-02 09:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:23:05.266130628 +0000 UTC m=+148.090004527" watchObservedRunningTime="2025-12-02 09:23:05.266534109 +0000 UTC m=+148.090407988" Dec 02 09:23:05 crc kubenswrapper[4781]: I1202 09:23:05.297018 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:23:05 crc kubenswrapper[4781]: E1202 09:23:05.297407 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:05.797392373 +0000 UTC m=+148.621266252 (durationBeforeRetry 500ms). 
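The pod_startup_latency_tracker lines here can be checked by hand: podStartSLOduration is the watch-observed running time minus podCreationTimestamp, with image-pull time excluded when pulls occurred (firstStartedPulling is the zero time throughout this excerpt, so the SLO and E2E durations coincide). A worked check of the service-ca line above:

package main

// Worked check of the service-ca-9c57cc56f-sjskm startup-latency line:
// podStartSLOduration = watchObservedRunningTime - podCreationTimestamp
// (no pull time to subtract in this excerpt).

import (
	"fmt"
	"time"
)

func main() {
	created, _ := time.Parse(time.RFC3339, "2025-12-02T09:20:56Z")
	running, _ := time.Parse(time.RFC3339Nano, "2025-12-02T09:23:05.229643616Z")
	// Prints 2m9.229643616s, i.e. the logged podStartSLOduration=129.229643616.
	fmt.Println(running.Sub(created))
}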
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:05 crc kubenswrapper[4781]: I1202 09:23:05.308886 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nm2f4" podStartSLOduration=129.308870235 podStartE2EDuration="2m9.308870235s" podCreationTimestamp="2025-12-02 09:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:23:05.308239727 +0000 UTC m=+148.132113606" watchObservedRunningTime="2025-12-02 09:23:05.308870235 +0000 UTC m=+148.132744114" Dec 02 09:23:05 crc kubenswrapper[4781]: I1202 09:23:05.366046 4781 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-zlpfg container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 02 09:23:05 crc kubenswrapper[4781]: I1202 09:23:05.366471 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zlpfg" podUID="ff4d4965-cf17-4d57-8820-fec4107ba62d" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.35:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 02 09:23:05 crc kubenswrapper[4781]: I1202 09:23:05.388044 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zlpfg" podStartSLOduration=129.388026952 podStartE2EDuration="2m9.388026952s" podCreationTimestamp="2025-12-02 09:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:23:05.341644683 +0000 UTC m=+148.165518562" watchObservedRunningTime="2025-12-02 09:23:05.388026952 +0000 UTC m=+148.211900831" Dec 02 09:23:05 crc kubenswrapper[4781]: I1202 09:23:05.389272 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4sb6f" podStartSLOduration=129.389268097 podStartE2EDuration="2m9.389268097s" podCreationTimestamp="2025-12-02 09:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:23:05.387134458 +0000 UTC m=+148.211008337" watchObservedRunningTime="2025-12-02 09:23:05.389268097 +0000 UTC m=+148.213141966" Dec 02 09:23:05 crc kubenswrapper[4781]: I1202 09:23:05.411103 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:23:05 crc 
kubenswrapper[4781]: E1202 09:23:05.411434 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:05.911420238 +0000 UTC m=+148.735294117 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:05 crc kubenswrapper[4781]: I1202 09:23:05.473944 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-9k9bd" event={"ID":"04a6248c-9bb7-4204-a19a-1041d4d06f3e","Type":"ContainerStarted","Data":"9ec37ec96d66eafaa4b6421a14ba613cfe7c9f0a047fab6eb3ef9e1501bd104f"} Dec 02 09:23:05 crc kubenswrapper[4781]: I1202 09:23:05.475499 4781 generic.go:334] "Generic (PLEG): container finished" podID="70c23dd9-b165-4f35-9230-da18a16f48be" containerID="ce3239c217f4a3357f252e1204730b8c33889a9b2c8970a6d82b0804b93be9de" exitCode=0 Dec 02 09:23:05 crc kubenswrapper[4781]: I1202 09:23:05.475594 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411115-gzjfg" event={"ID":"70c23dd9-b165-4f35-9230-da18a16f48be","Type":"ContainerDied","Data":"ce3239c217f4a3357f252e1204730b8c33889a9b2c8970a6d82b0804b93be9de"} Dec 02 09:23:05 crc kubenswrapper[4781]: I1202 09:23:05.492129 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8k8z6" podStartSLOduration=129.492111117 podStartE2EDuration="2m9.492111117s" podCreationTimestamp="2025-12-02 09:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:23:05.447499229 +0000 UTC m=+148.271373108" watchObservedRunningTime="2025-12-02 09:23:05.492111117 +0000 UTC m=+148.315985006" Dec 02 09:23:05 crc kubenswrapper[4781]: I1202 09:23:05.507847 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l6hrl" podStartSLOduration=129.507833149 podStartE2EDuration="2m9.507833149s" podCreationTimestamp="2025-12-02 09:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:23:05.49434645 +0000 UTC m=+148.318220329" watchObservedRunningTime="2025-12-02 09:23:05.507833149 +0000 UTC m=+148.331707028" Dec 02 09:23:05 crc kubenswrapper[4781]: I1202 09:23:05.521780 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:23:05 crc kubenswrapper[4781]: E1202 09:23:05.522171 4781 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:06.02215702 +0000 UTC m=+148.846030899 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:05 crc kubenswrapper[4781]: I1202 09:23:05.522357 4781 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-x4lq4 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/healthz\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Dec 02 09:23:05 crc kubenswrapper[4781]: I1202 09:23:05.522466 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-x4lq4" podUID="cef8c4dd-6e0f-44c9-8926-72b0d821f823" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.17:8080/healthz\": dial tcp 10.217.0.17:8080: connect: connection refused" Dec 02 09:23:05 crc kubenswrapper[4781]: I1202 09:23:05.578415 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-lhfvc" Dec 02 09:23:05 crc kubenswrapper[4781]: I1202 09:23:05.578579 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-r42hh" event={"ID":"ee42da42-30b7-40ef-b00c-4d61e25502e0","Type":"ContainerStarted","Data":"536364180ef5aded5423c942657ff343f6f63297d0fcfaa6658be933774e74c6"} Dec 02 09:23:05 crc kubenswrapper[4781]: I1202 09:23:05.578661 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"b42fbdd866e0e647f43327326b2724b34e4f451df203d1907ffb1722426cb2a5"} Dec 02 09:23:05 crc kubenswrapper[4781]: I1202 09:23:05.590095 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lt8cm" podStartSLOduration=129.590069861 podStartE2EDuration="2m9.590069861s" podCreationTimestamp="2025-12-02 09:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:23:05.578283302 +0000 UTC m=+148.402157181" watchObservedRunningTime="2025-12-02 09:23:05.590069861 +0000 UTC m=+148.413943740" Dec 02 09:23:05 crc kubenswrapper[4781]: I1202 09:23:05.623047 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:23:05 crc kubenswrapper[4781]: E1202 09:23:05.625199 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:06.125174716 +0000 UTC m=+148.949048605 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:05 crc kubenswrapper[4781]: I1202 09:23:05.681964 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ntjsd"] Dec 02 09:23:05 crc kubenswrapper[4781]: I1202 09:23:05.726114 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:23:05 crc kubenswrapper[4781]: E1202 09:23:05.727706 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:06.227694927 +0000 UTC m=+149.051568806 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:05 crc kubenswrapper[4781]: I1202 09:23:05.729617 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-9k9bd" podStartSLOduration=129.72960434 podStartE2EDuration="2m9.72960434s" podCreationTimestamp="2025-12-02 09:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:23:05.724053815 +0000 UTC m=+148.547927684" watchObservedRunningTime="2025-12-02 09:23:05.72960434 +0000 UTC m=+148.553478219" Dec 02 09:23:05 crc kubenswrapper[4781]: I1202 09:23:05.777184 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5f85z"] Dec 02 09:23:05 crc kubenswrapper[4781]: I1202 09:23:05.836158 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:23:05 crc kubenswrapper[4781]: E1202 09:23:05.836505 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-02 09:23:06.336490664 +0000 UTC m=+149.160364543 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:05 crc kubenswrapper[4781]: I1202 09:23:05.888422 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-r42hh" podStartSLOduration=129.888409119 podStartE2EDuration="2m9.888409119s" podCreationTimestamp="2025-12-02 09:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:23:05.885476156 +0000 UTC m=+148.709350035" watchObservedRunningTime="2025-12-02 09:23:05.888409119 +0000 UTC m=+148.712282998" Dec 02 09:23:05 crc kubenswrapper[4781]: I1202 09:23:05.947674 4781 patch_prober.go:28] interesting pod/router-default-5444994796-8ng22 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 09:23:05 crc kubenswrapper[4781]: [-]has-synced failed: reason withheld Dec 02 09:23:05 crc kubenswrapper[4781]: [+]process-running ok Dec 02 09:23:05 crc kubenswrapper[4781]: healthz check failed Dec 02 09:23:05 crc kubenswrapper[4781]: I1202 09:23:05.947724 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8ng22" podUID="22ac0012-d7d1-4b53-a0e7-1ca1d09832b9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 09:23:05 crc kubenswrapper[4781]: I1202 09:23:05.974185 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-lhfvc" podStartSLOduration=14.974138461 podStartE2EDuration="14.974138461s" podCreationTimestamp="2025-12-02 09:22:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:23:05.969105759 +0000 UTC m=+148.792979638" watchObservedRunningTime="2025-12-02 09:23:05.974138461 +0000 UTC m=+148.798012360" Dec 02 09:23:05 crc kubenswrapper[4781]: I1202 09:23:05.979866 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:23:05 crc kubenswrapper[4781]: E1202 09:23:05.980414 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:06.480397725 +0000 UTC m=+149.304271604 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:05 crc kubenswrapper[4781]: I1202 09:23:05.998792 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-844r4"] Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.019022 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8pgk8"] Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.020404 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8pgk8" Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.035806 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8pgk8"] Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.046332 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.081061 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.081561 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4734fd7-42d6-4b87-9160-5a3471f91d03-utilities\") pod \"redhat-marketplace-8pgk8\" (UID: \"c4734fd7-42d6-4b87-9160-5a3471f91d03\") " pod="openshift-marketplace/redhat-marketplace-8pgk8" Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.081661 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4734fd7-42d6-4b87-9160-5a3471f91d03-catalog-content\") pod \"redhat-marketplace-8pgk8\" (UID: \"c4734fd7-42d6-4b87-9160-5a3471f91d03\") " pod="openshift-marketplace/redhat-marketplace-8pgk8" Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.081737 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpxf4\" (UniqueName: \"kubernetes.io/projected/c4734fd7-42d6-4b87-9160-5a3471f91d03-kube-api-access-cpxf4\") pod \"redhat-marketplace-8pgk8\" (UID: \"c4734fd7-42d6-4b87-9160-5a3471f91d03\") " pod="openshift-marketplace/redhat-marketplace-8pgk8" Dec 02 09:23:06 crc kubenswrapper[4781]: E1202 09:23:06.081881 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:06.581864808 +0000 UTC m=+149.405738687 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:06 crc kubenswrapper[4781]: W1202 09:23:06.114753 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb28a19d5_e49e_46f0_942d_dc9f96777c2d.slice/crio-8edb6c67ecc9174be59ae3ab68b4da1d959f1b977722c7bd0d6abbb15e91b6da WatchSource:0}: Error finding container 8edb6c67ecc9174be59ae3ab68b4da1d959f1b977722c7bd0d6abbb15e91b6da: Status 404 returned error can't find the container with id 8edb6c67ecc9174be59ae3ab68b4da1d959f1b977722c7bd0d6abbb15e91b6da Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.183774 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.183843 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4734fd7-42d6-4b87-9160-5a3471f91d03-utilities\") pod \"redhat-marketplace-8pgk8\" (UID: \"c4734fd7-42d6-4b87-9160-5a3471f91d03\") " pod="openshift-marketplace/redhat-marketplace-8pgk8" Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.183885 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4734fd7-42d6-4b87-9160-5a3471f91d03-catalog-content\") pod \"redhat-marketplace-8pgk8\" (UID: \"c4734fd7-42d6-4b87-9160-5a3471f91d03\") " pod="openshift-marketplace/redhat-marketplace-8pgk8" Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.183906 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpxf4\" (UniqueName: \"kubernetes.io/projected/c4734fd7-42d6-4b87-9160-5a3471f91d03-kube-api-access-cpxf4\") pod \"redhat-marketplace-8pgk8\" (UID: \"c4734fd7-42d6-4b87-9160-5a3471f91d03\") " pod="openshift-marketplace/redhat-marketplace-8pgk8" Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.184621 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4734fd7-42d6-4b87-9160-5a3471f91d03-utilities\") pod \"redhat-marketplace-8pgk8\" (UID: \"c4734fd7-42d6-4b87-9160-5a3471f91d03\") " pod="openshift-marketplace/redhat-marketplace-8pgk8" Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.184811 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4734fd7-42d6-4b87-9160-5a3471f91d03-catalog-content\") pod \"redhat-marketplace-8pgk8\" (UID: \"c4734fd7-42d6-4b87-9160-5a3471f91d03\") " pod="openshift-marketplace/redhat-marketplace-8pgk8" Dec 02 09:23:06 crc kubenswrapper[4781]: E1202 09:23:06.185149 4781 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:06.68513032 +0000 UTC m=+149.509004269 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.224166 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpxf4\" (UniqueName: \"kubernetes.io/projected/c4734fd7-42d6-4b87-9160-5a3471f91d03-kube-api-access-cpxf4\") pod \"redhat-marketplace-8pgk8\" (UID: \"c4734fd7-42d6-4b87-9160-5a3471f91d03\") " pod="openshift-marketplace/redhat-marketplace-8pgk8" Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.235835 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t59n7"] Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.274882 4781 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-zlpfg container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.35:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.274963 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zlpfg" podUID="ff4d4965-cf17-4d57-8820-fec4107ba62d" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.35:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.284769 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:23:06 crc kubenswrapper[4781]: E1202 09:23:06.285131 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:06.785111721 +0000 UTC m=+149.608985600 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.363031 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8pgk8" Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.366729 4781 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-zlpfg container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.366798 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zlpfg" podUID="ff4d4965-cf17-4d57-8820-fec4107ba62d" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.35:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.387711 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:23:06 crc kubenswrapper[4781]: E1202 09:23:06.388127 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:06.888115026 +0000 UTC m=+149.711988905 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.415781 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8d48v"] Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.417142 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8d48v" Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.463341 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8d48v"] Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.488990 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.489133 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e5610cf-1c3f-4010-a4ee-4820372400c1-catalog-content\") pod \"redhat-marketplace-8d48v\" (UID: \"5e5610cf-1c3f-4010-a4ee-4820372400c1\") " pod="openshift-marketplace/redhat-marketplace-8d48v" Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.489161 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lmdl\" (UniqueName: \"kubernetes.io/projected/5e5610cf-1c3f-4010-a4ee-4820372400c1-kube-api-access-7lmdl\") pod \"redhat-marketplace-8d48v\" (UID: \"5e5610cf-1c3f-4010-a4ee-4820372400c1\") " pod="openshift-marketplace/redhat-marketplace-8d48v" Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.489237 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e5610cf-1c3f-4010-a4ee-4820372400c1-utilities\") pod \"redhat-marketplace-8d48v\" (UID: \"5e5610cf-1c3f-4010-a4ee-4820372400c1\") " pod="openshift-marketplace/redhat-marketplace-8d48v" Dec 02 09:23:06 crc kubenswrapper[4781]: E1202 09:23:06.489336 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:06.989321182 +0000 UTC m=+149.813195061 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.573656 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t59n7" event={"ID":"0a0d43df-4480-4b62-bd3d-d129fbdcd722","Type":"ContainerStarted","Data":"55acb068bdb880988b70af6a22949feb297af148d64358c4ce1bc283e7a20d47"} Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.591956 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.592030 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e5610cf-1c3f-4010-a4ee-4820372400c1-utilities\") pod \"redhat-marketplace-8d48v\" (UID: \"5e5610cf-1c3f-4010-a4ee-4820372400c1\") " pod="openshift-marketplace/redhat-marketplace-8d48v" Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.592066 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e5610cf-1c3f-4010-a4ee-4820372400c1-catalog-content\") pod \"redhat-marketplace-8d48v\" (UID: \"5e5610cf-1c3f-4010-a4ee-4820372400c1\") " pod="openshift-marketplace/redhat-marketplace-8d48v" Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.592097 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lmdl\" (UniqueName: \"kubernetes.io/projected/5e5610cf-1c3f-4010-a4ee-4820372400c1-kube-api-access-7lmdl\") pod \"redhat-marketplace-8d48v\" (UID: \"5e5610cf-1c3f-4010-a4ee-4820372400c1\") " pod="openshift-marketplace/redhat-marketplace-8d48v" Dec 02 09:23:06 crc kubenswrapper[4781]: E1202 09:23:06.592681 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:07.092669017 +0000 UTC m=+149.916542896 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.593240 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e5610cf-1c3f-4010-a4ee-4820372400c1-utilities\") pod \"redhat-marketplace-8d48v\" (UID: \"5e5610cf-1c3f-4010-a4ee-4820372400c1\") " pod="openshift-marketplace/redhat-marketplace-8d48v" Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.593467 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e5610cf-1c3f-4010-a4ee-4820372400c1-catalog-content\") pod \"redhat-marketplace-8d48v\" (UID: \"5e5610cf-1c3f-4010-a4ee-4820372400c1\") " pod="openshift-marketplace/redhat-marketplace-8d48v" Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.601240 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5f85z" event={"ID":"69ab05d8-7b1c-4314-aa8c-57d38ae2e885","Type":"ContainerStarted","Data":"a505efafd52bfc5af7cc5405e132cd80711d2bc0e6fbe34a3c6597105c832243"} Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.621829 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ntjsd" event={"ID":"9170fc18-624b-4358-b931-7e889eee7317","Type":"ContainerStarted","Data":"bf04ace9f66f9598a823a6b8ba25e7d4fe21d2190232688a005a8eafbe4613b7"} Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.632870 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"05bde53fa5b72045746e458521412d3b08047758e8ca735d08854ded4ab7bb2c"} Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.648887 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lmdl\" (UniqueName: \"kubernetes.io/projected/5e5610cf-1c3f-4010-a4ee-4820372400c1-kube-api-access-7lmdl\") pod \"redhat-marketplace-8d48v\" (UID: \"5e5610cf-1c3f-4010-a4ee-4820372400c1\") " pod="openshift-marketplace/redhat-marketplace-8d48v" Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.655202 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"2de957402c411d309490bcd12ae8878ab858b035966e7317941e0d385e9d6ca0"} Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.676978 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-844r4" event={"ID":"b28a19d5-e49e-46f0-942d-dc9f96777c2d","Type":"ContainerStarted","Data":"8edb6c67ecc9174be59ae3ab68b4da1d959f1b977722c7bd0d6abbb15e91b6da"} Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.696907 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:23:06 crc kubenswrapper[4781]: E1202 09:23:06.698131 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:07.19811185 +0000 UTC m=+150.021985739 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.738197 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8d48v" Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.802734 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:23:06 crc kubenswrapper[4781]: E1202 09:23:06.804180 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:07.304165411 +0000 UTC m=+150.128039300 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.908168 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:23:06 crc kubenswrapper[4781]: E1202 09:23:06.908670 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:07.408650307 +0000 UTC m=+150.232524186 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.923264 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.923842 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.927748 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.936661 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.943078 4781 patch_prober.go:28] interesting pod/router-default-5444994796-8ng22 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 09:23:06 crc kubenswrapper[4781]: [-]has-synced failed: reason withheld Dec 02 09:23:06 crc kubenswrapper[4781]: [+]process-running ok Dec 02 09:23:06 crc kubenswrapper[4781]: healthz check failed Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.943119 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8ng22" podUID="22ac0012-d7d1-4b53-a0e7-1ca1d09832b9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.945883 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 02 09:23:06 crc kubenswrapper[4781]: I1202 09:23:06.985058 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8pgk8"] Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:06.998272 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zlpfg" Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.009826 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aeb9d041-5dd8-4102-a1a9-ad98b865a9fa-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"aeb9d041-5dd8-4102-a1a9-ad98b865a9fa\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.009881 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.009934 4781 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/aeb9d041-5dd8-4102-a1a9-ad98b865a9fa-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"aeb9d041-5dd8-4102-a1a9-ad98b865a9fa\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 09:23:07 crc kubenswrapper[4781]: E1202 09:23:07.010247 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:07.510231713 +0000 UTC m=+150.334105592 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:07 crc kubenswrapper[4781]: W1202 09:23:07.012093 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4734fd7_42d6_4b87_9160_5a3471f91d03.slice/crio-ce89cef24bcfcae7d6cfdd8e92e1aaff06cda7b76730583203906365f0fff65f WatchSource:0}: Error finding container ce89cef24bcfcae7d6cfdd8e92e1aaff06cda7b76730583203906365f0fff65f: Status 404 returned error can't find the container with id ce89cef24bcfcae7d6cfdd8e92e1aaff06cda7b76730583203906365f0fff65f Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.112110 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.113225 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.113601 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aeb9d041-5dd8-4102-a1a9-ad98b865a9fa-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"aeb9d041-5dd8-4102-a1a9-ad98b865a9fa\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.113790 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/aeb9d041-5dd8-4102-a1a9-ad98b865a9fa-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"aeb9d041-5dd8-4102-a1a9-ad98b865a9fa\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.114144 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/aeb9d041-5dd8-4102-a1a9-ad98b865a9fa-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"aeb9d041-5dd8-4102-a1a9-ad98b865a9fa\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 09:23:07 crc kubenswrapper[4781]: E1202 09:23:07.114634 4781 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:07.614611216 +0000 UTC m=+150.438485165 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.157411 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aeb9d041-5dd8-4102-a1a9-ad98b865a9fa-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"aeb9d041-5dd8-4102-a1a9-ad98b865a9fa\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.220841 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:23:07 crc kubenswrapper[4781]: E1202 09:23:07.222486 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:07.722469398 +0000 UTC m=+150.546343277 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.224435 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9bkjh"] Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.225375 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9bkjh" Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.235815 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.250069 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9bkjh"] Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.250406 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.322288 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:23:07 crc kubenswrapper[4781]: E1202 09:23:07.322461 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:07.822434398 +0000 UTC m=+150.646308267 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.322800 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bbd8e20-1afd-4e4a-a689-e77ae2042bfb-catalog-content\") pod \"redhat-operators-9bkjh\" (UID: \"2bbd8e20-1afd-4e4a-a689-e77ae2042bfb\") " pod="openshift-marketplace/redhat-operators-9bkjh" Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.322827 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7wbd\" (UniqueName: \"kubernetes.io/projected/2bbd8e20-1afd-4e4a-a689-e77ae2042bfb-kube-api-access-n7wbd\") pod \"redhat-operators-9bkjh\" (UID: \"2bbd8e20-1afd-4e4a-a689-e77ae2042bfb\") " pod="openshift-marketplace/redhat-operators-9bkjh" Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.322888 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.322915 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bbd8e20-1afd-4e4a-a689-e77ae2042bfb-utilities\") pod \"redhat-operators-9bkjh\" (UID: \"2bbd8e20-1afd-4e4a-a689-e77ae2042bfb\") " pod="openshift-marketplace/redhat-operators-9bkjh" Dec 02 09:23:07 crc kubenswrapper[4781]: E1202 09:23:07.323409 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:07.823394045 +0000 UTC m=+150.647267924 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.342914 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411115-gzjfg"
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.423474 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pc7m\" (UniqueName: \"kubernetes.io/projected/70c23dd9-b165-4f35-9230-da18a16f48be-kube-api-access-4pc7m\") pod \"70c23dd9-b165-4f35-9230-da18a16f48be\" (UID: \"70c23dd9-b165-4f35-9230-da18a16f48be\") "
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.423546 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/70c23dd9-b165-4f35-9230-da18a16f48be-config-volume\") pod \"70c23dd9-b165-4f35-9230-da18a16f48be\" (UID: \"70c23dd9-b165-4f35-9230-da18a16f48be\") "
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.423670 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.423704 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/70c23dd9-b165-4f35-9230-da18a16f48be-secret-volume\") pod \"70c23dd9-b165-4f35-9230-da18a16f48be\" (UID: \"70c23dd9-b165-4f35-9230-da18a16f48be\") "
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.423898 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bbd8e20-1afd-4e4a-a689-e77ae2042bfb-catalog-content\") pod \"redhat-operators-9bkjh\" (UID: \"2bbd8e20-1afd-4e4a-a689-e77ae2042bfb\") " pod="openshift-marketplace/redhat-operators-9bkjh"
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.423969 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7wbd\" (UniqueName: \"kubernetes.io/projected/2bbd8e20-1afd-4e4a-a689-e77ae2042bfb-kube-api-access-n7wbd\") pod \"redhat-operators-9bkjh\" (UID: \"2bbd8e20-1afd-4e4a-a689-e77ae2042bfb\") " pod="openshift-marketplace/redhat-operators-9bkjh"
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.424021 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bbd8e20-1afd-4e4a-a689-e77ae2042bfb-utilities\") pod \"redhat-operators-9bkjh\" (UID: \"2bbd8e20-1afd-4e4a-a689-e77ae2042bfb\") " pod="openshift-marketplace/redhat-operators-9bkjh"
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.424436 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bbd8e20-1afd-4e4a-a689-e77ae2042bfb-utilities\") pod \"redhat-operators-9bkjh\" (UID: \"2bbd8e20-1afd-4e4a-a689-e77ae2042bfb\") " pod="openshift-marketplace/redhat-operators-9bkjh"
Dec 02 09:23:07 crc kubenswrapper[4781]: E1202 09:23:07.426539 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:07.926515284 +0000 UTC m=+150.750389153 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.426718 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bbd8e20-1afd-4e4a-a689-e77ae2042bfb-catalog-content\") pod \"redhat-operators-9bkjh\" (UID: \"2bbd8e20-1afd-4e4a-a689-e77ae2042bfb\") " pod="openshift-marketplace/redhat-operators-9bkjh"
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.427133 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70c23dd9-b165-4f35-9230-da18a16f48be-config-volume" (OuterVolumeSpecName: "config-volume") pod "70c23dd9-b165-4f35-9230-da18a16f48be" (UID: "70c23dd9-b165-4f35-9230-da18a16f48be"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.429660 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70c23dd9-b165-4f35-9230-da18a16f48be-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "70c23dd9-b165-4f35-9230-da18a16f48be" (UID: "70c23dd9-b165-4f35-9230-da18a16f48be"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.436483 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70c23dd9-b165-4f35-9230-da18a16f48be-kube-api-access-4pc7m" (OuterVolumeSpecName: "kube-api-access-4pc7m") pod "70c23dd9-b165-4f35-9230-da18a16f48be" (UID: "70c23dd9-b165-4f35-9230-da18a16f48be"). InnerVolumeSpecName "kube-api-access-4pc7m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.443325 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7wbd\" (UniqueName: \"kubernetes.io/projected/2bbd8e20-1afd-4e4a-a689-e77ae2042bfb-kube-api-access-n7wbd\") pod \"redhat-operators-9bkjh\" (UID: \"2bbd8e20-1afd-4e4a-a689-e77ae2042bfb\") " pod="openshift-marketplace/redhat-operators-9bkjh"
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.456001 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8d48v"]
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.524850 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx"
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.525229 4781 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/70c23dd9-b165-4f35-9230-da18a16f48be-config-volume\") on node \"crc\" DevicePath \"\""
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.525243 4781 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/70c23dd9-b165-4f35-9230-da18a16f48be-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.525258 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pc7m\" (UniqueName: \"kubernetes.io/projected/70c23dd9-b165-4f35-9230-da18a16f48be-kube-api-access-4pc7m\") on node \"crc\" DevicePath \"\""
Dec 02 09:23:07 crc kubenswrapper[4781]: E1202 09:23:07.525658 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:08.025644041 +0000 UTC m=+150.849517920 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.574531 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9bkjh"
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.601669 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.617092 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4rfsl"]
Dec 02 09:23:07 crc kubenswrapper[4781]: E1202 09:23:07.617315 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70c23dd9-b165-4f35-9230-da18a16f48be" containerName="collect-profiles"
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.617328 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="70c23dd9-b165-4f35-9230-da18a16f48be" containerName="collect-profiles"
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.617445 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="70c23dd9-b165-4f35-9230-da18a16f48be" containerName="collect-profiles"
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.618148 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4rfsl"
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.626579 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.626780 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkkgc\" (UniqueName: \"kubernetes.io/projected/f37a1838-a130-4cb4-807f-2214cbfefdbf-kube-api-access-lkkgc\") pod \"redhat-operators-4rfsl\" (UID: \"f37a1838-a130-4cb4-807f-2214cbfefdbf\") " pod="openshift-marketplace/redhat-operators-4rfsl"
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.626871 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f37a1838-a130-4cb4-807f-2214cbfefdbf-utilities\") pod \"redhat-operators-4rfsl\" (UID: \"f37a1838-a130-4cb4-807f-2214cbfefdbf\") " pod="openshift-marketplace/redhat-operators-4rfsl"
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.626897 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f37a1838-a130-4cb4-807f-2214cbfefdbf-catalog-content\") pod \"redhat-operators-4rfsl\" (UID: \"f37a1838-a130-4cb4-807f-2214cbfefdbf\") " pod="openshift-marketplace/redhat-operators-4rfsl"
Dec 02 09:23:07 crc kubenswrapper[4781]: E1202 09:23:07.627006 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:08.126993089 +0000 UTC m=+150.950866968 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.629962 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4rfsl"]
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.692230 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cfwrp" event={"ID":"11d24c84-34b9-46a0-9d24-65ca291b4ac6","Type":"ContainerStarted","Data":"a3552dd660c3c1a7e29da9bcb07c6d550deb21ed1f95a958c68177a065fb0e09"}
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.694188 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"f00c729fde4eb58b36910ecde6792a8b0f89f63ca891483337543e4526f1d1a0"}
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.697020 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8pgk8" event={"ID":"c4734fd7-42d6-4b87-9160-5a3471f91d03","Type":"ContainerStarted","Data":"ce89cef24bcfcae7d6cfdd8e92e1aaff06cda7b76730583203906365f0fff65f"}
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.702331 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8d48v" event={"ID":"5e5610cf-1c3f-4010-a4ee-4820372400c1","Type":"ContainerStarted","Data":"45f718a8ef20054dc81ffccfb02f3a01180398d4c6ff5126953f53be8c08d080"}
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.706019 4781 generic.go:334] "Generic (PLEG): container finished" podID="b28a19d5-e49e-46f0-942d-dc9f96777c2d" containerID="2a618bd42c0bddbc25e8bd8117042482cdec79b549874b67493a989aaad70345" exitCode=0
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.706081 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-844r4" event={"ID":"b28a19d5-e49e-46f0-942d-dc9f96777c2d","Type":"ContainerDied","Data":"2a618bd42c0bddbc25e8bd8117042482cdec79b549874b67493a989aaad70345"}
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.707854 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"48c7c4c57562b2c7ae7008ff8b0b4ba44a391cd45ad5d381f5899c6ad8464f85"}
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.708504 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.711101 4781 generic.go:334] "Generic (PLEG): container finished" podID="0a0d43df-4480-4b62-bd3d-d129fbdcd722" containerID="4b2f67bea61bb2389d0202658b9278243091d0e6578c1be014a78ce40b42cf4b" exitCode=0
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.711142 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t59n7" event={"ID":"0a0d43df-4480-4b62-bd3d-d129fbdcd722","Type":"ContainerDied","Data":"4b2f67bea61bb2389d0202658b9278243091d0e6578c1be014a78ce40b42cf4b"}
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.712398 4781 generic.go:334] "Generic (PLEG): container finished" podID="69ab05d8-7b1c-4314-aa8c-57d38ae2e885" containerID="81263bcef9dd5f6862cde4a7b86658cbb713c144bde22195c7495383610ae853" exitCode=0
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.712478 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5f85z" event={"ID":"69ab05d8-7b1c-4314-aa8c-57d38ae2e885","Type":"ContainerDied","Data":"81263bcef9dd5f6862cde4a7b86658cbb713c144bde22195c7495383610ae853"}
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.714147 4781 generic.go:334] "Generic (PLEG): container finished" podID="9170fc18-624b-4358-b931-7e889eee7317" containerID="4a80fef1afee3a1e76fb280ef9fc157055dec329cd6c0e90013685465a99f0e2" exitCode=0
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.714202 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ntjsd" event={"ID":"9170fc18-624b-4358-b931-7e889eee7317","Type":"ContainerDied","Data":"4a80fef1afee3a1e76fb280ef9fc157055dec329cd6c0e90013685465a99f0e2"}
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.715820 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"8a139d9396f8ebf4c1f3f33f4cb766390365ad36207cfc48ac825bea57b5cff5"}
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.717552 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411115-gzjfg" event={"ID":"70c23dd9-b165-4f35-9230-da18a16f48be","Type":"ContainerDied","Data":"014e3df0363556fbd084f4379078397d1a82139dbb2d47f739c459be813cf33a"}
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.717589 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="014e3df0363556fbd084f4379078397d1a82139dbb2d47f739c459be813cf33a"
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.717620 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411115-gzjfg"
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.727880 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f37a1838-a130-4cb4-807f-2214cbfefdbf-utilities\") pod \"redhat-operators-4rfsl\" (UID: \"f37a1838-a130-4cb4-807f-2214cbfefdbf\") " pod="openshift-marketplace/redhat-operators-4rfsl"
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.727973 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f37a1838-a130-4cb4-807f-2214cbfefdbf-catalog-content\") pod \"redhat-operators-4rfsl\" (UID: \"f37a1838-a130-4cb4-807f-2214cbfefdbf\") " pod="openshift-marketplace/redhat-operators-4rfsl"
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.728078 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkkgc\" (UniqueName: \"kubernetes.io/projected/f37a1838-a130-4cb4-807f-2214cbfefdbf-kube-api-access-lkkgc\") pod \"redhat-operators-4rfsl\" (UID: \"f37a1838-a130-4cb4-807f-2214cbfefdbf\") " pod="openshift-marketplace/redhat-operators-4rfsl"
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.728182 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx"
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.728443 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f37a1838-a130-4cb4-807f-2214cbfefdbf-utilities\") pod \"redhat-operators-4rfsl\" (UID: \"f37a1838-a130-4cb4-807f-2214cbfefdbf\") " pod="openshift-marketplace/redhat-operators-4rfsl"
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.729141 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f37a1838-a130-4cb4-807f-2214cbfefdbf-catalog-content\") pod \"redhat-operators-4rfsl\" (UID: \"f37a1838-a130-4cb4-807f-2214cbfefdbf\") " pod="openshift-marketplace/redhat-operators-4rfsl"
Dec 02 09:23:07 crc kubenswrapper[4781]: E1202 09:23:07.729246 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:08.229234203 +0000 UTC m=+151.053108082 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.752544 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkkgc\" (UniqueName: \"kubernetes.io/projected/f37a1838-a130-4cb4-807f-2214cbfefdbf-kube-api-access-lkkgc\") pod \"redhat-operators-4rfsl\" (UID: \"f37a1838-a130-4cb4-807f-2214cbfefdbf\") " pod="openshift-marketplace/redhat-operators-4rfsl"
Dec 02 09:23:07 crc kubenswrapper[4781]: W1202 09:23:07.764245 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podaeb9d041_5dd8_4102_a1a9_ad98b865a9fa.slice/crio-294da590f04266781c9652b36f0a49a706f0f14547b9ecb3d4ff3e1214244120 WatchSource:0}: Error finding container 294da590f04266781c9652b36f0a49a706f0f14547b9ecb3d4ff3e1214244120: Status 404 returned error can't find the container with id 294da590f04266781c9652b36f0a49a706f0f14547b9ecb3d4ff3e1214244120
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.783831 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cfwrp" podStartSLOduration=131.783807222 podStartE2EDuration="2m11.783807222s" podCreationTimestamp="2025-12-02 09:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:23:07.777173217 +0000 UTC m=+150.601047096" watchObservedRunningTime="2025-12-02 09:23:07.783807222 +0000 UTC m=+150.607681101"
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.791825 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4rfsl"
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.829772 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 09:23:07 crc kubenswrapper[4781]: E1202 09:23:07.829898 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:08.329876793 +0000 UTC m=+151.153750672 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.830267 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx"
Dec 02 09:23:07 crc kubenswrapper[4781]: E1202 09:23:07.830592 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:08.330580823 +0000 UTC m=+151.154454702 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.924567 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9bkjh"]
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.932479 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 09:23:07 crc kubenswrapper[4781]: E1202 09:23:07.932663 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:08.432642482 +0000 UTC m=+151.256516361 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.932724 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx"
Dec 02 09:23:07 crc kubenswrapper[4781]: E1202 09:23:07.933079 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:08.433068083 +0000 UTC m=+151.256941962 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.934953 4781 patch_prober.go:28] interesting pod/router-default-5444994796-8ng22 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 02 09:23:07 crc kubenswrapper[4781]: [-]has-synced failed: reason withheld
Dec 02 09:23:07 crc kubenswrapper[4781]: [+]process-running ok
Dec 02 09:23:07 crc kubenswrapper[4781]: healthz check failed
Dec 02 09:23:07 crc kubenswrapper[4781]: I1202 09:23:07.935005 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8ng22" podUID="22ac0012-d7d1-4b53-a0e7-1ca1d09832b9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 02 09:23:08 crc kubenswrapper[4781]: I1202 09:23:08.033825 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 09:23:08 crc kubenswrapper[4781]: E1202 09:23:08.034117 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:08.534073793 +0000 UTC m=+151.357947672 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:08 crc kubenswrapper[4781]: I1202 09:23:08.034210 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx"
Dec 02 09:23:08 crc kubenswrapper[4781]: E1202 09:23:08.034582 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:08.534562346 +0000 UTC m=+151.358436245 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:08 crc kubenswrapper[4781]: W1202 09:23:08.047184 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bbd8e20_1afd_4e4a_a689_e77ae2042bfb.slice/crio-1f9d5157e08eb9b77ba0efda8e2238ec5fa609460eca9cb4d943ec1fcfebad58 WatchSource:0}: Error finding container 1f9d5157e08eb9b77ba0efda8e2238ec5fa609460eca9cb4d943ec1fcfebad58: Status 404 returned error can't find the container with id 1f9d5157e08eb9b77ba0efda8e2238ec5fa609460eca9cb4d943ec1fcfebad58
Dec 02 09:23:08 crc kubenswrapper[4781]: I1202 09:23:08.135764 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 09:23:08 crc kubenswrapper[4781]: E1202 09:23:08.136338 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:08.636322847 +0000 UTC m=+151.460196726 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:08 crc kubenswrapper[4781]: I1202 09:23:08.136426 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx"
Dec 02 09:23:08 crc kubenswrapper[4781]: E1202 09:23:08.136779 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:08.636767749 +0000 UTC m=+151.460641628 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:08 crc kubenswrapper[4781]: I1202 09:23:08.237852 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 09:23:08 crc kubenswrapper[4781]: E1202 09:23:08.238086 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:08.738060757 +0000 UTC m=+151.561934626 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:08 crc kubenswrapper[4781]: I1202 09:23:08.332178 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4rfsl"]
Dec 02 09:23:08 crc kubenswrapper[4781]: I1202 09:23:08.340211 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx"
Dec 02 09:23:08 crc kubenswrapper[4781]: E1202 09:23:08.340613 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:08.840593829 +0000 UTC m=+151.664467768 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:08 crc kubenswrapper[4781]: W1202 09:23:08.342212 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf37a1838_a130_4cb4_807f_2214cbfefdbf.slice/crio-8f7cb44349f736d9c10f68dfbea0a4e4d6cdb229a06a65775aa9b950ac5c75a6 WatchSource:0}: Error finding container 8f7cb44349f736d9c10f68dfbea0a4e4d6cdb229a06a65775aa9b950ac5c75a6: Status 404 returned error can't find the container with id 8f7cb44349f736d9c10f68dfbea0a4e4d6cdb229a06a65775aa9b950ac5c75a6
Dec 02 09:23:08 crc kubenswrapper[4781]: I1202 09:23:08.440955 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 09:23:08 crc kubenswrapper[4781]: E1202 09:23:08.441187 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:08.941153376 +0000 UTC m=+151.765027255 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:08 crc kubenswrapper[4781]: I1202 09:23:08.441260 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx"
Dec 02 09:23:08 crc kubenswrapper[4781]: E1202 09:23:08.441564 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:08.941551387 +0000 UTC m=+151.765425266 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:08 crc kubenswrapper[4781]: I1202 09:23:08.542864 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 09:23:08 crc kubenswrapper[4781]: E1202 09:23:08.543104 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:09.043078161 +0000 UTC m=+151.866952050 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:08 crc kubenswrapper[4781]: I1202 09:23:08.543411 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx"
Dec 02 09:23:08 crc kubenswrapper[4781]: E1202 09:23:08.543739 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:09.043725089 +0000 UTC m=+151.867598968 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:08 crc kubenswrapper[4781]: I1202 09:23:08.644594 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 09:23:08 crc kubenswrapper[4781]: E1202 09:23:08.644891 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:09.144862262 +0000 UTC m=+151.968736141 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:08 crc kubenswrapper[4781]: I1202 09:23:08.723774 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9bkjh" event={"ID":"2bbd8e20-1afd-4e4a-a689-e77ae2042bfb","Type":"ContainerStarted","Data":"1f9d5157e08eb9b77ba0efda8e2238ec5fa609460eca9cb4d943ec1fcfebad58"}
Dec 02 09:23:08 crc kubenswrapper[4781]: I1202 09:23:08.724814 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"aeb9d041-5dd8-4102-a1a9-ad98b865a9fa","Type":"ContainerStarted","Data":"294da590f04266781c9652b36f0a49a706f0f14547b9ecb3d4ff3e1214244120"}
Dec 02 09:23:08 crc kubenswrapper[4781]: I1202 09:23:08.725772 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4rfsl" event={"ID":"f37a1838-a130-4cb4-807f-2214cbfefdbf","Type":"ContainerStarted","Data":"8f7cb44349f736d9c10f68dfbea0a4e4d6cdb229a06a65775aa9b950ac5c75a6"}
Dec 02 09:23:08 crc kubenswrapper[4781]: I1202 09:23:08.727370 4781 generic.go:334] "Generic (PLEG): container finished" podID="c4734fd7-42d6-4b87-9160-5a3471f91d03" containerID="f6f9a042e9592cdf08cd9d8005e2ff9e5ebfad0b53ecab4fb805644810bde95b" exitCode=0
Dec 02 09:23:08 crc kubenswrapper[4781]: I1202 09:23:08.727425 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8pgk8" event={"ID":"c4734fd7-42d6-4b87-9160-5a3471f91d03","Type":"ContainerDied","Data":"f6f9a042e9592cdf08cd9d8005e2ff9e5ebfad0b53ecab4fb805644810bde95b"}
Dec 02 09:23:08 crc kubenswrapper[4781]: I1202 09:23:08.729466 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 02 09:23:08 crc kubenswrapper[4781]: I1202 09:23:08.729997 4781 generic.go:334] "Generic (PLEG): container finished" podID="5e5610cf-1c3f-4010-a4ee-4820372400c1" containerID="47f1a0b6ce04c530d53529205258ac8dcb8c1a913fe37964cd4a0728da949624" exitCode=0
Dec 02 09:23:08 crc kubenswrapper[4781]: I1202 09:23:08.730144 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8d48v" event={"ID":"5e5610cf-1c3f-4010-a4ee-4820372400c1","Type":"ContainerDied","Data":"47f1a0b6ce04c530d53529205258ac8dcb8c1a913fe37964cd4a0728da949624"}
Dec 02 09:23:08 crc kubenswrapper[4781]: I1202 09:23:08.748181 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx"
Dec 02 09:23:08 crc kubenswrapper[4781]: E1202 09:23:08.749214 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:09.249191004 +0000 UTC m=+152.073064883 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:08 crc kubenswrapper[4781]: I1202 09:23:08.817795 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Dec 02 09:23:08 crc kubenswrapper[4781]: I1202 09:23:08.818725 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 02 09:23:08 crc kubenswrapper[4781]: I1202 09:23:08.821191 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Dec 02 09:23:08 crc kubenswrapper[4781]: I1202 09:23:08.822896 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Dec 02 09:23:08 crc kubenswrapper[4781]: I1202 09:23:08.834862 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Dec 02 09:23:08 crc kubenswrapper[4781]: I1202 09:23:08.850386 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 09:23:08 crc kubenswrapper[4781]: E1202 09:23:08.850802 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:09.35078231 +0000 UTC m=+152.174656189 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:08 crc kubenswrapper[4781]: I1202 09:23:08.942506 4781 patch_prober.go:28] interesting pod/router-default-5444994796-8ng22 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 02 09:23:08 crc kubenswrapper[4781]: [-]has-synced failed: reason withheld
Dec 02 09:23:08 crc kubenswrapper[4781]: [+]process-running ok
Dec 02 09:23:08 crc kubenswrapper[4781]: healthz check failed
Dec 02 09:23:08 crc kubenswrapper[4781]: I1202 09:23:08.942568 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8ng22" podUID="22ac0012-d7d1-4b53-a0e7-1ca1d09832b9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 02 09:23:08 crc kubenswrapper[4781]: I1202 09:23:08.951750 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx"
Dec 02 09:23:08 crc kubenswrapper[4781]: I1202 09:23:08.951798 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7eaf28df-78d8-4254-b124-19bd51e11c54-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"7eaf28df-78d8-4254-b124-19bd51e11c54\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 02 09:23:08 crc kubenswrapper[4781]: I1202 09:23:08.951820 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7eaf28df-78d8-4254-b124-19bd51e11c54-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"7eaf28df-78d8-4254-b124-19bd51e11c54\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 02 09:23:08 crc kubenswrapper[4781]: E1202 09:23:08.952139 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:09.452128449 +0000 UTC m=+152.276002328 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:09 crc kubenswrapper[4781]: I1202 09:23:09.053031 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 09:23:09 crc kubenswrapper[4781]: E1202 09:23:09.053666 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:09.553431156 +0000 UTC m=+152.377305065 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:09 crc kubenswrapper[4781]: I1202 09:23:09.053735 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7eaf28df-78d8-4254-b124-19bd51e11c54-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"7eaf28df-78d8-4254-b124-19bd51e11c54\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 02 09:23:09 crc kubenswrapper[4781]: I1202 09:23:09.054115 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx"
Dec 02 09:23:09 crc kubenswrapper[4781]: I1202 09:23:09.054307 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7eaf28df-78d8-4254-b124-19bd51e11c54-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"7eaf28df-78d8-4254-b124-19bd51e11c54\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 02 09:23:09 crc kubenswrapper[4781]: I1202 09:23:09.054609 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7eaf28df-78d8-4254-b124-19bd51e11c54-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"7eaf28df-78d8-4254-b124-19bd51e11c54\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 02 09:23:09 crc kubenswrapper[4781]: E1202 09:23:09.054939 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:09.554906268 +0000 UTC m=+152.378780147 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:09 crc kubenswrapper[4781]: I1202 09:23:09.077172 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7eaf28df-78d8-4254-b124-19bd51e11c54-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"7eaf28df-78d8-4254-b124-19bd51e11c54\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 02 09:23:09 crc kubenswrapper[4781]: I1202 09:23:09.137577 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 02 09:23:09 crc kubenswrapper[4781]: I1202 09:23:09.155696 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 09:23:09 crc kubenswrapper[4781]: E1202 09:23:09.155810 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:09.655789274 +0000 UTC m=+152.479663153 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:09 crc kubenswrapper[4781]: I1202 09:23:09.156378 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx"
Dec 02 09:23:09 crc kubenswrapper[4781]: E1202 09:23:09.156682 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:09.656671979 +0000 UTC m=+152.480545858 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:09 crc kubenswrapper[4781]: I1202 09:23:09.257663 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 09:23:09 crc kubenswrapper[4781]: E1202 09:23:09.257861 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:09.757842472 +0000 UTC m=+152.581716361 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:09 crc kubenswrapper[4781]: I1202 09:23:09.273487 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-9k9bd"
Dec 02 09:23:09 crc kubenswrapper[4781]: I1202 09:23:09.273545 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-9k9bd"
Dec 02 09:23:09 crc kubenswrapper[4781]: I1202 09:23:09.274729 4781 patch_prober.go:28] interesting pod/apiserver-76f77b778f-9k9bd container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.33:8443/livez\": dial tcp 10.217.0.33:8443: connect: connection refused" start-of-body=
Dec 02 09:23:09 crc kubenswrapper[4781]: I1202 09:23:09.274777 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-9k9bd" podUID="04a6248c-9bb7-4204-a19a-1041d4d06f3e" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.33:8443/livez\": dial tcp 10.217.0.33:8443: connect: connection refused"
Dec 02 09:23:09 crc kubenswrapper[4781]: I1202 09:23:09.328992 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Dec 02 09:23:09 crc kubenswrapper[4781]: I1202 09:23:09.359730 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx"
Dec 02 09:23:09 crc kubenswrapper[4781]: E1202 09:23:09.360347 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:09.860333993 +0000 UTC m=+152.684207872 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:09 crc kubenswrapper[4781]: I1202 09:23:09.460823 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 09:23:09 crc kubenswrapper[4781]: E1202 09:23:09.460999 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:09.960974712 +0000 UTC m=+152.784848591 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:09 crc kubenswrapper[4781]: I1202 09:23:09.461059 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx"
Dec 02 09:23:09 crc kubenswrapper[4781]: E1202 09:23:09.461457 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:09.961444495 +0000 UTC m=+152.785318374 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:09 crc kubenswrapper[4781]: I1202 09:23:09.561972 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 09:23:09 crc kubenswrapper[4781]: E1202 09:23:09.562172 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:10.062144626 +0000 UTC m=+152.886018515 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:09 crc kubenswrapper[4781]: I1202 09:23:09.562619 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx"
Dec 02 09:23:09 crc kubenswrapper[4781]: E1202 09:23:09.563019 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:10.063007231 +0000 UTC m=+152.886881120 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:09 crc kubenswrapper[4781]: I1202 09:23:09.663600 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 09:23:09 crc kubenswrapper[4781]: E1202 09:23:09.663766 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:10.163737342 +0000 UTC m=+152.987611231 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 09:23:09 crc kubenswrapper[4781]: I1202 09:23:09.663846 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx"
Dec 02 09:23:09 crc kubenswrapper[4781]: E1202 09:23:09.664238 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:10.164224636 +0000 UTC m=+152.988098525 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:09 crc kubenswrapper[4781]: I1202 09:23:09.685973 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cfwrp" Dec 02 09:23:09 crc kubenswrapper[4781]: I1202 09:23:09.686639 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cfwrp" Dec 02 09:23:09 crc kubenswrapper[4781]: I1202 09:23:09.740325 4781 generic.go:334] "Generic (PLEG): container finished" podID="f37a1838-a130-4cb4-807f-2214cbfefdbf" containerID="1d8977f6fa5964fd9ed6bab3633602200974f6a082e5197d046c86e9771a0e9b" exitCode=0 Dec 02 09:23:09 crc kubenswrapper[4781]: I1202 09:23:09.740373 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4rfsl" event={"ID":"f37a1838-a130-4cb4-807f-2214cbfefdbf","Type":"ContainerDied","Data":"1d8977f6fa5964fd9ed6bab3633602200974f6a082e5197d046c86e9771a0e9b"} Dec 02 09:23:09 crc kubenswrapper[4781]: I1202 09:23:09.742824 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-6c7fp" event={"ID":"d20e2eeb-837d-4299-b1ce-2044781226f0","Type":"ContainerStarted","Data":"bf664ef0ddc39a89479ee1e038392afa2226d23189af6adac98ed8e23a47c1a3"} Dec 02 09:23:09 crc kubenswrapper[4781]: I1202 09:23:09.745756 4781 generic.go:334] "Generic (PLEG): container finished" podID="2bbd8e20-1afd-4e4a-a689-e77ae2042bfb" containerID="804feecb036d22d9ae44fac571d5a85343565569ffd73c8a73effa532bf707f6" exitCode=0 Dec 02 09:23:09 crc kubenswrapper[4781]: I1202 09:23:09.745841 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9bkjh" event={"ID":"2bbd8e20-1afd-4e4a-a689-e77ae2042bfb","Type":"ContainerDied","Data":"804feecb036d22d9ae44fac571d5a85343565569ffd73c8a73effa532bf707f6"} Dec 02 09:23:09 crc kubenswrapper[4781]: I1202 09:23:09.748909 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"7eaf28df-78d8-4254-b124-19bd51e11c54","Type":"ContainerStarted","Data":"2060eac3549c1777bab21f679dc1328e36ecb68adbf057fdba0aed0cc4e7be64"} Dec 02 09:23:09 crc kubenswrapper[4781]: I1202 09:23:09.752458 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"aeb9d041-5dd8-4102-a1a9-ad98b865a9fa","Type":"ContainerStarted","Data":"487dceb8853f0b7c535c911c44292424a5d2d471539d12543775347248232275"} Dec 02 09:23:09 crc kubenswrapper[4781]: I1202 09:23:09.769555 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:23:09 crc kubenswrapper[4781]: E1202 09:23:09.769698 4781 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:10.26968027 +0000 UTC m=+153.093554159 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:09 crc kubenswrapper[4781]: I1202 09:23:09.769846 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:23:09 crc kubenswrapper[4781]: E1202 09:23:09.772357 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 09:23:10.272343464 +0000 UTC m=+153.096217363 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s95hx" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 09:23:09 crc kubenswrapper[4781]: I1202 09:23:09.794699 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.79467857 podStartE2EDuration="3.79467857s" podCreationTimestamp="2025-12-02 09:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:23:09.77932908 +0000 UTC m=+152.603202959" watchObservedRunningTime="2025-12-02 09:23:09.79467857 +0000 UTC m=+152.618552449" Dec 02 09:23:09 crc kubenswrapper[4781]: I1202 09:23:09.871309 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 09:23:09 crc kubenswrapper[4781]: E1202 09:23:09.871505 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 09:23:10.371479281 +0000 UTC m=+153.195353160 (durationBeforeRetry 500ms). 
Dec 02 09:23:09 crc kubenswrapper[4781]: I1202 09:23:09.930478 4781 patch_prober.go:28] interesting pod/router-default-5444994796-8ng22 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 02 09:23:09 crc kubenswrapper[4781]: [-]has-synced failed: reason withheld
Dec 02 09:23:09 crc kubenswrapper[4781]: [+]process-running ok
Dec 02 09:23:09 crc kubenswrapper[4781]: healthz check failed
Dec 02 09:23:09 crc kubenswrapper[4781]: I1202 09:23:09.930537 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8ng22" podUID="22ac0012-d7d1-4b53-a0e7-1ca1d09832b9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 02 09:23:10 crc kubenswrapper[4781]: I1202 09:23:10.187344 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cfwrp"
Dec 02 09:23:10 crc kubenswrapper[4781]: I1202 09:23:10.249822 4781 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
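The 09:23:10.249822 plugin_watcher record is the turning point: the driver pod (csi-hostpathplugin-6c7fp, whose containers started a second earlier) has created its registration socket under /var/lib/kubelet/plugins_registry, and kubelet's plugin watcher picks it up. The watcher is, at its core, an inotify loop over that directory; a rough sketch using github.com/fsnotify/fsnotify (illustrative; the real watcher also walks subdirectories, handles deletions, and feeds a retrying reconciler):

```go
package main

import (
	"log"
	"path/filepath"
	"strings"

	"github.com/fsnotify/fsnotify"
)

func main() {
	w, err := fsnotify.NewWatcher()
	if err != nil {
		log.Fatal(err)
	}
	defer w.Close()

	// Drivers register by creating a *-reg.sock file in this directory.
	if err := w.Add("/var/lib/kubelet/plugins_registry"); err != nil {
		log.Fatal(err)
	}

	for {
		select {
		case ev := <-w.Events:
			// Record new sockets so a reconciler can dial them and run
			// the registration handshake (GetInfo over the socket).
			if ev.Op&fsnotify.Create != 0 && strings.HasSuffix(ev.Name, ".sock") {
				log.Printf("Adding socket path or updating timestamp to desired state cache path=%q",
					filepath.Clean(ev.Name))
			}
		case err := <-w.Errors:
			log.Println("watch error:", err)
		}
	}
}
```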
Dec 02 09:23:10 crc kubenswrapper[4781]: I1202 09:23:10.763722 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-6c7fp" event={"ID":"d20e2eeb-837d-4299-b1ce-2044781226f0","Type":"ContainerStarted","Data":"e5f438e44e4c81b08fc0c400e12142a1ff1194017c221b3bd066a7e9ef439b39"}
Dec 02 09:23:10 crc kubenswrapper[4781]: I1202 09:23:10.766905 4781 generic.go:334] "Generic (PLEG): container finished" podID="aeb9d041-5dd8-4102-a1a9-ad98b865a9fa" containerID="487dceb8853f0b7c535c911c44292424a5d2d471539d12543775347248232275" exitCode=0
Dec 02 09:23:10 crc kubenswrapper[4781]: I1202 09:23:10.766970 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"aeb9d041-5dd8-4102-a1a9-ad98b865a9fa","Type":"ContainerDied","Data":"487dceb8853f0b7c535c911c44292424a5d2d471539d12543775347248232275"}
Dec 02 09:23:10 crc kubenswrapper[4781]: I1202 09:23:10.773314 4781 generic.go:334] "Generic (PLEG): container finished" podID="7eaf28df-78d8-4254-b124-19bd51e11c54" containerID="6a04884befe0ca18647febc2a590754d4879ee3de8696b71e9c641d2ea243607" exitCode=0
Dec 02 09:23:10 crc kubenswrapper[4781]: I1202 09:23:10.773409 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"7eaf28df-78d8-4254-b124-19bd51e11c54","Type":"ContainerDied","Data":"6a04884befe0ca18647febc2a590754d4879ee3de8696b71e9c641d2ea243607"}
Dec 02 09:23:10 crc kubenswrapper[4781]: I1202 09:23:10.781256 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cfwrp"
Dec 02 09:23:10 crc kubenswrapper[4781]: I1202 09:23:10.880181 4781 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-02T09:23:10.249857691Z","Handler":null,"Name":""}
Dec 02 09:23:10 crc kubenswrapper[4781]: I1202 09:23:10.887888 4781 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Dec 02 09:23:10 crc kubenswrapper[4781]: I1202 09:23:10.887952 4781 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Dec 02 09:23:10 crc kubenswrapper[4781]: I1202 09:23:10.890513 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 09:23:10 crc kubenswrapper[4781]: I1202 09:23:10.899477 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Dec 02 09:23:10 crc kubenswrapper[4781]: I1202 09:23:10.929686 4781 patch_prober.go:28] interesting pod/router-default-5444994796-8ng22 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 02 09:23:10 crc kubenswrapper[4781]: [-]has-synced failed: reason withheld
Dec 02 09:23:10 crc kubenswrapper[4781]: [+]process-running ok
Dec 02 09:23:10 crc kubenswrapper[4781]: healthz check failed
Dec 02 09:23:10 crc kubenswrapper[4781]: I1202 09:23:10.929747 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8ng22" podUID="22ac0012-d7d1-4b53-a0e7-1ca1d09832b9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 02 09:23:10 crc kubenswrapper[4781]: I1202 09:23:10.991828 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx"
Dec 02 09:23:11 crc kubenswrapper[4781]: I1202 09:23:11.009287 4781 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Dec 02 09:23:11 crc kubenswrapper[4781]: I1202 09:23:11.009355 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-s95hx"
Dec 02 09:23:11 crc kubenswrapper[4781]: I1202 09:23:11.089449 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s95hx\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " pod="openshift-image-registry/image-registry-697d97f7c8-s95hx"
Dec 02 09:23:11 crc kubenswrapper[4781]: I1202 09:23:11.126465 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-s95hx"
Dec 02 09:23:11 crc kubenswrapper[4781]: I1202 09:23:11.327171 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-s95hx"]
Dec 02 09:23:11 crc kubenswrapper[4781]: W1202 09:23:11.347056 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda84b89e_515f_4595_badf_a13b1ce0342a.slice/crio-1804438e85599d0f35e265acaf4154791225608413d166e49e1e7fc042354ffb WatchSource:0}: Error finding container 1804438e85599d0f35e265acaf4154791225608413d166e49e1e7fc042354ffb: Status 404 returned error can't find the container with id 1804438e85599d0f35e265acaf4154791225608413d166e49e1e7fc042354ffb
Dec 02 09:23:11 crc kubenswrapper[4781]: I1202 09:23:11.523429 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes"
Dec 02 09:23:11 crc kubenswrapper[4781]: I1202 09:23:11.782616 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" event={"ID":"da84b89e-515f-4595-badf-a13b1ce0342a","Type":"ContainerStarted","Data":"d6f328b5a6c1718f5bab6bf30e2e97c2d89f117bd4bd787c424b55bb82f11bab"}
Dec 02 09:23:11 crc kubenswrapper[4781]: I1202 09:23:11.782667 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" event={"ID":"da84b89e-515f-4595-badf-a13b1ce0342a","Type":"ContainerStarted","Data":"1804438e85599d0f35e265acaf4154791225608413d166e49e1e7fc042354ffb"}
Dec 02 09:23:11 crc kubenswrapper[4781]: I1202 09:23:11.783662 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-s95hx"
Dec 02 09:23:11 crc kubenswrapper[4781]: I1202 09:23:11.791483 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-6c7fp" event={"ID":"d20e2eeb-837d-4299-b1ce-2044781226f0","Type":"ContainerStarted","Data":"59184adf3da57f4ea6b06a0b568ace2dadfea4398cfe4b5648c54d31a2806f53"}
Dec 02 09:23:11 crc kubenswrapper[4781]: I1202 09:23:11.810991 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" podStartSLOduration=135.810975239 podStartE2EDuration="2m15.810975239s" podCreationTimestamp="2025-12-02 09:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:23:11.808444429 +0000 UTC m=+154.632318318" watchObservedRunningTime="2025-12-02 09:23:11.810975239 +0000 UTC m=+154.634849118"
Dec 02 09:23:11 crc kubenswrapper[4781]: I1202 09:23:11.928863 4781 patch_prober.go:28] interesting pod/router-default-5444994796-8ng22 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 02 09:23:11 crc kubenswrapper[4781]: [-]has-synced failed: reason withheld
Dec 02 09:23:11 crc kubenswrapper[4781]: [+]process-running ok
Dec 02 09:23:11 crc kubenswrapper[4781]: healthz check failed
Dec 02 09:23:11 crc kubenswrapper[4781]: I1202 09:23:11.928944 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8ng22" podUID="22ac0012-d7d1-4b53-a0e7-1ca1d09832b9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
podUID="22ac0012-d7d1-4b53-a0e7-1ca1d09832b9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 09:23:12 crc kubenswrapper[4781]: I1202 09:23:12.248104 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 09:23:12 crc kubenswrapper[4781]: I1202 09:23:12.268491 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-6c7fp" podStartSLOduration=21.268471675 podStartE2EDuration="21.268471675s" podCreationTimestamp="2025-12-02 09:22:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:23:11.844142319 +0000 UTC m=+154.668016198" watchObservedRunningTime="2025-12-02 09:23:12.268471675 +0000 UTC m=+155.092345554" Dec 02 09:23:12 crc kubenswrapper[4781]: I1202 09:23:12.315500 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 09:23:12 crc kubenswrapper[4781]: I1202 09:23:12.318499 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/aeb9d041-5dd8-4102-a1a9-ad98b865a9fa-kubelet-dir\") pod \"aeb9d041-5dd8-4102-a1a9-ad98b865a9fa\" (UID: \"aeb9d041-5dd8-4102-a1a9-ad98b865a9fa\") " Dec 02 09:23:12 crc kubenswrapper[4781]: I1202 09:23:12.318554 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aeb9d041-5dd8-4102-a1a9-ad98b865a9fa-kube-api-access\") pod \"aeb9d041-5dd8-4102-a1a9-ad98b865a9fa\" (UID: \"aeb9d041-5dd8-4102-a1a9-ad98b865a9fa\") " Dec 02 09:23:12 crc kubenswrapper[4781]: I1202 09:23:12.318781 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aeb9d041-5dd8-4102-a1a9-ad98b865a9fa-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "aeb9d041-5dd8-4102-a1a9-ad98b865a9fa" (UID: "aeb9d041-5dd8-4102-a1a9-ad98b865a9fa"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 09:23:12 crc kubenswrapper[4781]: I1202 09:23:12.319294 4781 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/aeb9d041-5dd8-4102-a1a9-ad98b865a9fa-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 02 09:23:12 crc kubenswrapper[4781]: I1202 09:23:12.326909 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aeb9d041-5dd8-4102-a1a9-ad98b865a9fa-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "aeb9d041-5dd8-4102-a1a9-ad98b865a9fa" (UID: "aeb9d041-5dd8-4102-a1a9-ad98b865a9fa"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:23:12 crc kubenswrapper[4781]: I1202 09:23:12.420368 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7eaf28df-78d8-4254-b124-19bd51e11c54-kube-api-access\") pod \"7eaf28df-78d8-4254-b124-19bd51e11c54\" (UID: \"7eaf28df-78d8-4254-b124-19bd51e11c54\") " Dec 02 09:23:12 crc kubenswrapper[4781]: I1202 09:23:12.420473 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7eaf28df-78d8-4254-b124-19bd51e11c54-kubelet-dir\") pod \"7eaf28df-78d8-4254-b124-19bd51e11c54\" (UID: \"7eaf28df-78d8-4254-b124-19bd51e11c54\") " Dec 02 09:23:12 crc kubenswrapper[4781]: I1202 09:23:12.420779 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7eaf28df-78d8-4254-b124-19bd51e11c54-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7eaf28df-78d8-4254-b124-19bd51e11c54" (UID: "7eaf28df-78d8-4254-b124-19bd51e11c54"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 09:23:12 crc kubenswrapper[4781]: I1202 09:23:12.421026 4781 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7eaf28df-78d8-4254-b124-19bd51e11c54-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 02 09:23:12 crc kubenswrapper[4781]: I1202 09:23:12.421050 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aeb9d041-5dd8-4102-a1a9-ad98b865a9fa-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 09:23:12 crc kubenswrapper[4781]: I1202 09:23:12.423777 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7eaf28df-78d8-4254-b124-19bd51e11c54-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7eaf28df-78d8-4254-b124-19bd51e11c54" (UID: "7eaf28df-78d8-4254-b124-19bd51e11c54"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:23:12 crc kubenswrapper[4781]: I1202 09:23:12.522454 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7eaf28df-78d8-4254-b124-19bd51e11c54-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 09:23:12 crc kubenswrapper[4781]: I1202 09:23:12.818175 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 09:23:12 crc kubenswrapper[4781]: I1202 09:23:12.818202 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"7eaf28df-78d8-4254-b124-19bd51e11c54","Type":"ContainerDied","Data":"2060eac3549c1777bab21f679dc1328e36ecb68adbf057fdba0aed0cc4e7be64"} Dec 02 09:23:12 crc kubenswrapper[4781]: I1202 09:23:12.818264 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2060eac3549c1777bab21f679dc1328e36ecb68adbf057fdba0aed0cc4e7be64" Dec 02 09:23:12 crc kubenswrapper[4781]: I1202 09:23:12.823579 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 09:23:12 crc kubenswrapper[4781]: I1202 09:23:12.824140 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"aeb9d041-5dd8-4102-a1a9-ad98b865a9fa","Type":"ContainerDied","Data":"294da590f04266781c9652b36f0a49a706f0f14547b9ecb3d4ff3e1214244120"} Dec 02 09:23:12 crc kubenswrapper[4781]: I1202 09:23:12.824197 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="294da590f04266781c9652b36f0a49a706f0f14547b9ecb3d4ff3e1214244120" Dec 02 09:23:12 crc kubenswrapper[4781]: I1202 09:23:12.929133 4781 patch_prober.go:28] interesting pod/router-default-5444994796-8ng22 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 09:23:12 crc kubenswrapper[4781]: [-]has-synced failed: reason withheld Dec 02 09:23:12 crc kubenswrapper[4781]: [+]process-running ok Dec 02 09:23:12 crc kubenswrapper[4781]: healthz check failed Dec 02 09:23:12 crc kubenswrapper[4781]: I1202 09:23:12.929191 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8ng22" podUID="22ac0012-d7d1-4b53-a0e7-1ca1d09832b9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 09:23:13 crc kubenswrapper[4781]: I1202 09:23:13.303379 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-lhfvc" Dec 02 09:23:13 crc kubenswrapper[4781]: I1202 09:23:13.928687 4781 patch_prober.go:28] interesting pod/router-default-5444994796-8ng22 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 09:23:13 crc kubenswrapper[4781]: [-]has-synced failed: reason withheld Dec 02 09:23:13 crc kubenswrapper[4781]: [+]process-running ok Dec 02 09:23:13 crc kubenswrapper[4781]: healthz check failed Dec 02 09:23:13 crc kubenswrapper[4781]: I1202 09:23:13.928784 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8ng22" podUID="22ac0012-d7d1-4b53-a0e7-1ca1d09832b9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 09:23:14 crc kubenswrapper[4781]: I1202 09:23:14.056875 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-gpq68" Dec 02 09:23:14 crc kubenswrapper[4781]: I1202 09:23:14.197781 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" Dec 02 09:23:14 crc kubenswrapper[4781]: I1202 09:23:14.204462 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" Dec 02 09:23:14 crc kubenswrapper[4781]: I1202 09:23:14.219209 4781 patch_prober.go:28] interesting pod/console-f9d7485db-krhf5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.6:8443/health\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Dec 02 09:23:14 crc kubenswrapper[4781]: I1202 09:23:14.219369 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-krhf5" 
podUID="f87709bf-590d-4318-abdb-7ecb8a86e303" containerName="console" probeResult="failure" output="Get \"https://10.217.0.6:8443/health\": dial tcp 10.217.0.6:8443: connect: connection refused" Dec 02 09:23:14 crc kubenswrapper[4781]: I1202 09:23:14.277735 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-x4lq4" Dec 02 09:23:14 crc kubenswrapper[4781]: I1202 09:23:14.289430 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-9k9bd" Dec 02 09:23:14 crc kubenswrapper[4781]: I1202 09:23:14.302539 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-9k9bd" Dec 02 09:23:14 crc kubenswrapper[4781]: I1202 09:23:14.942379 4781 patch_prober.go:28] interesting pod/router-default-5444994796-8ng22 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 09:23:14 crc kubenswrapper[4781]: [-]has-synced failed: reason withheld Dec 02 09:23:14 crc kubenswrapper[4781]: [+]process-running ok Dec 02 09:23:14 crc kubenswrapper[4781]: healthz check failed Dec 02 09:23:14 crc kubenswrapper[4781]: I1202 09:23:14.942694 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8ng22" podUID="22ac0012-d7d1-4b53-a0e7-1ca1d09832b9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 09:23:15 crc kubenswrapper[4781]: I1202 09:23:15.928841 4781 patch_prober.go:28] interesting pod/router-default-5444994796-8ng22 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 09:23:15 crc kubenswrapper[4781]: [-]has-synced failed: reason withheld Dec 02 09:23:15 crc kubenswrapper[4781]: [+]process-running ok Dec 02 09:23:15 crc kubenswrapper[4781]: healthz check failed Dec 02 09:23:15 crc kubenswrapper[4781]: I1202 09:23:15.928899 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8ng22" podUID="22ac0012-d7d1-4b53-a0e7-1ca1d09832b9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 09:23:16 crc kubenswrapper[4781]: I1202 09:23:16.935990 4781 patch_prober.go:28] interesting pod/router-default-5444994796-8ng22 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 09:23:16 crc kubenswrapper[4781]: [-]has-synced failed: reason withheld Dec 02 09:23:16 crc kubenswrapper[4781]: [+]process-running ok Dec 02 09:23:16 crc kubenswrapper[4781]: healthz check failed Dec 02 09:23:16 crc kubenswrapper[4781]: I1202 09:23:16.936157 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8ng22" podUID="22ac0012-d7d1-4b53-a0e7-1ca1d09832b9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 09:23:17 crc kubenswrapper[4781]: I1202 09:23:17.929113 4781 patch_prober.go:28] interesting pod/router-default-5444994796-8ng22 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Dec 02 09:23:17 crc kubenswrapper[4781]: [-]has-synced failed: reason withheld Dec 02 09:23:17 crc kubenswrapper[4781]: [+]process-running ok Dec 02 09:23:17 crc kubenswrapper[4781]: healthz check failed Dec 02 09:23:17 crc kubenswrapper[4781]: I1202 09:23:17.929434 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8ng22" podUID="22ac0012-d7d1-4b53-a0e7-1ca1d09832b9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 09:23:18 crc kubenswrapper[4781]: I1202 09:23:18.415953 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bcdae8ff-3e82-4785-b958-a98717a14787-metrics-certs\") pod \"network-metrics-daemon-q792g\" (UID: \"bcdae8ff-3e82-4785-b958-a98717a14787\") " pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:23:18 crc kubenswrapper[4781]: I1202 09:23:18.422106 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bcdae8ff-3e82-4785-b958-a98717a14787-metrics-certs\") pod \"network-metrics-daemon-q792g\" (UID: \"bcdae8ff-3e82-4785-b958-a98717a14787\") " pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:23:18 crc kubenswrapper[4781]: I1202 09:23:18.530172 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q792g" Dec 02 09:23:18 crc kubenswrapper[4781]: I1202 09:23:18.785077 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-q792g"] Dec 02 09:23:18 crc kubenswrapper[4781]: W1202 09:23:18.791978 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcdae8ff_3e82_4785_b958_a98717a14787.slice/crio-8fc32fdaa1721c888b99b11d142d322a24f13195f2d864dc74003f3dfe92db0c WatchSource:0}: Error finding container 8fc32fdaa1721c888b99b11d142d322a24f13195f2d864dc74003f3dfe92db0c: Status 404 returned error can't find the container with id 8fc32fdaa1721c888b99b11d142d322a24f13195f2d864dc74003f3dfe92db0c Dec 02 09:23:18 crc kubenswrapper[4781]: I1202 09:23:18.882253 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-q792g" event={"ID":"bcdae8ff-3e82-4785-b958-a98717a14787","Type":"ContainerStarted","Data":"8fc32fdaa1721c888b99b11d142d322a24f13195f2d864dc74003f3dfe92db0c"} Dec 02 09:23:18 crc kubenswrapper[4781]: I1202 09:23:18.929687 4781 patch_prober.go:28] interesting pod/router-default-5444994796-8ng22 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 09:23:18 crc kubenswrapper[4781]: [-]has-synced failed: reason withheld Dec 02 09:23:18 crc kubenswrapper[4781]: [+]process-running ok Dec 02 09:23:18 crc kubenswrapper[4781]: healthz check failed Dec 02 09:23:18 crc kubenswrapper[4781]: I1202 09:23:18.929740 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8ng22" podUID="22ac0012-d7d1-4b53-a0e7-1ca1d09832b9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 09:23:19 crc kubenswrapper[4781]: I1202 09:23:19.928581 4781 patch_prober.go:28] interesting pod/router-default-5444994796-8ng22 container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 09:23:19 crc kubenswrapper[4781]: [-]has-synced failed: reason withheld Dec 02 09:23:19 crc kubenswrapper[4781]: [+]process-running ok Dec 02 09:23:19 crc kubenswrapper[4781]: healthz check failed Dec 02 09:23:19 crc kubenswrapper[4781]: I1202 09:23:19.928642 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8ng22" podUID="22ac0012-d7d1-4b53-a0e7-1ca1d09832b9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 09:23:20 crc kubenswrapper[4781]: I1202 09:23:20.929826 4781 patch_prober.go:28] interesting pod/router-default-5444994796-8ng22 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 09:23:20 crc kubenswrapper[4781]: [-]has-synced failed: reason withheld Dec 02 09:23:20 crc kubenswrapper[4781]: [+]process-running ok Dec 02 09:23:20 crc kubenswrapper[4781]: healthz check failed Dec 02 09:23:20 crc kubenswrapper[4781]: I1202 09:23:20.930224 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8ng22" podUID="22ac0012-d7d1-4b53-a0e7-1ca1d09832b9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 09:23:21 crc kubenswrapper[4781]: I1202 09:23:21.931059 4781 patch_prober.go:28] interesting pod/router-default-5444994796-8ng22 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 09:23:21 crc kubenswrapper[4781]: [-]has-synced failed: reason withheld Dec 02 09:23:21 crc kubenswrapper[4781]: [+]process-running ok Dec 02 09:23:21 crc kubenswrapper[4781]: healthz check failed Dec 02 09:23:21 crc kubenswrapper[4781]: I1202 09:23:21.931115 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8ng22" podUID="22ac0012-d7d1-4b53-a0e7-1ca1d09832b9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 09:23:22 crc kubenswrapper[4781]: I1202 09:23:22.929973 4781 patch_prober.go:28] interesting pod/router-default-5444994796-8ng22 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 09:23:22 crc kubenswrapper[4781]: [-]has-synced failed: reason withheld Dec 02 09:23:22 crc kubenswrapper[4781]: [+]process-running ok Dec 02 09:23:22 crc kubenswrapper[4781]: healthz check failed Dec 02 09:23:22 crc kubenswrapper[4781]: I1202 09:23:22.930019 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8ng22" podUID="22ac0012-d7d1-4b53-a0e7-1ca1d09832b9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 09:23:23 crc kubenswrapper[4781]: I1202 09:23:23.929643 4781 patch_prober.go:28] interesting pod/router-default-5444994796-8ng22 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 09:23:23 crc kubenswrapper[4781]: [-]has-synced failed: reason 
withheld Dec 02 09:23:23 crc kubenswrapper[4781]: [+]process-running ok Dec 02 09:23:23 crc kubenswrapper[4781]: healthz check failed Dec 02 09:23:23 crc kubenswrapper[4781]: I1202 09:23:23.929732 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8ng22" podUID="22ac0012-d7d1-4b53-a0e7-1ca1d09832b9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 09:23:24 crc kubenswrapper[4781]: I1202 09:23:24.218054 4781 patch_prober.go:28] interesting pod/console-f9d7485db-krhf5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.6:8443/health\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Dec 02 09:23:24 crc kubenswrapper[4781]: I1202 09:23:24.218136 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-krhf5" podUID="f87709bf-590d-4318-abdb-7ecb8a86e303" containerName="console" probeResult="failure" output="Get \"https://10.217.0.6:8443/health\": dial tcp 10.217.0.6:8443: connect: connection refused" Dec 02 09:23:24 crc kubenswrapper[4781]: I1202 09:23:24.918584 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-q792g" event={"ID":"bcdae8ff-3e82-4785-b958-a98717a14787","Type":"ContainerStarted","Data":"797d5f3b077c557273344bd69cb92e71bb29463f381c175bb29de8aaf35ae48b"} Dec 02 09:23:24 crc kubenswrapper[4781]: I1202 09:23:24.930259 4781 patch_prober.go:28] interesting pod/router-default-5444994796-8ng22 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 09:23:24 crc kubenswrapper[4781]: [+]has-synced ok Dec 02 09:23:24 crc kubenswrapper[4781]: [+]process-running ok Dec 02 09:23:24 crc kubenswrapper[4781]: healthz check failed Dec 02 09:23:24 crc kubenswrapper[4781]: I1202 09:23:24.930362 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8ng22" podUID="22ac0012-d7d1-4b53-a0e7-1ca1d09832b9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 09:23:25 crc kubenswrapper[4781]: I1202 09:23:25.929498 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-8ng22" Dec 02 09:23:25 crc kubenswrapper[4781]: I1202 09:23:25.933046 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-8ng22" Dec 02 09:23:30 crc kubenswrapper[4781]: I1202 09:23:30.411710 4781 patch_prober.go:28] interesting pod/machine-config-daemon-pzntm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 09:23:30 crc kubenswrapper[4781]: I1202 09:23:30.412017 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 09:23:31 crc kubenswrapper[4781]: I1202 09:23:31.133391 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:23:34 crc kubenswrapper[4781]: I1202 09:23:34.221739 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-krhf5" Dec 02 09:23:34 crc kubenswrapper[4781]: I1202 09:23:34.225373 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-krhf5" Dec 02 09:23:35 crc kubenswrapper[4781]: I1202 09:23:35.198826 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lbdsz" Dec 02 09:23:41 crc kubenswrapper[4781]: I1202 09:23:41.808417 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 02 09:23:41 crc kubenswrapper[4781]: E1202 09:23:41.809947 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeb9d041-5dd8-4102-a1a9-ad98b865a9fa" containerName="pruner" Dec 02 09:23:41 crc kubenswrapper[4781]: I1202 09:23:41.809964 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeb9d041-5dd8-4102-a1a9-ad98b865a9fa" containerName="pruner" Dec 02 09:23:41 crc kubenswrapper[4781]: E1202 09:23:41.809980 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eaf28df-78d8-4254-b124-19bd51e11c54" containerName="pruner" Dec 02 09:23:41 crc kubenswrapper[4781]: I1202 09:23:41.809988 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eaf28df-78d8-4254-b124-19bd51e11c54" containerName="pruner" Dec 02 09:23:41 crc kubenswrapper[4781]: I1202 09:23:41.810130 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="aeb9d041-5dd8-4102-a1a9-ad98b865a9fa" containerName="pruner" Dec 02 09:23:41 crc kubenswrapper[4781]: I1202 09:23:41.810150 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="7eaf28df-78d8-4254-b124-19bd51e11c54" containerName="pruner" Dec 02 09:23:41 crc kubenswrapper[4781]: I1202 09:23:41.810557 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 09:23:41 crc kubenswrapper[4781]: I1202 09:23:41.816335 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 02 09:23:41 crc kubenswrapper[4781]: I1202 09:23:41.816553 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 02 09:23:41 crc kubenswrapper[4781]: I1202 09:23:41.816950 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 02 09:23:41 crc kubenswrapper[4781]: I1202 09:23:41.951587 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ecddc0fd-4c27-4b71-9f6e-86c294013601-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ecddc0fd-4c27-4b71-9f6e-86c294013601\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 09:23:41 crc kubenswrapper[4781]: I1202 09:23:41.951871 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ecddc0fd-4c27-4b71-9f6e-86c294013601-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ecddc0fd-4c27-4b71-9f6e-86c294013601\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 09:23:42 crc kubenswrapper[4781]: I1202 09:23:42.053140 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ecddc0fd-4c27-4b71-9f6e-86c294013601-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ecddc0fd-4c27-4b71-9f6e-86c294013601\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 09:23:42 crc kubenswrapper[4781]: I1202 09:23:42.053223 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ecddc0fd-4c27-4b71-9f6e-86c294013601-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ecddc0fd-4c27-4b71-9f6e-86c294013601\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 09:23:42 crc kubenswrapper[4781]: I1202 09:23:42.053301 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ecddc0fd-4c27-4b71-9f6e-86c294013601-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ecddc0fd-4c27-4b71-9f6e-86c294013601\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 09:23:42 crc kubenswrapper[4781]: I1202 09:23:42.075792 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ecddc0fd-4c27-4b71-9f6e-86c294013601-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ecddc0fd-4c27-4b71-9f6e-86c294013601\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 09:23:42 crc kubenswrapper[4781]: I1202 09:23:42.182640 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 09:23:43 crc kubenswrapper[4781]: I1202 09:23:43.706422 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 09:23:47 crc kubenswrapper[4781]: I1202 09:23:47.401756 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 02 09:23:47 crc kubenswrapper[4781]: I1202 09:23:47.403225 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 02 09:23:47 crc kubenswrapper[4781]: I1202 09:23:47.413668 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 02 09:23:47 crc kubenswrapper[4781]: I1202 09:23:47.523594 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5df92e64-6fc4-4395-a919-b1c46c30d318-kube-api-access\") pod \"installer-9-crc\" (UID: \"5df92e64-6fc4-4395-a919-b1c46c30d318\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 09:23:47 crc kubenswrapper[4781]: I1202 09:23:47.523659 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5df92e64-6fc4-4395-a919-b1c46c30d318-var-lock\") pod \"installer-9-crc\" (UID: \"5df92e64-6fc4-4395-a919-b1c46c30d318\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 09:23:47 crc kubenswrapper[4781]: I1202 09:23:47.523699 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5df92e64-6fc4-4395-a919-b1c46c30d318-kubelet-dir\") pod \"installer-9-crc\" (UID: \"5df92e64-6fc4-4395-a919-b1c46c30d318\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 09:23:47 crc kubenswrapper[4781]: I1202 09:23:47.625225 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5df92e64-6fc4-4395-a919-b1c46c30d318-kubelet-dir\") pod \"installer-9-crc\" (UID: \"5df92e64-6fc4-4395-a919-b1c46c30d318\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 09:23:47 crc kubenswrapper[4781]: I1202 09:23:47.625313 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5df92e64-6fc4-4395-a919-b1c46c30d318-kube-api-access\") pod \"installer-9-crc\" (UID: \"5df92e64-6fc4-4395-a919-b1c46c30d318\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 09:23:47 crc kubenswrapper[4781]: I1202 09:23:47.625343 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5df92e64-6fc4-4395-a919-b1c46c30d318-var-lock\") pod \"installer-9-crc\" (UID: \"5df92e64-6fc4-4395-a919-b1c46c30d318\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 09:23:47 crc kubenswrapper[4781]: I1202 09:23:47.625343 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5df92e64-6fc4-4395-a919-b1c46c30d318-kubelet-dir\") pod \"installer-9-crc\" (UID: \"5df92e64-6fc4-4395-a919-b1c46c30d318\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 09:23:47 crc kubenswrapper[4781]: I1202 09:23:47.625401 4781 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5df92e64-6fc4-4395-a919-b1c46c30d318-var-lock\") pod \"installer-9-crc\" (UID: \"5df92e64-6fc4-4395-a919-b1c46c30d318\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 09:23:47 crc kubenswrapper[4781]: I1202 09:23:47.642601 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5df92e64-6fc4-4395-a919-b1c46c30d318-kube-api-access\") pod \"installer-9-crc\" (UID: \"5df92e64-6fc4-4395-a919-b1c46c30d318\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 09:23:47 crc kubenswrapper[4781]: I1202 09:23:47.768474 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 02 09:24:00 crc kubenswrapper[4781]: I1202 09:24:00.412462 4781 patch_prober.go:28] interesting pod/machine-config-daemon-pzntm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 09:24:00 crc kubenswrapper[4781]: I1202 09:24:00.412854 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 09:24:00 crc kubenswrapper[4781]: I1202 09:24:00.412968 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" Dec 02 09:24:00 crc kubenswrapper[4781]: I1202 09:24:00.413860 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5c19e02f947d0d75971d14d988b7c3119365b6ba5348290ee0e23e115e14780f"} pod="openshift-machine-config-operator/machine-config-daemon-pzntm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 09:24:00 crc kubenswrapper[4781]: I1202 09:24:00.414071 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" containerID="cri-o://5c19e02f947d0d75971d14d988b7c3119365b6ba5348290ee0e23e115e14780f" gracePeriod=600 Dec 02 09:24:07 crc kubenswrapper[4781]: I1202 09:24:07.160386 4781 generic.go:334] "Generic (PLEG): container finished" podID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerID="5c19e02f947d0d75971d14d988b7c3119365b6ba5348290ee0e23e115e14780f" exitCode=0 Dec 02 09:24:07 crc kubenswrapper[4781]: I1202 09:24:07.160470 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" event={"ID":"e10258da-dad3-4df8-82c2-9d9438493a3d","Type":"ContainerDied","Data":"5c19e02f947d0d75971d14d988b7c3119365b6ba5348290ee0e23e115e14780f"} Dec 02 09:24:31 crc kubenswrapper[4781]: E1202 09:24:31.483781 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 02 09:24:31 crc kubenswrapper[4781]: E1202 09:24:31.484452 
4781 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nz7qg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-ntjsd_openshift-marketplace(9170fc18-624b-4358-b931-7e889eee7317): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 09:24:31 crc kubenswrapper[4781]: E1202 09:24:31.485983 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-ntjsd" podUID="9170fc18-624b-4358-b931-7e889eee7317" Dec 02 09:24:31 crc kubenswrapper[4781]: E1202 09:24:31.571774 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 02 09:24:31 crc kubenswrapper[4781]: E1202 09:24:31.571989 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5pqjd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-5f85z_openshift-marketplace(69ab05d8-7b1c-4314-aa8c-57d38ae2e885): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 09:24:31 crc kubenswrapper[4781]: E1202 09:24:31.573199 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-5f85z" podUID="69ab05d8-7b1c-4314-aa8c-57d38ae2e885" Dec 02 09:24:32 crc kubenswrapper[4781]: E1202 09:24:32.393413 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-5f85z" podUID="69ab05d8-7b1c-4314-aa8c-57d38ae2e885" Dec 02 09:24:32 crc kubenswrapper[4781]: E1202 09:24:32.393585 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-ntjsd" podUID="9170fc18-624b-4358-b931-7e889eee7317" Dec 02 09:24:32 crc kubenswrapper[4781]: E1202 09:24:32.450083 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 02 09:24:32 crc kubenswrapper[4781]: E1202 09:24:32.450261 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7lmdl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-8d48v_openshift-marketplace(5e5610cf-1c3f-4010-a4ee-4820372400c1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 09:24:32 crc kubenswrapper[4781]: E1202 09:24:32.451440 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-8d48v" podUID="5e5610cf-1c3f-4010-a4ee-4820372400c1" Dec 02 09:24:32 crc kubenswrapper[4781]: E1202 09:24:32.487936 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 02 09:24:32 crc kubenswrapper[4781]: E1202 09:24:32.488109 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cpxf4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-8pgk8_openshift-marketplace(c4734fd7-42d6-4b87-9160-5a3471f91d03): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 09:24:32 crc kubenswrapper[4781]: E1202 09:24:32.489751 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-8pgk8" podUID="c4734fd7-42d6-4b87-9160-5a3471f91d03" Dec 02 09:24:35 crc kubenswrapper[4781]: E1202 09:24:35.137699 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-8pgk8" podUID="c4734fd7-42d6-4b87-9160-5a3471f91d03" Dec 02 09:24:35 crc kubenswrapper[4781]: E1202 09:24:35.137750 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-8d48v" podUID="5e5610cf-1c3f-4010-a4ee-4820372400c1" Dec 02 09:24:35 crc kubenswrapper[4781]: E1202 09:24:35.215394 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 02 09:24:35 crc kubenswrapper[4781]: E1202 09:24:35.215661 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n7wbd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-9bkjh_openshift-marketplace(2bbd8e20-1afd-4e4a-a689-e77ae2042bfb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 09:24:35 crc kubenswrapper[4781]: E1202 09:24:35.217067 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-9bkjh" podUID="2bbd8e20-1afd-4e4a-a689-e77ae2042bfb" Dec 02 09:24:35 crc kubenswrapper[4781]: E1202 09:24:35.240475 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 02 09:24:35 crc kubenswrapper[4781]: E1202 09:24:35.240649 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lkkgc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-4rfsl_openshift-marketplace(f37a1838-a130-4cb4-807f-2214cbfefdbf): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 09:24:35 crc kubenswrapper[4781]: E1202 09:24:35.242123 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-4rfsl" podUID="f37a1838-a130-4cb4-807f-2214cbfefdbf" Dec 02 09:24:36 crc kubenswrapper[4781]: E1202 09:24:36.444786 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-9bkjh" podUID="2bbd8e20-1afd-4e4a-a689-e77ae2042bfb" Dec 02 09:24:36 crc kubenswrapper[4781]: E1202 09:24:36.444855 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-4rfsl" podUID="f37a1838-a130-4cb4-807f-2214cbfefdbf" Dec 02 09:24:36 crc kubenswrapper[4781]: E1202 09:24:36.509966 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 02 09:24:36 crc kubenswrapper[4781]: E1202 09:24:36.510415 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-87j5w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-844r4_openshift-marketplace(b28a19d5-e49e-46f0-942d-dc9f96777c2d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 09:24:36 crc kubenswrapper[4781]: E1202 09:24:36.511819 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-844r4" podUID="b28a19d5-e49e-46f0-942d-dc9f96777c2d" Dec 02 09:24:36 crc kubenswrapper[4781]: E1202 09:24:36.537140 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 02 09:24:36 crc kubenswrapper[4781]: E1202 09:24:36.537289 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-45gcq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-t59n7_openshift-marketplace(0a0d43df-4480-4b62-bd3d-d129fbdcd722): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 09:24:36 crc kubenswrapper[4781]: E1202 09:24:36.538474 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-t59n7" podUID="0a0d43df-4480-4b62-bd3d-d129fbdcd722" Dec 02 09:24:36 crc kubenswrapper[4781]: I1202 09:24:36.880787 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 02 09:24:36 crc kubenswrapper[4781]: I1202 09:24:36.919353 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 02 09:24:36 crc kubenswrapper[4781]: W1202 09:24:36.934506 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5df92e64_6fc4_4395_a919_b1c46c30d318.slice/crio-3853e7d7ea4f932f4c9b110c3f667818bacfafb4fe6738a0a513c17759aae332 WatchSource:0}: Error finding container 3853e7d7ea4f932f4c9b110c3f667818bacfafb4fe6738a0a513c17759aae332: Status 404 returned error can't find the container with id 3853e7d7ea4f932f4c9b110c3f667818bacfafb4fe6738a0a513c17759aae332 Dec 02 09:24:37 crc kubenswrapper[4781]: I1202 09:24:37.334658 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"ecddc0fd-4c27-4b71-9f6e-86c294013601","Type":"ContainerStarted","Data":"0a6c424f66637a7411cf28742070d356995095682a1d87726e5a44c32f841e01"} Dec 02 09:24:37 crc kubenswrapper[4781]: I1202 09:24:37.336234 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-q792g" event={"ID":"bcdae8ff-3e82-4785-b958-a98717a14787","Type":"ContainerStarted","Data":"b9763312f256074baea69c283a090a87550a7377e8f3a4d0ffedf0572b42c9d5"} Dec 02 09:24:37 crc kubenswrapper[4781]: I1202 09:24:37.338134 4781 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" event={"ID":"e10258da-dad3-4df8-82c2-9d9438493a3d","Type":"ContainerStarted","Data":"ebcdebd6c3481dcb7b7c492991920a861406e047665bbf99c0df4f3ff4534e5b"} Dec 02 09:24:37 crc kubenswrapper[4781]: I1202 09:24:37.339930 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5df92e64-6fc4-4395-a919-b1c46c30d318","Type":"ContainerStarted","Data":"3853e7d7ea4f932f4c9b110c3f667818bacfafb4fe6738a0a513c17759aae332"} Dec 02 09:24:37 crc kubenswrapper[4781]: E1202 09:24:37.341559 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-844r4" podUID="b28a19d5-e49e-46f0-942d-dc9f96777c2d" Dec 02 09:24:37 crc kubenswrapper[4781]: E1202 09:24:37.341706 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-t59n7" podUID="0a0d43df-4480-4b62-bd3d-d129fbdcd722" Dec 02 09:24:37 crc kubenswrapper[4781]: I1202 09:24:37.353954 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-q792g" podStartSLOduration=221.35393368 podStartE2EDuration="3m41.35393368s" podCreationTimestamp="2025-12-02 09:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:24:37.352203642 +0000 UTC m=+240.176077531" watchObservedRunningTime="2025-12-02 09:24:37.35393368 +0000 UTC m=+240.177807559" Dec 02 09:24:38 crc kubenswrapper[4781]: I1202 09:24:38.347883 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5df92e64-6fc4-4395-a919-b1c46c30d318","Type":"ContainerStarted","Data":"b379c0d2eac368298eab4be8f662fdc974c6249f084832b7e5d41418b80e58ca"} Dec 02 09:24:38 crc kubenswrapper[4781]: I1202 09:24:38.351436 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"ecddc0fd-4c27-4b71-9f6e-86c294013601","Type":"ContainerStarted","Data":"c1c82505d0c3c15bc1e48c2701c4975fec6a03d9f350393a99f2680e94f2ebd7"} Dec 02 09:24:38 crc kubenswrapper[4781]: I1202 09:24:38.370618 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=51.370594498 podStartE2EDuration="51.370594498s" podCreationTimestamp="2025-12-02 09:23:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:24:38.365281612 +0000 UTC m=+241.189155491" watchObservedRunningTime="2025-12-02 09:24:38.370594498 +0000 UTC m=+241.194468387" Dec 02 09:24:38 crc kubenswrapper[4781]: I1202 09:24:38.386470 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=57.386445868 podStartE2EDuration="57.386445868s" podCreationTimestamp="2025-12-02 09:23:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-02 09:24:38.383582519 +0000 UTC m=+241.207456398" watchObservedRunningTime="2025-12-02 09:24:38.386445868 +0000 UTC m=+241.210319777" Dec 02 09:24:39 crc kubenswrapper[4781]: I1202 09:24:39.357626 4781 generic.go:334] "Generic (PLEG): container finished" podID="ecddc0fd-4c27-4b71-9f6e-86c294013601" containerID="c1c82505d0c3c15bc1e48c2701c4975fec6a03d9f350393a99f2680e94f2ebd7" exitCode=0 Dec 02 09:24:39 crc kubenswrapper[4781]: I1202 09:24:39.357705 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"ecddc0fd-4c27-4b71-9f6e-86c294013601","Type":"ContainerDied","Data":"c1c82505d0c3c15bc1e48c2701c4975fec6a03d9f350393a99f2680e94f2ebd7"} Dec 02 09:24:40 crc kubenswrapper[4781]: I1202 09:24:40.585207 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xqr9t"] Dec 02 09:24:40 crc kubenswrapper[4781]: I1202 09:24:40.627466 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 09:24:40 crc kubenswrapper[4781]: I1202 09:24:40.772086 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ecddc0fd-4c27-4b71-9f6e-86c294013601-kubelet-dir\") pod \"ecddc0fd-4c27-4b71-9f6e-86c294013601\" (UID: \"ecddc0fd-4c27-4b71-9f6e-86c294013601\") " Dec 02 09:24:40 crc kubenswrapper[4781]: I1202 09:24:40.772216 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ecddc0fd-4c27-4b71-9f6e-86c294013601-kube-api-access\") pod \"ecddc0fd-4c27-4b71-9f6e-86c294013601\" (UID: \"ecddc0fd-4c27-4b71-9f6e-86c294013601\") " Dec 02 09:24:40 crc kubenswrapper[4781]: I1202 09:24:40.772244 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ecddc0fd-4c27-4b71-9f6e-86c294013601-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ecddc0fd-4c27-4b71-9f6e-86c294013601" (UID: "ecddc0fd-4c27-4b71-9f6e-86c294013601"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 09:24:40 crc kubenswrapper[4781]: I1202 09:24:40.772455 4781 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ecddc0fd-4c27-4b71-9f6e-86c294013601-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 02 09:24:40 crc kubenswrapper[4781]: I1202 09:24:40.778129 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecddc0fd-4c27-4b71-9f6e-86c294013601-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ecddc0fd-4c27-4b71-9f6e-86c294013601" (UID: "ecddc0fd-4c27-4b71-9f6e-86c294013601"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:24:40 crc kubenswrapper[4781]: I1202 09:24:40.874109 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ecddc0fd-4c27-4b71-9f6e-86c294013601-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 09:24:41 crc kubenswrapper[4781]: I1202 09:24:41.368785 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"ecddc0fd-4c27-4b71-9f6e-86c294013601","Type":"ContainerDied","Data":"0a6c424f66637a7411cf28742070d356995095682a1d87726e5a44c32f841e01"} Dec 02 09:24:41 crc kubenswrapper[4781]: I1202 09:24:41.369102 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a6c424f66637a7411cf28742070d356995095682a1d87726e5a44c32f841e01" Dec 02 09:24:41 crc kubenswrapper[4781]: I1202 09:24:41.368900 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 09:24:45 crc kubenswrapper[4781]: I1202 09:24:45.401960 4781 generic.go:334] "Generic (PLEG): container finished" podID="9170fc18-624b-4358-b931-7e889eee7317" containerID="52456d0680b8c19b41afd79907b2e6286093b7509fa7f000ee9a6c3cd09b8480" exitCode=0 Dec 02 09:24:45 crc kubenswrapper[4781]: I1202 09:24:45.402221 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ntjsd" event={"ID":"9170fc18-624b-4358-b931-7e889eee7317","Type":"ContainerDied","Data":"52456d0680b8c19b41afd79907b2e6286093b7509fa7f000ee9a6c3cd09b8480"} Dec 02 09:24:46 crc kubenswrapper[4781]: I1202 09:24:46.410460 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ntjsd" event={"ID":"9170fc18-624b-4358-b931-7e889eee7317","Type":"ContainerStarted","Data":"76e2b55fa1a2f93fe2fd7baff904c209ec95260b27aadfa4157e969ff570bd0c"} Dec 02 09:24:46 crc kubenswrapper[4781]: I1202 09:24:46.426815 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ntjsd" podStartSLOduration=6.054595184 podStartE2EDuration="1m43.426797703s" podCreationTimestamp="2025-12-02 09:23:03 +0000 UTC" firstStartedPulling="2025-12-02 09:23:08.733177095 +0000 UTC m=+151.557050974" lastFinishedPulling="2025-12-02 09:24:46.105379614 +0000 UTC m=+248.929253493" observedRunningTime="2025-12-02 09:24:46.426431313 +0000 UTC m=+249.250305212" watchObservedRunningTime="2025-12-02 09:24:46.426797703 +0000 UTC m=+249.250671582" Dec 02 09:24:48 crc kubenswrapper[4781]: I1202 09:24:48.419995 4781 generic.go:334] "Generic (PLEG): container finished" podID="69ab05d8-7b1c-4314-aa8c-57d38ae2e885" containerID="d68c74ec5c5cc62a3386eeed61ef9913209f4644faffd74ac9432ecf2f1c3f9f" exitCode=0 Dec 02 09:24:48 crc kubenswrapper[4781]: I1202 09:24:48.420388 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5f85z" event={"ID":"69ab05d8-7b1c-4314-aa8c-57d38ae2e885","Type":"ContainerDied","Data":"d68c74ec5c5cc62a3386eeed61ef9913209f4644faffd74ac9432ecf2f1c3f9f"} Dec 02 09:24:49 crc kubenswrapper[4781]: I1202 09:24:49.457176 4781 generic.go:334] "Generic (PLEG): container finished" podID="2bbd8e20-1afd-4e4a-a689-e77ae2042bfb" containerID="33003d9ac151d48b316de8bbaaf0bafeca1564b76155bb50d3cf06adb72ccdcf" exitCode=0 Dec 02 09:24:49 crc kubenswrapper[4781]: I1202 09:24:49.457531 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-9bkjh" event={"ID":"2bbd8e20-1afd-4e4a-a689-e77ae2042bfb","Type":"ContainerDied","Data":"33003d9ac151d48b316de8bbaaf0bafeca1564b76155bb50d3cf06adb72ccdcf"} Dec 02 09:24:50 crc kubenswrapper[4781]: I1202 09:24:50.464760 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5f85z" event={"ID":"69ab05d8-7b1c-4314-aa8c-57d38ae2e885","Type":"ContainerStarted","Data":"d12ce6cb0aecc5052a9e0ef983084f6cc2a65b98495c01890eacded4a8a1aed0"} Dec 02 09:24:50 crc kubenswrapper[4781]: I1202 09:24:50.466449 4781 generic.go:334] "Generic (PLEG): container finished" podID="5e5610cf-1c3f-4010-a4ee-4820372400c1" containerID="b2b5b902915f6abcc30db04918b15e3bd8b36b9bd586103997e094fee178c945" exitCode=0 Dec 02 09:24:50 crc kubenswrapper[4781]: I1202 09:24:50.466527 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8d48v" event={"ID":"5e5610cf-1c3f-4010-a4ee-4820372400c1","Type":"ContainerDied","Data":"b2b5b902915f6abcc30db04918b15e3bd8b36b9bd586103997e094fee178c945"} Dec 02 09:24:50 crc kubenswrapper[4781]: I1202 09:24:50.491894 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5f85z" podStartSLOduration=5.998409946 podStartE2EDuration="1m46.491875774s" podCreationTimestamp="2025-12-02 09:23:04 +0000 UTC" firstStartedPulling="2025-12-02 09:23:08.732743483 +0000 UTC m=+151.556617362" lastFinishedPulling="2025-12-02 09:24:49.226209311 +0000 UTC m=+252.050083190" observedRunningTime="2025-12-02 09:24:50.487981976 +0000 UTC m=+253.311855865" watchObservedRunningTime="2025-12-02 09:24:50.491875774 +0000 UTC m=+253.315749673" Dec 02 09:24:54 crc kubenswrapper[4781]: I1202 09:24:54.215870 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ntjsd" Dec 02 09:24:54 crc kubenswrapper[4781]: I1202 09:24:54.216400 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ntjsd" Dec 02 09:24:54 crc kubenswrapper[4781]: I1202 09:24:54.345728 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ntjsd" Dec 02 09:24:54 crc kubenswrapper[4781]: I1202 09:24:54.486621 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8d48v" event={"ID":"5e5610cf-1c3f-4010-a4ee-4820372400c1","Type":"ContainerStarted","Data":"84b5a37e9e31e34b269f94a9df746ac614336bebfa021099b5a461088c7a89e8"} Dec 02 09:24:54 crc kubenswrapper[4781]: I1202 09:24:54.526603 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ntjsd" Dec 02 09:24:54 crc kubenswrapper[4781]: I1202 09:24:54.724072 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5f85z" Dec 02 09:24:54 crc kubenswrapper[4781]: I1202 09:24:54.724151 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5f85z" Dec 02 09:24:54 crc kubenswrapper[4781]: I1202 09:24:54.772740 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5f85z" Dec 02 09:24:55 crc kubenswrapper[4781]: I1202 09:24:55.498018 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-9bkjh" event={"ID":"2bbd8e20-1afd-4e4a-a689-e77ae2042bfb","Type":"ContainerStarted","Data":"03cc7cb7bff43f5e8d2a752a3be22d227008d5b26e1f58a433b6922572fcfe30"} Dec 02 09:24:55 crc kubenswrapper[4781]: I1202 09:24:55.527983 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8d48v" podStartSLOduration=6.79853961 podStartE2EDuration="1m49.527962019s" podCreationTimestamp="2025-12-02 09:23:06 +0000 UTC" firstStartedPulling="2025-12-02 09:23:08.747557319 +0000 UTC m=+151.571431198" lastFinishedPulling="2025-12-02 09:24:51.476979728 +0000 UTC m=+254.300853607" observedRunningTime="2025-12-02 09:24:55.526350184 +0000 UTC m=+258.350224063" watchObservedRunningTime="2025-12-02 09:24:55.527962019 +0000 UTC m=+258.351835908" Dec 02 09:24:55 crc kubenswrapper[4781]: I1202 09:24:55.544405 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5f85z" Dec 02 09:24:56 crc kubenswrapper[4781]: I1202 09:24:56.503739 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-844r4" event={"ID":"b28a19d5-e49e-46f0-942d-dc9f96777c2d","Type":"ContainerStarted","Data":"9434a0fe428189ccb4720a2f27db8c4863ab8ce8e708e6acaf092375da20f15b"} Dec 02 09:24:56 crc kubenswrapper[4781]: I1202 09:24:56.739282 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8d48v" Dec 02 09:24:56 crc kubenswrapper[4781]: I1202 09:24:56.739569 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8d48v" Dec 02 09:24:56 crc kubenswrapper[4781]: I1202 09:24:56.783488 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8d48v" Dec 02 09:24:57 crc kubenswrapper[4781]: I1202 09:24:57.643869 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5f85z"] Dec 02 09:24:57 crc kubenswrapper[4781]: I1202 09:24:57.644192 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5f85z" podUID="69ab05d8-7b1c-4314-aa8c-57d38ae2e885" containerName="registry-server" containerID="cri-o://d12ce6cb0aecc5052a9e0ef983084f6cc2a65b98495c01890eacded4a8a1aed0" gracePeriod=2 Dec 02 09:24:58 crc kubenswrapper[4781]: I1202 09:24:58.516089 4781 generic.go:334] "Generic (PLEG): container finished" podID="b28a19d5-e49e-46f0-942d-dc9f96777c2d" containerID="9434a0fe428189ccb4720a2f27db8c4863ab8ce8e708e6acaf092375da20f15b" exitCode=0 Dec 02 09:24:58 crc kubenswrapper[4781]: I1202 09:24:58.516191 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-844r4" event={"ID":"b28a19d5-e49e-46f0-942d-dc9f96777c2d","Type":"ContainerDied","Data":"9434a0fe428189ccb4720a2f27db8c4863ab8ce8e708e6acaf092375da20f15b"} Dec 02 09:24:58 crc kubenswrapper[4781]: I1202 09:24:58.537679 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9bkjh" podStartSLOduration=9.802590795 podStartE2EDuration="1m51.53766322s" podCreationTimestamp="2025-12-02 09:23:07 +0000 UTC" firstStartedPulling="2025-12-02 09:23:09.747382095 +0000 UTC m=+152.571255974" lastFinishedPulling="2025-12-02 09:24:51.48245452 +0000 UTC m=+254.306328399" observedRunningTime="2025-12-02 09:24:58.535685754 
+0000 UTC m=+261.359559643" watchObservedRunningTime="2025-12-02 09:24:58.53766322 +0000 UTC m=+261.361537099" Dec 02 09:25:00 crc kubenswrapper[4781]: I1202 09:25:00.527313 4781 generic.go:334] "Generic (PLEG): container finished" podID="69ab05d8-7b1c-4314-aa8c-57d38ae2e885" containerID="d12ce6cb0aecc5052a9e0ef983084f6cc2a65b98495c01890eacded4a8a1aed0" exitCode=0 Dec 02 09:25:00 crc kubenswrapper[4781]: I1202 09:25:00.527360 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5f85z" event={"ID":"69ab05d8-7b1c-4314-aa8c-57d38ae2e885","Type":"ContainerDied","Data":"d12ce6cb0aecc5052a9e0ef983084f6cc2a65b98495c01890eacded4a8a1aed0"} Dec 02 09:25:04 crc kubenswrapper[4781]: I1202 09:25:04.120804 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5f85z" Dec 02 09:25:04 crc kubenswrapper[4781]: I1202 09:25:04.161298 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69ab05d8-7b1c-4314-aa8c-57d38ae2e885-catalog-content\") pod \"69ab05d8-7b1c-4314-aa8c-57d38ae2e885\" (UID: \"69ab05d8-7b1c-4314-aa8c-57d38ae2e885\") " Dec 02 09:25:04 crc kubenswrapper[4781]: I1202 09:25:04.161364 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pqjd\" (UniqueName: \"kubernetes.io/projected/69ab05d8-7b1c-4314-aa8c-57d38ae2e885-kube-api-access-5pqjd\") pod \"69ab05d8-7b1c-4314-aa8c-57d38ae2e885\" (UID: \"69ab05d8-7b1c-4314-aa8c-57d38ae2e885\") " Dec 02 09:25:04 crc kubenswrapper[4781]: I1202 09:25:04.161408 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69ab05d8-7b1c-4314-aa8c-57d38ae2e885-utilities\") pod \"69ab05d8-7b1c-4314-aa8c-57d38ae2e885\" (UID: \"69ab05d8-7b1c-4314-aa8c-57d38ae2e885\") " Dec 02 09:25:04 crc kubenswrapper[4781]: I1202 09:25:04.162295 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69ab05d8-7b1c-4314-aa8c-57d38ae2e885-utilities" (OuterVolumeSpecName: "utilities") pod "69ab05d8-7b1c-4314-aa8c-57d38ae2e885" (UID: "69ab05d8-7b1c-4314-aa8c-57d38ae2e885"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:25:04 crc kubenswrapper[4781]: I1202 09:25:04.166470 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69ab05d8-7b1c-4314-aa8c-57d38ae2e885-kube-api-access-5pqjd" (OuterVolumeSpecName: "kube-api-access-5pqjd") pod "69ab05d8-7b1c-4314-aa8c-57d38ae2e885" (UID: "69ab05d8-7b1c-4314-aa8c-57d38ae2e885"). InnerVolumeSpecName "kube-api-access-5pqjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:25:04 crc kubenswrapper[4781]: I1202 09:25:04.208027 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69ab05d8-7b1c-4314-aa8c-57d38ae2e885-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "69ab05d8-7b1c-4314-aa8c-57d38ae2e885" (UID: "69ab05d8-7b1c-4314-aa8c-57d38ae2e885"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:25:04 crc kubenswrapper[4781]: I1202 09:25:04.263146 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69ab05d8-7b1c-4314-aa8c-57d38ae2e885-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 09:25:04 crc kubenswrapper[4781]: I1202 09:25:04.263188 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pqjd\" (UniqueName: \"kubernetes.io/projected/69ab05d8-7b1c-4314-aa8c-57d38ae2e885-kube-api-access-5pqjd\") on node \"crc\" DevicePath \"\"" Dec 02 09:25:04 crc kubenswrapper[4781]: I1202 09:25:04.263200 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69ab05d8-7b1c-4314-aa8c-57d38ae2e885-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 09:25:04 crc kubenswrapper[4781]: I1202 09:25:04.549894 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5f85z" event={"ID":"69ab05d8-7b1c-4314-aa8c-57d38ae2e885","Type":"ContainerDied","Data":"a505efafd52bfc5af7cc5405e132cd80711d2bc0e6fbe34a3c6597105c832243"} Dec 02 09:25:04 crc kubenswrapper[4781]: I1202 09:25:04.550161 4781 scope.go:117] "RemoveContainer" containerID="d12ce6cb0aecc5052a9e0ef983084f6cc2a65b98495c01890eacded4a8a1aed0" Dec 02 09:25:04 crc kubenswrapper[4781]: I1202 09:25:04.550016 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5f85z" Dec 02 09:25:04 crc kubenswrapper[4781]: I1202 09:25:04.584793 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5f85z"] Dec 02 09:25:04 crc kubenswrapper[4781]: I1202 09:25:04.588877 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5f85z"] Dec 02 09:25:05 crc kubenswrapper[4781]: I1202 09:25:05.505558 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69ab05d8-7b1c-4314-aa8c-57d38ae2e885" path="/var/lib/kubelet/pods/69ab05d8-7b1c-4314-aa8c-57d38ae2e885/volumes" Dec 02 09:25:05 crc kubenswrapper[4781]: I1202 09:25:05.612204 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" podUID="b7602882-7ecf-4ce4-9cfa-3a691b3f9270" containerName="oauth-openshift" containerID="cri-o://0485e7643026ac1eff407e1fb336969ac13eb338614c1badd2cf6f018962ece0" gracePeriod=15 Dec 02 09:25:05 crc kubenswrapper[4781]: I1202 09:25:05.745001 4781 scope.go:117] "RemoveContainer" containerID="d68c74ec5c5cc62a3386eeed61ef9913209f4644faffd74ac9432ecf2f1c3f9f" Dec 02 09:25:06 crc kubenswrapper[4781]: I1202 09:25:06.561755 4781 generic.go:334] "Generic (PLEG): container finished" podID="b7602882-7ecf-4ce4-9cfa-3a691b3f9270" containerID="0485e7643026ac1eff407e1fb336969ac13eb338614c1badd2cf6f018962ece0" exitCode=0 Dec 02 09:25:06 crc kubenswrapper[4781]: I1202 09:25:06.561803 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" event={"ID":"b7602882-7ecf-4ce4-9cfa-3a691b3f9270","Type":"ContainerDied","Data":"0485e7643026ac1eff407e1fb336969ac13eb338614c1badd2cf6f018962ece0"} Dec 02 09:25:06 crc kubenswrapper[4781]: I1202 09:25:06.793461 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8d48v" Dec 02 09:25:07 crc kubenswrapper[4781]: I1202 
09:25:07.575685 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9bkjh" Dec 02 09:25:07 crc kubenswrapper[4781]: I1202 09:25:07.575751 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9bkjh" Dec 02 09:25:07 crc kubenswrapper[4781]: I1202 09:25:07.648834 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9bkjh" Dec 02 09:25:08 crc kubenswrapper[4781]: I1202 09:25:08.621775 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9bkjh" Dec 02 09:25:10 crc kubenswrapper[4781]: I1202 09:25:10.238205 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8d48v"] Dec 02 09:25:10 crc kubenswrapper[4781]: I1202 09:25:10.238858 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8d48v" podUID="5e5610cf-1c3f-4010-a4ee-4820372400c1" containerName="registry-server" containerID="cri-o://84b5a37e9e31e34b269f94a9df746ac614336bebfa021099b5a461088c7a89e8" gracePeriod=2 Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.312760 4781 scope.go:117] "RemoveContainer" containerID="81263bcef9dd5f6862cde4a7b86658cbb713c144bde22195c7495383610ae853" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.610317 4781 generic.go:334] "Generic (PLEG): container finished" podID="5e5610cf-1c3f-4010-a4ee-4820372400c1" containerID="84b5a37e9e31e34b269f94a9df746ac614336bebfa021099b5a461088c7a89e8" exitCode=0 Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.610417 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8d48v" event={"ID":"5e5610cf-1c3f-4010-a4ee-4820372400c1","Type":"ContainerDied","Data":"84b5a37e9e31e34b269f94a9df746ac614336bebfa021099b5a461088c7a89e8"} Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.777554 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.810901 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djmn4\" (UniqueName: \"kubernetes.io/projected/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-kube-api-access-djmn4\") pod \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.810953 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-user-template-error\") pod \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.810972 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-user-template-login\") pod \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.810994 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-user-template-provider-selection\") pod \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.811028 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-system-trusted-ca-bundle\") pod \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.811054 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-user-idp-0-file-data\") pod \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.811070 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-system-cliconfig\") pod \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.811110 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-audit-dir\") pod \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.811142 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-system-ocp-branding-template\") pod \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " Dec 
02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.811162 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-system-serving-cert\") pod \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.811199 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-system-router-certs\") pod \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.811227 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-audit-policies\") pod \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.811272 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-system-session\") pod \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.811292 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-system-service-ca\") pod \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\" (UID: \"b7602882-7ecf-4ce4-9cfa-3a691b3f9270\") " Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.812191 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "b7602882-7ecf-4ce4-9cfa-3a691b3f9270" (UID: "b7602882-7ecf-4ce4-9cfa-3a691b3f9270"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.812220 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "b7602882-7ecf-4ce4-9cfa-3a691b3f9270" (UID: "b7602882-7ecf-4ce4-9cfa-3a691b3f9270"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.812251 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "b7602882-7ecf-4ce4-9cfa-3a691b3f9270" (UID: "b7602882-7ecf-4ce4-9cfa-3a691b3f9270"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.819398 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "b7602882-7ecf-4ce4-9cfa-3a691b3f9270" (UID: "b7602882-7ecf-4ce4-9cfa-3a691b3f9270"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.819413 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "b7602882-7ecf-4ce4-9cfa-3a691b3f9270" (UID: "b7602882-7ecf-4ce4-9cfa-3a691b3f9270"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.820406 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-86f4ddc759-7xxp5"] Dec 02 09:25:14 crc kubenswrapper[4781]: E1202 09:25:14.820744 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69ab05d8-7b1c-4314-aa8c-57d38ae2e885" containerName="registry-server" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.820850 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="69ab05d8-7b1c-4314-aa8c-57d38ae2e885" containerName="registry-server" Dec 02 09:25:14 crc kubenswrapper[4781]: E1202 09:25:14.820948 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69ab05d8-7b1c-4314-aa8c-57d38ae2e885" containerName="extract-utilities" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.821035 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="69ab05d8-7b1c-4314-aa8c-57d38ae2e885" containerName="extract-utilities" Dec 02 09:25:14 crc kubenswrapper[4781]: E1202 09:25:14.821604 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecddc0fd-4c27-4b71-9f6e-86c294013601" containerName="pruner" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.821706 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecddc0fd-4c27-4b71-9f6e-86c294013601" containerName="pruner" Dec 02 09:25:14 crc kubenswrapper[4781]: E1202 09:25:14.821781 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69ab05d8-7b1c-4314-aa8c-57d38ae2e885" containerName="extract-content" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.821853 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="69ab05d8-7b1c-4314-aa8c-57d38ae2e885" containerName="extract-content" Dec 02 09:25:14 crc kubenswrapper[4781]: E1202 09:25:14.821951 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7602882-7ecf-4ce4-9cfa-3a691b3f9270" containerName="oauth-openshift" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.822040 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7602882-7ecf-4ce4-9cfa-3a691b3f9270" containerName="oauth-openshift" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.822293 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="69ab05d8-7b1c-4314-aa8c-57d38ae2e885" containerName="registry-server" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.851031 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecddc0fd-4c27-4b71-9f6e-86c294013601" containerName="pruner" 
Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.851520 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7602882-7ecf-4ce4-9cfa-3a691b3f9270" containerName="oauth-openshift" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.852087 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-86f4ddc759-7xxp5"] Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.852174 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.835616 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "b7602882-7ecf-4ce4-9cfa-3a691b3f9270" (UID: "b7602882-7ecf-4ce4-9cfa-3a691b3f9270"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.837250 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-kube-api-access-djmn4" (OuterVolumeSpecName: "kube-api-access-djmn4") pod "b7602882-7ecf-4ce4-9cfa-3a691b3f9270" (UID: "b7602882-7ecf-4ce4-9cfa-3a691b3f9270"). InnerVolumeSpecName "kube-api-access-djmn4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.839329 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "b7602882-7ecf-4ce4-9cfa-3a691b3f9270" (UID: "b7602882-7ecf-4ce4-9cfa-3a691b3f9270"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.849441 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "b7602882-7ecf-4ce4-9cfa-3a691b3f9270" (UID: "b7602882-7ecf-4ce4-9cfa-3a691b3f9270"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.851753 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "b7602882-7ecf-4ce4-9cfa-3a691b3f9270" (UID: "b7602882-7ecf-4ce4-9cfa-3a691b3f9270"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.856478 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "b7602882-7ecf-4ce4-9cfa-3a691b3f9270" (UID: "b7602882-7ecf-4ce4-9cfa-3a691b3f9270"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.857960 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "b7602882-7ecf-4ce4-9cfa-3a691b3f9270" (UID: "b7602882-7ecf-4ce4-9cfa-3a691b3f9270"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.858619 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "b7602882-7ecf-4ce4-9cfa-3a691b3f9270" (UID: "b7602882-7ecf-4ce4-9cfa-3a691b3f9270"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.865405 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "b7602882-7ecf-4ce4-9cfa-3a691b3f9270" (UID: "b7602882-7ecf-4ce4-9cfa-3a691b3f9270"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.912740 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7b66cd08-6bc4-48ce-9002-eb87e6ae27eb-v4-0-config-system-service-ca\") pod \"oauth-openshift-86f4ddc759-7xxp5\" (UID: \"7b66cd08-6bc4-48ce-9002-eb87e6ae27eb\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.912785 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7b66cd08-6bc4-48ce-9002-eb87e6ae27eb-audit-policies\") pod \"oauth-openshift-86f4ddc759-7xxp5\" (UID: \"7b66cd08-6bc4-48ce-9002-eb87e6ae27eb\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.912806 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x492j\" (UniqueName: \"kubernetes.io/projected/7b66cd08-6bc4-48ce-9002-eb87e6ae27eb-kube-api-access-x492j\") pod \"oauth-openshift-86f4ddc759-7xxp5\" (UID: \"7b66cd08-6bc4-48ce-9002-eb87e6ae27eb\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.912832 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7b66cd08-6bc4-48ce-9002-eb87e6ae27eb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-86f4ddc759-7xxp5\" (UID: \"7b66cd08-6bc4-48ce-9002-eb87e6ae27eb\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.912855 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7b66cd08-6bc4-48ce-9002-eb87e6ae27eb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-86f4ddc759-7xxp5\" (UID: \"7b66cd08-6bc4-48ce-9002-eb87e6ae27eb\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.912899 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b66cd08-6bc4-48ce-9002-eb87e6ae27eb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-86f4ddc759-7xxp5\" (UID: \"7b66cd08-6bc4-48ce-9002-eb87e6ae27eb\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.912941 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7b66cd08-6bc4-48ce-9002-eb87e6ae27eb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-86f4ddc759-7xxp5\" (UID: \"7b66cd08-6bc4-48ce-9002-eb87e6ae27eb\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.912966 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7b66cd08-6bc4-48ce-9002-eb87e6ae27eb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-86f4ddc759-7xxp5\" (UID: \"7b66cd08-6bc4-48ce-9002-eb87e6ae27eb\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.912984 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7b66cd08-6bc4-48ce-9002-eb87e6ae27eb-v4-0-config-user-template-login\") pod \"oauth-openshift-86f4ddc759-7xxp5\" (UID: \"7b66cd08-6bc4-48ce-9002-eb87e6ae27eb\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.913002 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7b66cd08-6bc4-48ce-9002-eb87e6ae27eb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-86f4ddc759-7xxp5\" (UID: \"7b66cd08-6bc4-48ce-9002-eb87e6ae27eb\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.913068 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7b66cd08-6bc4-48ce-9002-eb87e6ae27eb-audit-dir\") pod \"oauth-openshift-86f4ddc759-7xxp5\" (UID: \"7b66cd08-6bc4-48ce-9002-eb87e6ae27eb\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.913085 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7b66cd08-6bc4-48ce-9002-eb87e6ae27eb-v4-0-config-system-session\") pod \"oauth-openshift-86f4ddc759-7xxp5\" (UID: \"7b66cd08-6bc4-48ce-9002-eb87e6ae27eb\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:14 crc 
kubenswrapper[4781]: I1202 09:25:14.913109 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7b66cd08-6bc4-48ce-9002-eb87e6ae27eb-v4-0-config-system-router-certs\") pod \"oauth-openshift-86f4ddc759-7xxp5\" (UID: \"7b66cd08-6bc4-48ce-9002-eb87e6ae27eb\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.913127 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7b66cd08-6bc4-48ce-9002-eb87e6ae27eb-v4-0-config-user-template-error\") pod \"oauth-openshift-86f4ddc759-7xxp5\" (UID: \"7b66cd08-6bc4-48ce-9002-eb87e6ae27eb\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.913162 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.913172 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.913182 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djmn4\" (UniqueName: \"kubernetes.io/projected/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-kube-api-access-djmn4\") on node \"crc\" DevicePath \"\"" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.913193 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.913203 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.913248 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.913341 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.913593 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.913609 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.913622 4781 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.913636 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.913651 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.913665 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 02 09:25:14 crc kubenswrapper[4781]: I1202 09:25:14.913679 4781 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b7602882-7ecf-4ce4-9cfa-3a691b3f9270-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.015226 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7b66cd08-6bc4-48ce-9002-eb87e6ae27eb-v4-0-config-system-service-ca\") pod \"oauth-openshift-86f4ddc759-7xxp5\" (UID: \"7b66cd08-6bc4-48ce-9002-eb87e6ae27eb\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.015309 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x492j\" (UniqueName: \"kubernetes.io/projected/7b66cd08-6bc4-48ce-9002-eb87e6ae27eb-kube-api-access-x492j\") pod \"oauth-openshift-86f4ddc759-7xxp5\" (UID: \"7b66cd08-6bc4-48ce-9002-eb87e6ae27eb\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.015385 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7b66cd08-6bc4-48ce-9002-eb87e6ae27eb-audit-policies\") pod \"oauth-openshift-86f4ddc759-7xxp5\" (UID: \"7b66cd08-6bc4-48ce-9002-eb87e6ae27eb\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.015440 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7b66cd08-6bc4-48ce-9002-eb87e6ae27eb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-86f4ddc759-7xxp5\" (UID: \"7b66cd08-6bc4-48ce-9002-eb87e6ae27eb\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.015498 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/7b66cd08-6bc4-48ce-9002-eb87e6ae27eb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-86f4ddc759-7xxp5\" (UID: \"7b66cd08-6bc4-48ce-9002-eb87e6ae27eb\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.015553 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b66cd08-6bc4-48ce-9002-eb87e6ae27eb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-86f4ddc759-7xxp5\" (UID: \"7b66cd08-6bc4-48ce-9002-eb87e6ae27eb\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.015601 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7b66cd08-6bc4-48ce-9002-eb87e6ae27eb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-86f4ddc759-7xxp5\" (UID: \"7b66cd08-6bc4-48ce-9002-eb87e6ae27eb\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.015648 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7b66cd08-6bc4-48ce-9002-eb87e6ae27eb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-86f4ddc759-7xxp5\" (UID: \"7b66cd08-6bc4-48ce-9002-eb87e6ae27eb\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.015686 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7b66cd08-6bc4-48ce-9002-eb87e6ae27eb-v4-0-config-user-template-login\") pod \"oauth-openshift-86f4ddc759-7xxp5\" (UID: \"7b66cd08-6bc4-48ce-9002-eb87e6ae27eb\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.015727 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7b66cd08-6bc4-48ce-9002-eb87e6ae27eb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-86f4ddc759-7xxp5\" (UID: \"7b66cd08-6bc4-48ce-9002-eb87e6ae27eb\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.015763 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7b66cd08-6bc4-48ce-9002-eb87e6ae27eb-audit-dir\") pod \"oauth-openshift-86f4ddc759-7xxp5\" (UID: \"7b66cd08-6bc4-48ce-9002-eb87e6ae27eb\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.015796 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7b66cd08-6bc4-48ce-9002-eb87e6ae27eb-v4-0-config-system-session\") pod \"oauth-openshift-86f4ddc759-7xxp5\" (UID: \"7b66cd08-6bc4-48ce-9002-eb87e6ae27eb\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.015846 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/7b66cd08-6bc4-48ce-9002-eb87e6ae27eb-v4-0-config-system-router-certs\") pod \"oauth-openshift-86f4ddc759-7xxp5\" (UID: \"7b66cd08-6bc4-48ce-9002-eb87e6ae27eb\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.015884 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7b66cd08-6bc4-48ce-9002-eb87e6ae27eb-v4-0-config-user-template-error\") pod \"oauth-openshift-86f4ddc759-7xxp5\" (UID: \"7b66cd08-6bc4-48ce-9002-eb87e6ae27eb\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.016811 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7b66cd08-6bc4-48ce-9002-eb87e6ae27eb-v4-0-config-system-service-ca\") pod \"oauth-openshift-86f4ddc759-7xxp5\" (UID: \"7b66cd08-6bc4-48ce-9002-eb87e6ae27eb\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.016966 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b66cd08-6bc4-48ce-9002-eb87e6ae27eb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-86f4ddc759-7xxp5\" (UID: \"7b66cd08-6bc4-48ce-9002-eb87e6ae27eb\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.017876 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7b66cd08-6bc4-48ce-9002-eb87e6ae27eb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-86f4ddc759-7xxp5\" (UID: \"7b66cd08-6bc4-48ce-9002-eb87e6ae27eb\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.018527 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7b66cd08-6bc4-48ce-9002-eb87e6ae27eb-audit-policies\") pod \"oauth-openshift-86f4ddc759-7xxp5\" (UID: \"7b66cd08-6bc4-48ce-9002-eb87e6ae27eb\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.019013 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7b66cd08-6bc4-48ce-9002-eb87e6ae27eb-audit-dir\") pod \"oauth-openshift-86f4ddc759-7xxp5\" (UID: \"7b66cd08-6bc4-48ce-9002-eb87e6ae27eb\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.022487 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7b66cd08-6bc4-48ce-9002-eb87e6ae27eb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-86f4ddc759-7xxp5\" (UID: \"7b66cd08-6bc4-48ce-9002-eb87e6ae27eb\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.022981 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7b66cd08-6bc4-48ce-9002-eb87e6ae27eb-v4-0-config-system-ocp-branding-template\") pod 
\"oauth-openshift-86f4ddc759-7xxp5\" (UID: \"7b66cd08-6bc4-48ce-9002-eb87e6ae27eb\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.023510 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7b66cd08-6bc4-48ce-9002-eb87e6ae27eb-v4-0-config-system-session\") pod \"oauth-openshift-86f4ddc759-7xxp5\" (UID: \"7b66cd08-6bc4-48ce-9002-eb87e6ae27eb\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.023955 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7b66cd08-6bc4-48ce-9002-eb87e6ae27eb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-86f4ddc759-7xxp5\" (UID: \"7b66cd08-6bc4-48ce-9002-eb87e6ae27eb\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.024448 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7b66cd08-6bc4-48ce-9002-eb87e6ae27eb-v4-0-config-system-router-certs\") pod \"oauth-openshift-86f4ddc759-7xxp5\" (UID: \"7b66cd08-6bc4-48ce-9002-eb87e6ae27eb\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.025752 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7b66cd08-6bc4-48ce-9002-eb87e6ae27eb-v4-0-config-user-template-login\") pod \"oauth-openshift-86f4ddc759-7xxp5\" (UID: \"7b66cd08-6bc4-48ce-9002-eb87e6ae27eb\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.026235 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7b66cd08-6bc4-48ce-9002-eb87e6ae27eb-v4-0-config-user-template-error\") pod \"oauth-openshift-86f4ddc759-7xxp5\" (UID: \"7b66cd08-6bc4-48ce-9002-eb87e6ae27eb\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.032360 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7b66cd08-6bc4-48ce-9002-eb87e6ae27eb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-86f4ddc759-7xxp5\" (UID: \"7b66cd08-6bc4-48ce-9002-eb87e6ae27eb\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.033072 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x492j\" (UniqueName: \"kubernetes.io/projected/7b66cd08-6bc4-48ce-9002-eb87e6ae27eb-kube-api-access-x492j\") pod \"oauth-openshift-86f4ddc759-7xxp5\" (UID: \"7b66cd08-6bc4-48ce-9002-eb87e6ae27eb\") " pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.193222 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.198176 4781 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-xqr9t container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.5:6443/healthz\": dial tcp 10.217.0.5:6443: i/o timeout (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.198234 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" podUID="b7602882-7ecf-4ce4-9cfa-3a691b3f9270" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.5:6443/healthz\": dial tcp 10.217.0.5:6443: i/o timeout (Client.Timeout exceeded while awaiting headers)" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.575371 4781 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.576209 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.576862 4781 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.577443 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://08c7d87bc390c31adc6d7412985f5bbfe535ea6dccb586f9be2823a17bce3e2d" gracePeriod=15 Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.577515 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://0179b61af209d72b1e1a79835f2819b3af773ab99b0c431733beba8a4dcf2eeb" gracePeriod=15 Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.577497 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://5356a78651fc6cb4214e10b8b25fd20d7f6acb739abdcf2e1de6dd666a31edaa" gracePeriod=15 Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.577547 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://0090e6856a6d66e284f4f4eecb2d193720318f33a0260d470a50af72c84f6719" gracePeriod=15 Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.577634 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://70a99d096eb7cf9386a857100ae0faff8a3b17140eaada581e5511a88aadc0f8" gracePeriod=15 Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.578027 4781 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 02 09:25:15 crc kubenswrapper[4781]: E1202 
Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.578166 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Dec 02 09:25:15 crc kubenswrapper[4781]: E1202 09:25:15.578178 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.578184 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Dec 02 09:25:15 crc kubenswrapper[4781]: E1202 09:25:15.578193 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.578199 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Dec 02 09:25:15 crc kubenswrapper[4781]: E1202 09:25:15.578208 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.578214 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Dec 02 09:25:15 crc kubenswrapper[4781]: E1202 09:25:15.578222 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.578229 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Dec 02 09:25:15 crc kubenswrapper[4781]: E1202 09:25:15.578235 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.578241 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Dec 02 09:25:15 crc kubenswrapper[4781]: E1202 09:25:15.578252 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.578257 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Dec 02 09:25:15 crc kubenswrapper[4781]: E1202 09:25:15.578266 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.578272 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.578384 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.578394 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.578400 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.578411 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.578419 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.578426 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.578594 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.623500 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.624308 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.624387 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.624428 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.624474 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.624528 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.624606 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.624645 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.624710 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.630663 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" event={"ID":"b7602882-7ecf-4ce4-9cfa-3a691b3f9270","Type":"ContainerDied","Data":"6cd9150045b3a149957ee05f6251338340bbdeff9bb96634f5abadbd8f1bc942"} Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.630915 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.631821 4781 status_manager.go:851] "Failed to get status for pod" podUID="b7602882-7ecf-4ce4-9cfa-3a691b3f9270" pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xqr9t\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.632199 4781 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.632609 4781 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.635453 4781 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.635730 4781 status_manager.go:851] "Failed to get status for pod" podUID="b7602882-7ecf-4ce4-9cfa-3a691b3f9270" pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xqr9t\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.635997 4781 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.726429 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.726577 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.726787 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.727094 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.727150 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.727216 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.727266 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.727297 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod 
\"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.727364 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.727351 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.727455 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.727513 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.727514 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.727556 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.727581 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.727894 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 09:25:15 crc kubenswrapper[4781]: I1202 09:25:15.925382 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.236475 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8d48v" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.237236 4781 status_manager.go:851] "Failed to get status for pod" podUID="5e5610cf-1c3f-4010-a4ee-4820372400c1" pod="openshift-marketplace/redhat-marketplace-8d48v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8d48v\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.237435 4781 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.237717 4781 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.238167 4781 status_manager.go:851] "Failed to get status for pod" podUID="b7602882-7ecf-4ce4-9cfa-3a691b3f9270" pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xqr9t\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.320589 4781 scope.go:117] "RemoveContainer" containerID="0485e7643026ac1eff407e1fb336969ac13eb338614c1badd2cf6f018962ece0" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.334815 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e5610cf-1c3f-4010-a4ee-4820372400c1-utilities\") pod \"5e5610cf-1c3f-4010-a4ee-4820372400c1\" (UID: \"5e5610cf-1c3f-4010-a4ee-4820372400c1\") " Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.334999 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e5610cf-1c3f-4010-a4ee-4820372400c1-catalog-content\") pod \"5e5610cf-1c3f-4010-a4ee-4820372400c1\" (UID: \"5e5610cf-1c3f-4010-a4ee-4820372400c1\") " Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.335088 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lmdl\" (UniqueName: \"kubernetes.io/projected/5e5610cf-1c3f-4010-a4ee-4820372400c1-kube-api-access-7lmdl\") pod \"5e5610cf-1c3f-4010-a4ee-4820372400c1\" (UID: \"5e5610cf-1c3f-4010-a4ee-4820372400c1\") " Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.336051 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e5610cf-1c3f-4010-a4ee-4820372400c1-utilities" (OuterVolumeSpecName: "utilities") pod "5e5610cf-1c3f-4010-a4ee-4820372400c1" (UID: "5e5610cf-1c3f-4010-a4ee-4820372400c1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:25:16 crc kubenswrapper[4781]: E1202 09:25:16.339292 4781 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.194:6443: connect: connection refused" event="&Event{ObjectMeta:{redhat-marketplace-8pgk8.187d5bc1aa064974 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-marketplace-8pgk8,UID:c4734fd7-42d6-4b87-9160-5a3471f91d03,APIVersion:v1,ResourceVersion:28288,FieldPath:spec.initContainers{extract-content},},Reason:Pulled,Message:Successfully pulled image \"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\" in 25.829s (25.829s including waiting). Image size: 1129027903 bytes.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-02 09:25:16.338456948 +0000 UTC m=+279.162330817,LastTimestamp:2025-12-02 09:25:16.338456948 +0000 UTC m=+279.162330817,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.339580 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e5610cf-1c3f-4010-a4ee-4820372400c1-kube-api-access-7lmdl" (OuterVolumeSpecName: "kube-api-access-7lmdl") pod "5e5610cf-1c3f-4010-a4ee-4820372400c1" (UID: "5e5610cf-1c3f-4010-a4ee-4820372400c1"). InnerVolumeSpecName "kube-api-access-7lmdl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.352009 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e5610cf-1c3f-4010-a4ee-4820372400c1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5e5610cf-1c3f-4010-a4ee-4820372400c1" (UID: "5e5610cf-1c3f-4010-a4ee-4820372400c1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:25:16 crc kubenswrapper[4781]: W1202 09:25:16.387957 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-768d1d0bb4f6f68a3454fdf941716d74d7ec074f367e6982512f249eb3f8dc38 WatchSource:0}: Error finding container 768d1d0bb4f6f68a3454fdf941716d74d7ec074f367e6982512f249eb3f8dc38: Status 404 returned error can't find the container with id 768d1d0bb4f6f68a3454fdf941716d74d7ec074f367e6982512f249eb3f8dc38 Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.436683 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e5610cf-1c3f-4010-a4ee-4820372400c1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.436714 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lmdl\" (UniqueName: \"kubernetes.io/projected/5e5610cf-1c3f-4010-a4ee-4820372400c1-kube-api-access-7lmdl\") on node \"crc\" DevicePath \"\"" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.436724 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e5610cf-1c3f-4010-a4ee-4820372400c1-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.645204 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.646800 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.647703 4781 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5356a78651fc6cb4214e10b8b25fd20d7f6acb739abdcf2e1de6dd666a31edaa" exitCode=0 Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.647736 4781 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="70a99d096eb7cf9386a857100ae0faff8a3b17140eaada581e5511a88aadc0f8" exitCode=0 Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.647748 4781 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0090e6856a6d66e284f4f4eecb2d193720318f33a0260d470a50af72c84f6719" exitCode=0 Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.647760 4781 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0179b61af209d72b1e1a79835f2819b3af773ab99b0c431733beba8a4dcf2eeb" exitCode=2 Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.647836 4781 scope.go:117] "RemoveContainer" containerID="e1ea18bccf32314c5d384baf0c3a52d54c5668467596840ee1397399c18ab7f0" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.653277 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8pgk8" event={"ID":"c4734fd7-42d6-4b87-9160-5a3471f91d03","Type":"ContainerStarted","Data":"ec9535118e1e08b778718f3770ed8a9a99709a23f3d4ed856e36a491596ff221"} Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.654576 4781 status_manager.go:851] "Failed to get status for pod" 
podUID="c4734fd7-42d6-4b87-9160-5a3471f91d03" pod="openshift-marketplace/redhat-marketplace-8pgk8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8pgk8\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.654777 4781 status_manager.go:851] "Failed to get status for pod" podUID="5e5610cf-1c3f-4010-a4ee-4820372400c1" pod="openshift-marketplace/redhat-marketplace-8d48v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8d48v\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.654996 4781 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.655191 4781 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.655573 4781 status_manager.go:851] "Failed to get status for pod" podUID="b7602882-7ecf-4ce4-9cfa-3a691b3f9270" pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xqr9t\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.659094 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8d48v" event={"ID":"5e5610cf-1c3f-4010-a4ee-4820372400c1","Type":"ContainerDied","Data":"45f718a8ef20054dc81ffccfb02f3a01180398d4c6ff5126953f53be8c08d080"} Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.659271 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8d48v" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.660435 4781 status_manager.go:851] "Failed to get status for pod" podUID="c4734fd7-42d6-4b87-9160-5a3471f91d03" pod="openshift-marketplace/redhat-marketplace-8pgk8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8pgk8\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.661044 4781 status_manager.go:851] "Failed to get status for pod" podUID="5e5610cf-1c3f-4010-a4ee-4820372400c1" pod="openshift-marketplace/redhat-marketplace-8d48v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8d48v\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.661595 4781 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.662078 4781 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.662438 4781 status_manager.go:851] "Failed to get status for pod" podUID="b7602882-7ecf-4ce4-9cfa-3a691b3f9270" pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xqr9t\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.673374 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t59n7" event={"ID":"0a0d43df-4480-4b62-bd3d-d129fbdcd722","Type":"ContainerStarted","Data":"12da46a80c1ab5d1ec3d83da4241389ab56aacfeabcef93a32503751f5f4a13d"} Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.674189 4781 status_manager.go:851] "Failed to get status for pod" podUID="0a0d43df-4480-4b62-bd3d-d129fbdcd722" pod="openshift-marketplace/certified-operators-t59n7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t59n7\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.674583 4781 status_manager.go:851] "Failed to get status for pod" podUID="c4734fd7-42d6-4b87-9160-5a3471f91d03" pod="openshift-marketplace/redhat-marketplace-8pgk8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8pgk8\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.675054 4781 status_manager.go:851] "Failed to get status for pod" podUID="5e5610cf-1c3f-4010-a4ee-4820372400c1" pod="openshift-marketplace/redhat-marketplace-8d48v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8d48v\": dial tcp 
38.102.83.194:6443: connect: connection refused" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.677168 4781 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.677627 4781 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.677994 4781 status_manager.go:851] "Failed to get status for pod" podUID="b7602882-7ecf-4ce4-9cfa-3a691b3f9270" pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xqr9t\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.678715 4781 status_manager.go:851] "Failed to get status for pod" podUID="b7602882-7ecf-4ce4-9cfa-3a691b3f9270" pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xqr9t\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.678828 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4rfsl" event={"ID":"f37a1838-a130-4cb4-807f-2214cbfefdbf","Type":"ContainerStarted","Data":"efca497eaa2387782b2dcd379186015fe7ef578c950f2d847acc2e88bdda90f8"} Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.679883 4781 status_manager.go:851] "Failed to get status for pod" podUID="0a0d43df-4480-4b62-bd3d-d129fbdcd722" pod="openshift-marketplace/certified-operators-t59n7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t59n7\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.680157 4781 status_manager.go:851] "Failed to get status for pod" podUID="c4734fd7-42d6-4b87-9160-5a3471f91d03" pod="openshift-marketplace/redhat-marketplace-8pgk8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8pgk8\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.680418 4781 status_manager.go:851] "Failed to get status for pod" podUID="5e5610cf-1c3f-4010-a4ee-4820372400c1" pod="openshift-marketplace/redhat-marketplace-8d48v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8d48v\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.680669 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"768d1d0bb4f6f68a3454fdf941716d74d7ec074f367e6982512f249eb3f8dc38"} Dec 02 09:25:16 crc 
kubenswrapper[4781]: I1202 09:25:16.680674 4781 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.681057 4781 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.682807 4781 status_manager.go:851] "Failed to get status for pod" podUID="0a0d43df-4480-4b62-bd3d-d129fbdcd722" pod="openshift-marketplace/certified-operators-t59n7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t59n7\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.683243 4781 status_manager.go:851] "Failed to get status for pod" podUID="5e5610cf-1c3f-4010-a4ee-4820372400c1" pod="openshift-marketplace/redhat-marketplace-8d48v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8d48v\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.683433 4781 status_manager.go:851] "Failed to get status for pod" podUID="c4734fd7-42d6-4b87-9160-5a3471f91d03" pod="openshift-marketplace/redhat-marketplace-8pgk8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8pgk8\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.683718 4781 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.684080 4781 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.684350 4781 status_manager.go:851] "Failed to get status for pod" podUID="b7602882-7ecf-4ce4-9cfa-3a691b3f9270" pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xqr9t\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.684560 4781 status_manager.go:851] "Failed to get status for pod" podUID="f37a1838-a130-4cb4-807f-2214cbfefdbf" pod="openshift-marketplace/redhat-operators-4rfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4rfsl\": dial tcp 
38.102.83.194:6443: connect: connection refused" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.684884 4781 generic.go:334] "Generic (PLEG): container finished" podID="5df92e64-6fc4-4395-a919-b1c46c30d318" containerID="b379c0d2eac368298eab4be8f662fdc974c6249f084832b7e5d41418b80e58ca" exitCode=0 Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.684983 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5df92e64-6fc4-4395-a919-b1c46c30d318","Type":"ContainerDied","Data":"b379c0d2eac368298eab4be8f662fdc974c6249f084832b7e5d41418b80e58ca"} Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.685646 4781 status_manager.go:851] "Failed to get status for pod" podUID="0a0d43df-4480-4b62-bd3d-d129fbdcd722" pod="openshift-marketplace/certified-operators-t59n7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t59n7\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.685938 4781 status_manager.go:851] "Failed to get status for pod" podUID="c4734fd7-42d6-4b87-9160-5a3471f91d03" pod="openshift-marketplace/redhat-marketplace-8pgk8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8pgk8\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.686196 4781 status_manager.go:851] "Failed to get status for pod" podUID="5e5610cf-1c3f-4010-a4ee-4820372400c1" pod="openshift-marketplace/redhat-marketplace-8d48v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8d48v\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.688557 4781 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.689216 4781 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.689618 4781 status_manager.go:851] "Failed to get status for pod" podUID="b7602882-7ecf-4ce4-9cfa-3a691b3f9270" pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xqr9t\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.689849 4781 status_manager.go:851] "Failed to get status for pod" podUID="5df92e64-6fc4-4395-a919-b1c46c30d318" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.690129 4781 status_manager.go:851] "Failed to get status for pod" 
podUID="f37a1838-a130-4cb4-807f-2214cbfefdbf" pod="openshift-marketplace/redhat-operators-4rfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4rfsl\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.691007 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-844r4" event={"ID":"b28a19d5-e49e-46f0-942d-dc9f96777c2d","Type":"ContainerStarted","Data":"88a12b393ba226860379c973a8763acbf05e3bb790f5d35eec7ad9ddd42b07d9"} Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.691758 4781 status_manager.go:851] "Failed to get status for pod" podUID="0a0d43df-4480-4b62-bd3d-d129fbdcd722" pod="openshift-marketplace/certified-operators-t59n7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t59n7\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.692079 4781 status_manager.go:851] "Failed to get status for pod" podUID="c4734fd7-42d6-4b87-9160-5a3471f91d03" pod="openshift-marketplace/redhat-marketplace-8pgk8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8pgk8\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.692335 4781 status_manager.go:851] "Failed to get status for pod" podUID="5e5610cf-1c3f-4010-a4ee-4820372400c1" pod="openshift-marketplace/redhat-marketplace-8d48v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8d48v\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.692635 4781 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.692889 4781 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.693158 4781 status_manager.go:851] "Failed to get status for pod" podUID="b28a19d5-e49e-46f0-942d-dc9f96777c2d" pod="openshift-marketplace/certified-operators-844r4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-844r4\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.693224 4781 scope.go:117] "RemoveContainer" containerID="84b5a37e9e31e34b269f94a9df746ac614336bebfa021099b5a461088c7a89e8" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.693412 4781 status_manager.go:851] "Failed to get status for pod" podUID="b7602882-7ecf-4ce4-9cfa-3a691b3f9270" pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xqr9t\": dial tcp 38.102.83.194:6443: 
connect: connection refused" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.693644 4781 status_manager.go:851] "Failed to get status for pod" podUID="5df92e64-6fc4-4395-a919-b1c46c30d318" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.693854 4781 status_manager.go:851] "Failed to get status for pod" podUID="f37a1838-a130-4cb4-807f-2214cbfefdbf" pod="openshift-marketplace/redhat-operators-4rfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4rfsl\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.708232 4781 scope.go:117] "RemoveContainer" containerID="b2b5b902915f6abcc30db04918b15e3bd8b36b9bd586103997e094fee178c945" Dec 02 09:25:16 crc kubenswrapper[4781]: I1202 09:25:16.726475 4781 scope.go:117] "RemoveContainer" containerID="47f1a0b6ce04c530d53529205258ac8dcb8c1a913fe37964cd4a0728da949624" Dec 02 09:25:17 crc kubenswrapper[4781]: E1202 09:25:17.047510 4781 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 02 09:25:17 crc kubenswrapper[4781]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-86f4ddc759-7xxp5_openshift-authentication_7b66cd08-6bc4-48ce-9002-eb87e6ae27eb_0(3eb3a11a198666ea7b2e9102ec53d9d28bd14942a5e35707b6073535b6d6e21b): error adding pod openshift-authentication_oauth-openshift-86f4ddc759-7xxp5 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"3eb3a11a198666ea7b2e9102ec53d9d28bd14942a5e35707b6073535b6d6e21b" Netns:"/var/run/netns/ed4c00e7-f5dd-42ae-8d5d-3b7fc921086f" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-86f4ddc759-7xxp5;K8S_POD_INFRA_CONTAINER_ID=3eb3a11a198666ea7b2e9102ec53d9d28bd14942a5e35707b6073535b6d6e21b;K8S_POD_UID=7b66cd08-6bc4-48ce-9002-eb87e6ae27eb" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-86f4ddc759-7xxp5] networking: Multus: [openshift-authentication/oauth-openshift-86f4ddc759-7xxp5/7b66cd08-6bc4-48ce-9002-eb87e6ae27eb]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-86f4ddc759-7xxp5 in out of cluster comm: SetNetworkStatus: failed to update the pod oauth-openshift-86f4ddc759-7xxp5 in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-86f4ddc759-7xxp5?timeout=1m0s": dial tcp 38.102.83.194:6443: connect: connection refused Dec 02 09:25:17 crc kubenswrapper[4781]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 02 09:25:17 crc kubenswrapper[4781]: > Dec 02 09:25:17 crc kubenswrapper[4781]: E1202 09:25:17.047828 4781 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 02 09:25:17 crc kubenswrapper[4781]: rpc error: code = 
Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-86f4ddc759-7xxp5_openshift-authentication_7b66cd08-6bc4-48ce-9002-eb87e6ae27eb_0(3eb3a11a198666ea7b2e9102ec53d9d28bd14942a5e35707b6073535b6d6e21b): error adding pod openshift-authentication_oauth-openshift-86f4ddc759-7xxp5 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"3eb3a11a198666ea7b2e9102ec53d9d28bd14942a5e35707b6073535b6d6e21b" Netns:"/var/run/netns/ed4c00e7-f5dd-42ae-8d5d-3b7fc921086f" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-86f4ddc759-7xxp5;K8S_POD_INFRA_CONTAINER_ID=3eb3a11a198666ea7b2e9102ec53d9d28bd14942a5e35707b6073535b6d6e21b;K8S_POD_UID=7b66cd08-6bc4-48ce-9002-eb87e6ae27eb" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-86f4ddc759-7xxp5] networking: Multus: [openshift-authentication/oauth-openshift-86f4ddc759-7xxp5/7b66cd08-6bc4-48ce-9002-eb87e6ae27eb]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-86f4ddc759-7xxp5 in out of cluster comm: SetNetworkStatus: failed to update the pod oauth-openshift-86f4ddc759-7xxp5 in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-86f4ddc759-7xxp5?timeout=1m0s": dial tcp 38.102.83.194:6443: connect: connection refused Dec 02 09:25:17 crc kubenswrapper[4781]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 02 09:25:17 crc kubenswrapper[4781]: > pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:17 crc kubenswrapper[4781]: E1202 09:25:17.047853 4781 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Dec 02 09:25:17 crc kubenswrapper[4781]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-86f4ddc759-7xxp5_openshift-authentication_7b66cd08-6bc4-48ce-9002-eb87e6ae27eb_0(3eb3a11a198666ea7b2e9102ec53d9d28bd14942a5e35707b6073535b6d6e21b): error adding pod openshift-authentication_oauth-openshift-86f4ddc759-7xxp5 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"3eb3a11a198666ea7b2e9102ec53d9d28bd14942a5e35707b6073535b6d6e21b" Netns:"/var/run/netns/ed4c00e7-f5dd-42ae-8d5d-3b7fc921086f" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-86f4ddc759-7xxp5;K8S_POD_INFRA_CONTAINER_ID=3eb3a11a198666ea7b2e9102ec53d9d28bd14942a5e35707b6073535b6d6e21b;K8S_POD_UID=7b66cd08-6bc4-48ce-9002-eb87e6ae27eb" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-86f4ddc759-7xxp5] networking: Multus: [openshift-authentication/oauth-openshift-86f4ddc759-7xxp5/7b66cd08-6bc4-48ce-9002-eb87e6ae27eb]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-86f4ddc759-7xxp5 in out of cluster comm: SetNetworkStatus: failed to update the pod 
oauth-openshift-86f4ddc759-7xxp5 in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-86f4ddc759-7xxp5?timeout=1m0s": dial tcp 38.102.83.194:6443: connect: connection refused Dec 02 09:25:17 crc kubenswrapper[4781]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 02 09:25:17 crc kubenswrapper[4781]: > pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:17 crc kubenswrapper[4781]: E1202 09:25:17.047955 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"oauth-openshift-86f4ddc759-7xxp5_openshift-authentication(7b66cd08-6bc4-48ce-9002-eb87e6ae27eb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"oauth-openshift-86f4ddc759-7xxp5_openshift-authentication(7b66cd08-6bc4-48ce-9002-eb87e6ae27eb)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-86f4ddc759-7xxp5_openshift-authentication_7b66cd08-6bc4-48ce-9002-eb87e6ae27eb_0(3eb3a11a198666ea7b2e9102ec53d9d28bd14942a5e35707b6073535b6d6e21b): error adding pod openshift-authentication_oauth-openshift-86f4ddc759-7xxp5 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"3eb3a11a198666ea7b2e9102ec53d9d28bd14942a5e35707b6073535b6d6e21b\\\" Netns:\\\"/var/run/netns/ed4c00e7-f5dd-42ae-8d5d-3b7fc921086f\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-86f4ddc759-7xxp5;K8S_POD_INFRA_CONTAINER_ID=3eb3a11a198666ea7b2e9102ec53d9d28bd14942a5e35707b6073535b6d6e21b;K8S_POD_UID=7b66cd08-6bc4-48ce-9002-eb87e6ae27eb\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-86f4ddc759-7xxp5] networking: Multus: [openshift-authentication/oauth-openshift-86f4ddc759-7xxp5/7b66cd08-6bc4-48ce-9002-eb87e6ae27eb]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-86f4ddc759-7xxp5 in out of cluster comm: SetNetworkStatus: failed to update the pod oauth-openshift-86f4ddc759-7xxp5 in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-86f4ddc759-7xxp5?timeout=1m0s\\\": dial tcp 38.102.83.194:6443: connect: connection refused\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" podUID="7b66cd08-6bc4-48ce-9002-eb87e6ae27eb" Dec 02 09:25:17 crc kubenswrapper[4781]: E1202 09:25:17.183355 4781 event.go:368] "Unable to write 
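The three CreatePodSandbox entries above carry the same deeply nested error three times; the actionable part is the innermost failure, which shows multus failing the CNI ADD only because it cannot reach the apiserver to write the pod's network-status annotation. A sketch that digs the innermost `Get "...": ...` cause out of such a message; this is a string heuristic over the log text shown above, not a real CNI error parser:

```go
package main

import (
	"fmt"
	"regexp"
)

func main() {
	// Abbreviated tail of the nested error from the entries above.
	msg := `... error setting the networks status: SetPodNetworkStatusAnnotation: ` +
		`failed to update the pod oauth-openshift-86f4ddc759-7xxp5 in out of cluster comm: ` +
		`status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/...": ` +
		`dial tcp 38.102.83.194:6443: connect: connection refused`

	// In these chains the root cause is the last `Get "...": ...` segment;
	// a greedy prefix skips the earlier wrapping layers.
	re := regexp.MustCompile(`.*(Get "[^"]+": .+)$`)
	if m := re.FindStringSubmatch(msg); m != nil {
		fmt.Println("root cause:", m[1])
	}
}
```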
event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.194:6443: connect: connection refused" event="&Event{ObjectMeta:{redhat-marketplace-8pgk8.187d5bc1aa064974 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-marketplace-8pgk8,UID:c4734fd7-42d6-4b87-9160-5a3471f91d03,APIVersion:v1,ResourceVersion:28288,FieldPath:spec.initContainers{extract-content},},Reason:Pulled,Message:Successfully pulled image \"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\" in 25.829s (25.829s including waiting). Image size: 1129027903 bytes.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-02 09:25:16.338456948 +0000 UTC m=+279.162330817,LastTimestamp:2025-12-02 09:25:16.338456948 +0000 UTC m=+279.162330817,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 02 09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.284523 4781 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" start-of-body= Dec 02 09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.284587 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" Dec 02 09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.507154 4781 status_manager.go:851] "Failed to get status for pod" podUID="b28a19d5-e49e-46f0-942d-dc9f96777c2d" pod="openshift-marketplace/certified-operators-844r4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-844r4\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.507571 4781 status_manager.go:851] "Failed to get status for pod" podUID="b7602882-7ecf-4ce4-9cfa-3a691b3f9270" pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xqr9t\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.508114 4781 status_manager.go:851] "Failed to get status for pod" podUID="5df92e64-6fc4-4395-a919-b1c46c30d318" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.509088 4781 status_manager.go:851] "Failed to get status for pod" podUID="f37a1838-a130-4cb4-807f-2214cbfefdbf" pod="openshift-marketplace/redhat-operators-4rfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4rfsl\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.509502 4781 status_manager.go:851] "Failed to get status for pod" podUID="0a0d43df-4480-4b62-bd3d-d129fbdcd722" 
pod="openshift-marketplace/certified-operators-t59n7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t59n7\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.509740 4781 status_manager.go:851] "Failed to get status for pod" podUID="c4734fd7-42d6-4b87-9160-5a3471f91d03" pod="openshift-marketplace/redhat-marketplace-8pgk8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8pgk8\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.509963 4781 status_manager.go:851] "Failed to get status for pod" podUID="5e5610cf-1c3f-4010-a4ee-4820372400c1" pod="openshift-marketplace/redhat-marketplace-8d48v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8d48v\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.510226 4781 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.510458 4781 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.719974 4781 generic.go:334] "Generic (PLEG): container finished" podID="0a0d43df-4480-4b62-bd3d-d129fbdcd722" containerID="12da46a80c1ab5d1ec3d83da4241389ab56aacfeabcef93a32503751f5f4a13d" exitCode=0 Dec 02 09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.720157 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t59n7" event={"ID":"0a0d43df-4480-4b62-bd3d-d129fbdcd722","Type":"ContainerDied","Data":"12da46a80c1ab5d1ec3d83da4241389ab56aacfeabcef93a32503751f5f4a13d"} Dec 02 09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.721224 4781 status_manager.go:851] "Failed to get status for pod" podUID="b28a19d5-e49e-46f0-942d-dc9f96777c2d" pod="openshift-marketplace/certified-operators-844r4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-844r4\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.721654 4781 status_manager.go:851] "Failed to get status for pod" podUID="b7602882-7ecf-4ce4-9cfa-3a691b3f9270" pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xqr9t\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.722092 4781 status_manager.go:851] "Failed to get status for pod" podUID="5df92e64-6fc4-4395-a919-b1c46c30d318" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 
38.102.83.194:6443: connect: connection refused" Dec 02 09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.722601 4781 status_manager.go:851] "Failed to get status for pod" podUID="f37a1838-a130-4cb4-807f-2214cbfefdbf" pod="openshift-marketplace/redhat-operators-4rfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4rfsl\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.722868 4781 status_manager.go:851] "Failed to get status for pod" podUID="0a0d43df-4480-4b62-bd3d-d129fbdcd722" pod="openshift-marketplace/certified-operators-t59n7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t59n7\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.723149 4781 status_manager.go:851] "Failed to get status for pod" podUID="c4734fd7-42d6-4b87-9160-5a3471f91d03" pod="openshift-marketplace/redhat-marketplace-8pgk8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8pgk8\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.723370 4781 status_manager.go:851] "Failed to get status for pod" podUID="5e5610cf-1c3f-4010-a4ee-4820372400c1" pod="openshift-marketplace/redhat-marketplace-8d48v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8d48v\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.724324 4781 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.727087 4781 generic.go:334] "Generic (PLEG): container finished" podID="f37a1838-a130-4cb4-807f-2214cbfefdbf" containerID="efca497eaa2387782b2dcd379186015fe7ef578c950f2d847acc2e88bdda90f8" exitCode=0 Dec 02 09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.727129 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4rfsl" event={"ID":"f37a1838-a130-4cb4-807f-2214cbfefdbf","Type":"ContainerDied","Data":"efca497eaa2387782b2dcd379186015fe7ef578c950f2d847acc2e88bdda90f8"} Dec 02 09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.731504 4781 status_manager.go:851] "Failed to get status for pod" podUID="0a0d43df-4480-4b62-bd3d-d129fbdcd722" pod="openshift-marketplace/certified-operators-t59n7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t59n7\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.731852 4781 status_manager.go:851] "Failed to get status for pod" podUID="c4734fd7-42d6-4b87-9160-5a3471f91d03" pod="openshift-marketplace/redhat-marketplace-8pgk8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8pgk8\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.732197 4781 status_manager.go:851] "Failed to get status for pod" 
podUID="5e5610cf-1c3f-4010-a4ee-4820372400c1" pod="openshift-marketplace/redhat-marketplace-8d48v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8d48v\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.732665 4781 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.732902 4781 status_manager.go:851] "Failed to get status for pod" podUID="b28a19d5-e49e-46f0-942d-dc9f96777c2d" pod="openshift-marketplace/certified-operators-844r4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-844r4\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.733110 4781 status_manager.go:851] "Failed to get status for pod" podUID="b7602882-7ecf-4ce4-9cfa-3a691b3f9270" pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xqr9t\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.733630 4781 status_manager.go:851] "Failed to get status for pod" podUID="5df92e64-6fc4-4395-a919-b1c46c30d318" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.733858 4781 status_manager.go:851] "Failed to get status for pod" podUID="f37a1838-a130-4cb4-807f-2214cbfefdbf" pod="openshift-marketplace/redhat-operators-4rfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4rfsl\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.740416 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 02 09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.749667 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"a0c690fa3b5f0b02ca2432837f85440390764d9c722105890e90188c5b4a9baa"} Dec 02 09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.754636 4781 status_manager.go:851] "Failed to get status for pod" podUID="5e5610cf-1c3f-4010-a4ee-4820372400c1" pod="openshift-marketplace/redhat-marketplace-8d48v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8d48v\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.754984 4781 status_manager.go:851] "Failed to get status for pod" podUID="c4734fd7-42d6-4b87-9160-5a3471f91d03" pod="openshift-marketplace/redhat-marketplace-8pgk8" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8pgk8\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.755178 4781 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.755360 4781 status_manager.go:851] "Failed to get status for pod" podUID="b28a19d5-e49e-46f0-942d-dc9f96777c2d" pod="openshift-marketplace/certified-operators-844r4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-844r4\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.755540 4781 status_manager.go:851] "Failed to get status for pod" podUID="5df92e64-6fc4-4395-a919-b1c46c30d318" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.756529 4781 status_manager.go:851] "Failed to get status for pod" podUID="b7602882-7ecf-4ce4-9cfa-3a691b3f9270" pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xqr9t\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.757272 4781 status_manager.go:851] "Failed to get status for pod" podUID="f37a1838-a130-4cb4-807f-2214cbfefdbf" pod="openshift-marketplace/redhat-operators-4rfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4rfsl\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.757714 4781 status_manager.go:851] "Failed to get status for pod" podUID="0a0d43df-4480-4b62-bd3d-d129fbdcd722" pod="openshift-marketplace/certified-operators-t59n7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t59n7\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.765936 4781 generic.go:334] "Generic (PLEG): container finished" podID="c4734fd7-42d6-4b87-9160-5a3471f91d03" containerID="ec9535118e1e08b778718f3770ed8a9a99709a23f3d4ed856e36a491596ff221" exitCode=0 Dec 02 09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.766245 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8pgk8" event={"ID":"c4734fd7-42d6-4b87-9160-5a3471f91d03","Type":"ContainerDied","Data":"ec9535118e1e08b778718f3770ed8a9a99709a23f3d4ed856e36a491596ff221"} Dec 02 09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.767987 4781 status_manager.go:851] "Failed to get status for pod" podUID="0a0d43df-4480-4b62-bd3d-d129fbdcd722" pod="openshift-marketplace/certified-operators-t59n7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t59n7\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 
09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.768588 4781 status_manager.go:851] "Failed to get status for pod" podUID="c4734fd7-42d6-4b87-9160-5a3471f91d03" pod="openshift-marketplace/redhat-marketplace-8pgk8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8pgk8\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.769016 4781 status_manager.go:851] "Failed to get status for pod" podUID="5e5610cf-1c3f-4010-a4ee-4820372400c1" pod="openshift-marketplace/redhat-marketplace-8d48v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8d48v\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.769344 4781 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.769550 4781 status_manager.go:851] "Failed to get status for pod" podUID="b28a19d5-e49e-46f0-942d-dc9f96777c2d" pod="openshift-marketplace/certified-operators-844r4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-844r4\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.769747 4781 status_manager.go:851] "Failed to get status for pod" podUID="5df92e64-6fc4-4395-a919-b1c46c30d318" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.771307 4781 status_manager.go:851] "Failed to get status for pod" podUID="b7602882-7ecf-4ce4-9cfa-3a691b3f9270" pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xqr9t\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:17 crc kubenswrapper[4781]: I1202 09:25:17.771749 4781 status_manager.go:851] "Failed to get status for pod" podUID="f37a1838-a130-4cb4-807f-2214cbfefdbf" pod="openshift-marketplace/redhat-operators-4rfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4rfsl\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.015790 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.017085 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.017812 4781 status_manager.go:851] "Failed to get status for pod" podUID="b28a19d5-e49e-46f0-942d-dc9f96777c2d" pod="openshift-marketplace/certified-operators-844r4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-844r4\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.018311 4781 status_manager.go:851] "Failed to get status for pod" podUID="5df92e64-6fc4-4395-a919-b1c46c30d318" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.018620 4781 status_manager.go:851] "Failed to get status for pod" podUID="b7602882-7ecf-4ce4-9cfa-3a691b3f9270" pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xqr9t\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.018898 4781 status_manager.go:851] "Failed to get status for pod" podUID="f37a1838-a130-4cb4-807f-2214cbfefdbf" pod="openshift-marketplace/redhat-operators-4rfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4rfsl\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.019180 4781 status_manager.go:851] "Failed to get status for pod" podUID="0a0d43df-4480-4b62-bd3d-d129fbdcd722" pod="openshift-marketplace/certified-operators-t59n7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t59n7\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.019393 4781 status_manager.go:851] "Failed to get status for pod" podUID="c4734fd7-42d6-4b87-9160-5a3471f91d03" pod="openshift-marketplace/redhat-marketplace-8pgk8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8pgk8\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.019572 4781 status_manager.go:851] "Failed to get status for pod" podUID="5e5610cf-1c3f-4010-a4ee-4820372400c1" pod="openshift-marketplace/redhat-marketplace-8d48v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8d48v\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.019740 4781 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.020030 4781 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.020448 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.020989 4781 status_manager.go:851] "Failed to get status for pod" podUID="f37a1838-a130-4cb4-807f-2214cbfefdbf" pod="openshift-marketplace/redhat-operators-4rfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4rfsl\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.021238 4781 status_manager.go:851] "Failed to get status for pod" podUID="0a0d43df-4480-4b62-bd3d-d129fbdcd722" pod="openshift-marketplace/certified-operators-t59n7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t59n7\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.021560 4781 status_manager.go:851] "Failed to get status for pod" podUID="c4734fd7-42d6-4b87-9160-5a3471f91d03" pod="openshift-marketplace/redhat-marketplace-8pgk8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8pgk8\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.021799 4781 status_manager.go:851] "Failed to get status for pod" podUID="5e5610cf-1c3f-4010-a4ee-4820372400c1" pod="openshift-marketplace/redhat-marketplace-8d48v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8d48v\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.022027 4781 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.022252 4781 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.022528 4781 status_manager.go:851] "Failed to get status for pod" podUID="b28a19d5-e49e-46f0-942d-dc9f96777c2d" pod="openshift-marketplace/certified-operators-844r4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-844r4\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.022742 4781 status_manager.go:851] "Failed to get status for pod" podUID="5df92e64-6fc4-4395-a919-b1c46c30d318" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc 
kubenswrapper[4781]: I1202 09:25:18.022912 4781 status_manager.go:851] "Failed to get status for pod" podUID="b7602882-7ecf-4ce4-9cfa-3a691b3f9270" pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xqr9t\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.059419 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5df92e64-6fc4-4395-a919-b1c46c30d318-kubelet-dir\") pod \"5df92e64-6fc4-4395-a919-b1c46c30d318\" (UID: \"5df92e64-6fc4-4395-a919-b1c46c30d318\") " Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.059516 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5df92e64-6fc4-4395-a919-b1c46c30d318-var-lock\") pod \"5df92e64-6fc4-4395-a919-b1c46c30d318\" (UID: \"5df92e64-6fc4-4395-a919-b1c46c30d318\") " Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.059574 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5df92e64-6fc4-4395-a919-b1c46c30d318-var-lock" (OuterVolumeSpecName: "var-lock") pod "5df92e64-6fc4-4395-a919-b1c46c30d318" (UID: "5df92e64-6fc4-4395-a919-b1c46c30d318"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.059595 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.059572 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5df92e64-6fc4-4395-a919-b1c46c30d318-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5df92e64-6fc4-4395-a919-b1c46c30d318" (UID: "5df92e64-6fc4-4395-a919-b1c46c30d318"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.059636 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.059670 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.059683 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.059701 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5df92e64-6fc4-4395-a919-b1c46c30d318-kube-api-access\") pod \"5df92e64-6fc4-4395-a919-b1c46c30d318\" (UID: \"5df92e64-6fc4-4395-a919-b1c46c30d318\") " Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.059708 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.059730 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.059962 4781 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.059980 4781 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.059993 4781 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.060006 4781 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5df92e64-6fc4-4395-a919-b1c46c30d318-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.060018 4781 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5df92e64-6fc4-4395-a919-b1c46c30d318-var-lock\") on node \"crc\" DevicePath \"\"" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.064857 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5df92e64-6fc4-4395-a919-b1c46c30d318-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5df92e64-6fc4-4395-a919-b1c46c30d318" (UID: "5df92e64-6fc4-4395-a919-b1c46c30d318"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.160719 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5df92e64-6fc4-4395-a919-b1c46c30d318-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.772930 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5df92e64-6fc4-4395-a919-b1c46c30d318","Type":"ContainerDied","Data":"3853e7d7ea4f932f4c9b110c3f667818bacfafb4fe6738a0a513c17759aae332"} Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.772974 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.772979 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3853e7d7ea4f932f4c9b110c3f667818bacfafb4fe6738a0a513c17759aae332" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.779487 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8pgk8" event={"ID":"c4734fd7-42d6-4b87-9160-5a3471f91d03","Type":"ContainerStarted","Data":"1822aaa0b48e249d15a786a4f2d85c035ab1f28c209a58ddf4da57e8438e4e56"} Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.780158 4781 status_manager.go:851] "Failed to get status for pod" podUID="0a0d43df-4480-4b62-bd3d-d129fbdcd722" pod="openshift-marketplace/certified-operators-t59n7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t59n7\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.780454 4781 status_manager.go:851] "Failed to get status for pod" podUID="c4734fd7-42d6-4b87-9160-5a3471f91d03" pod="openshift-marketplace/redhat-marketplace-8pgk8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8pgk8\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.780702 4781 status_manager.go:851] "Failed to get status for pod" podUID="5e5610cf-1c3f-4010-a4ee-4820372400c1" pod="openshift-marketplace/redhat-marketplace-8d48v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8d48v\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.780900 4781 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.781131 4781 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.781280 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t59n7" 
event={"ID":"0a0d43df-4480-4b62-bd3d-d129fbdcd722","Type":"ContainerStarted","Data":"c067d66a79d99aa5bd42a28b115e25c993de6df62c915afd46daccbbdf6f2ad9"} Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.781354 4781 status_manager.go:851] "Failed to get status for pod" podUID="b28a19d5-e49e-46f0-942d-dc9f96777c2d" pod="openshift-marketplace/certified-operators-844r4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-844r4\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.781578 4781 status_manager.go:851] "Failed to get status for pod" podUID="5df92e64-6fc4-4395-a919-b1c46c30d318" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.781797 4781 status_manager.go:851] "Failed to get status for pod" podUID="b7602882-7ecf-4ce4-9cfa-3a691b3f9270" pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xqr9t\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.782065 4781 status_manager.go:851] "Failed to get status for pod" podUID="f37a1838-a130-4cb4-807f-2214cbfefdbf" pod="openshift-marketplace/redhat-operators-4rfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4rfsl\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.782366 4781 status_manager.go:851] "Failed to get status for pod" podUID="5df92e64-6fc4-4395-a919-b1c46c30d318" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.782587 4781 status_manager.go:851] "Failed to get status for pod" podUID="b7602882-7ecf-4ce4-9cfa-3a691b3f9270" pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xqr9t\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.782885 4781 status_manager.go:851] "Failed to get status for pod" podUID="f37a1838-a130-4cb4-807f-2214cbfefdbf" pod="openshift-marketplace/redhat-operators-4rfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4rfsl\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.783461 4781 status_manager.go:851] "Failed to get status for pod" podUID="0a0d43df-4480-4b62-bd3d-d129fbdcd722" pod="openshift-marketplace/certified-operators-t59n7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t59n7\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.783555 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4rfsl" 
event={"ID":"f37a1838-a130-4cb4-807f-2214cbfefdbf","Type":"ContainerStarted","Data":"364f1edb119e7c0e66eb6574c505409a5e4b5e9cd9107277ad4c026b1dcf7ac8"} Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.783766 4781 status_manager.go:851] "Failed to get status for pod" podUID="5e5610cf-1c3f-4010-a4ee-4820372400c1" pod="openshift-marketplace/redhat-marketplace-8d48v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8d48v\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.784200 4781 status_manager.go:851] "Failed to get status for pod" podUID="c4734fd7-42d6-4b87-9160-5a3471f91d03" pod="openshift-marketplace/redhat-marketplace-8pgk8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8pgk8\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.784456 4781 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.784666 4781 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.786138 4781 status_manager.go:851] "Failed to get status for pod" podUID="b28a19d5-e49e-46f0-942d-dc9f96777c2d" pod="openshift-marketplace/certified-operators-844r4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-844r4\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.786677 4781 status_manager.go:851] "Failed to get status for pod" podUID="0a0d43df-4480-4b62-bd3d-d129fbdcd722" pod="openshift-marketplace/certified-operators-t59n7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t59n7\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.786833 4781 status_manager.go:851] "Failed to get status for pod" podUID="c4734fd7-42d6-4b87-9160-5a3471f91d03" pod="openshift-marketplace/redhat-marketplace-8pgk8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8pgk8\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.787044 4781 status_manager.go:851] "Failed to get status for pod" podUID="5e5610cf-1c3f-4010-a4ee-4820372400c1" pod="openshift-marketplace/redhat-marketplace-8d48v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8d48v\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.787607 4781 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.787854 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.787861 4781 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.788157 4781 status_manager.go:851] "Failed to get status for pod" podUID="b28a19d5-e49e-46f0-942d-dc9f96777c2d" pod="openshift-marketplace/certified-operators-844r4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-844r4\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.788382 4781 status_manager.go:851] "Failed to get status for pod" podUID="b7602882-7ecf-4ce4-9cfa-3a691b3f9270" pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xqr9t\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.788511 4781 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="08c7d87bc390c31adc6d7412985f5bbfe535ea6dccb586f9be2823a17bce3e2d" exitCode=0 Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.788619 4781 status_manager.go:851] "Failed to get status for pod" podUID="5df92e64-6fc4-4395-a919-b1c46c30d318" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.788822 4781 status_manager.go:851] "Failed to get status for pod" podUID="f37a1838-a130-4cb4-807f-2214cbfefdbf" pod="openshift-marketplace/redhat-operators-4rfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4rfsl\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.789161 4781 scope.go:117] "RemoveContainer" containerID="5356a78651fc6cb4214e10b8b25fd20d7f6acb739abdcf2e1de6dd666a31edaa" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.789172 4781 status_manager.go:851] "Failed to get status for pod" podUID="b28a19d5-e49e-46f0-942d-dc9f96777c2d" pod="openshift-marketplace/certified-operators-844r4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-844r4\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.789169 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.789426 4781 status_manager.go:851] "Failed to get status for pod" podUID="5df92e64-6fc4-4395-a919-b1c46c30d318" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.789699 4781 status_manager.go:851] "Failed to get status for pod" podUID="b7602882-7ecf-4ce4-9cfa-3a691b3f9270" pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xqr9t\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.789857 4781 status_manager.go:851] "Failed to get status for pod" podUID="f37a1838-a130-4cb4-807f-2214cbfefdbf" pod="openshift-marketplace/redhat-operators-4rfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4rfsl\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.790130 4781 status_manager.go:851] "Failed to get status for pod" podUID="0a0d43df-4480-4b62-bd3d-d129fbdcd722" pod="openshift-marketplace/certified-operators-t59n7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t59n7\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.790346 4781 status_manager.go:851] "Failed to get status for pod" podUID="c4734fd7-42d6-4b87-9160-5a3471f91d03" pod="openshift-marketplace/redhat-marketplace-8pgk8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8pgk8\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.790534 4781 status_manager.go:851] "Failed to get status for pod" podUID="5e5610cf-1c3f-4010-a4ee-4820372400c1" pod="openshift-marketplace/redhat-marketplace-8d48v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8d48v\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.790750 4781 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.791001 4781 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.803807 4781 scope.go:117] "RemoveContainer" containerID="70a99d096eb7cf9386a857100ae0faff8a3b17140eaada581e5511a88aadc0f8" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.805134 4781 status_manager.go:851] "Failed to get status 
for pod" podUID="c4734fd7-42d6-4b87-9160-5a3471f91d03" pod="openshift-marketplace/redhat-marketplace-8pgk8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8pgk8\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.805340 4781 status_manager.go:851] "Failed to get status for pod" podUID="5e5610cf-1c3f-4010-a4ee-4820372400c1" pod="openshift-marketplace/redhat-marketplace-8d48v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8d48v\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.805593 4781 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.805842 4781 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.806030 4781 status_manager.go:851] "Failed to get status for pod" podUID="b28a19d5-e49e-46f0-942d-dc9f96777c2d" pod="openshift-marketplace/certified-operators-844r4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-844r4\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.806222 4781 status_manager.go:851] "Failed to get status for pod" podUID="5df92e64-6fc4-4395-a919-b1c46c30d318" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.806464 4781 status_manager.go:851] "Failed to get status for pod" podUID="b7602882-7ecf-4ce4-9cfa-3a691b3f9270" pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xqr9t\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.806776 4781 status_manager.go:851] "Failed to get status for pod" podUID="f37a1838-a130-4cb4-807f-2214cbfefdbf" pod="openshift-marketplace/redhat-operators-4rfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4rfsl\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.806970 4781 status_manager.go:851] "Failed to get status for pod" podUID="0a0d43df-4480-4b62-bd3d-d129fbdcd722" pod="openshift-marketplace/certified-operators-t59n7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t59n7\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.819129 4781 scope.go:117] 
"RemoveContainer" containerID="0090e6856a6d66e284f4f4eecb2d193720318f33a0260d470a50af72c84f6719" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.835271 4781 scope.go:117] "RemoveContainer" containerID="0179b61af209d72b1e1a79835f2819b3af773ab99b0c431733beba8a4dcf2eeb" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.847368 4781 scope.go:117] "RemoveContainer" containerID="08c7d87bc390c31adc6d7412985f5bbfe535ea6dccb586f9be2823a17bce3e2d" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.865241 4781 scope.go:117] "RemoveContainer" containerID="3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.908152 4781 scope.go:117] "RemoveContainer" containerID="5356a78651fc6cb4214e10b8b25fd20d7f6acb739abdcf2e1de6dd666a31edaa" Dec 02 09:25:18 crc kubenswrapper[4781]: E1202 09:25:18.909230 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5356a78651fc6cb4214e10b8b25fd20d7f6acb739abdcf2e1de6dd666a31edaa\": container with ID starting with 5356a78651fc6cb4214e10b8b25fd20d7f6acb739abdcf2e1de6dd666a31edaa not found: ID does not exist" containerID="5356a78651fc6cb4214e10b8b25fd20d7f6acb739abdcf2e1de6dd666a31edaa" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.909278 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5356a78651fc6cb4214e10b8b25fd20d7f6acb739abdcf2e1de6dd666a31edaa"} err="failed to get container status \"5356a78651fc6cb4214e10b8b25fd20d7f6acb739abdcf2e1de6dd666a31edaa\": rpc error: code = NotFound desc = could not find container \"5356a78651fc6cb4214e10b8b25fd20d7f6acb739abdcf2e1de6dd666a31edaa\": container with ID starting with 5356a78651fc6cb4214e10b8b25fd20d7f6acb739abdcf2e1de6dd666a31edaa not found: ID does not exist" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.909306 4781 scope.go:117] "RemoveContainer" containerID="70a99d096eb7cf9386a857100ae0faff8a3b17140eaada581e5511a88aadc0f8" Dec 02 09:25:18 crc kubenswrapper[4781]: E1202 09:25:18.910315 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70a99d096eb7cf9386a857100ae0faff8a3b17140eaada581e5511a88aadc0f8\": container with ID starting with 70a99d096eb7cf9386a857100ae0faff8a3b17140eaada581e5511a88aadc0f8 not found: ID does not exist" containerID="70a99d096eb7cf9386a857100ae0faff8a3b17140eaada581e5511a88aadc0f8" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.910358 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70a99d096eb7cf9386a857100ae0faff8a3b17140eaada581e5511a88aadc0f8"} err="failed to get container status \"70a99d096eb7cf9386a857100ae0faff8a3b17140eaada581e5511a88aadc0f8\": rpc error: code = NotFound desc = could not find container \"70a99d096eb7cf9386a857100ae0faff8a3b17140eaada581e5511a88aadc0f8\": container with ID starting with 70a99d096eb7cf9386a857100ae0faff8a3b17140eaada581e5511a88aadc0f8 not found: ID does not exist" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.910389 4781 scope.go:117] "RemoveContainer" containerID="0090e6856a6d66e284f4f4eecb2d193720318f33a0260d470a50af72c84f6719" Dec 02 09:25:18 crc kubenswrapper[4781]: E1202 09:25:18.910769 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0090e6856a6d66e284f4f4eecb2d193720318f33a0260d470a50af72c84f6719\": 
container with ID starting with 0090e6856a6d66e284f4f4eecb2d193720318f33a0260d470a50af72c84f6719 not found: ID does not exist" containerID="0090e6856a6d66e284f4f4eecb2d193720318f33a0260d470a50af72c84f6719" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.910808 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0090e6856a6d66e284f4f4eecb2d193720318f33a0260d470a50af72c84f6719"} err="failed to get container status \"0090e6856a6d66e284f4f4eecb2d193720318f33a0260d470a50af72c84f6719\": rpc error: code = NotFound desc = could not find container \"0090e6856a6d66e284f4f4eecb2d193720318f33a0260d470a50af72c84f6719\": container with ID starting with 0090e6856a6d66e284f4f4eecb2d193720318f33a0260d470a50af72c84f6719 not found: ID does not exist" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.910835 4781 scope.go:117] "RemoveContainer" containerID="0179b61af209d72b1e1a79835f2819b3af773ab99b0c431733beba8a4dcf2eeb" Dec 02 09:25:18 crc kubenswrapper[4781]: E1202 09:25:18.913628 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0179b61af209d72b1e1a79835f2819b3af773ab99b0c431733beba8a4dcf2eeb\": container with ID starting with 0179b61af209d72b1e1a79835f2819b3af773ab99b0c431733beba8a4dcf2eeb not found: ID does not exist" containerID="0179b61af209d72b1e1a79835f2819b3af773ab99b0c431733beba8a4dcf2eeb" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.913664 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0179b61af209d72b1e1a79835f2819b3af773ab99b0c431733beba8a4dcf2eeb"} err="failed to get container status \"0179b61af209d72b1e1a79835f2819b3af773ab99b0c431733beba8a4dcf2eeb\": rpc error: code = NotFound desc = could not find container \"0179b61af209d72b1e1a79835f2819b3af773ab99b0c431733beba8a4dcf2eeb\": container with ID starting with 0179b61af209d72b1e1a79835f2819b3af773ab99b0c431733beba8a4dcf2eeb not found: ID does not exist" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.913683 4781 scope.go:117] "RemoveContainer" containerID="08c7d87bc390c31adc6d7412985f5bbfe535ea6dccb586f9be2823a17bce3e2d" Dec 02 09:25:18 crc kubenswrapper[4781]: E1202 09:25:18.915163 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08c7d87bc390c31adc6d7412985f5bbfe535ea6dccb586f9be2823a17bce3e2d\": container with ID starting with 08c7d87bc390c31adc6d7412985f5bbfe535ea6dccb586f9be2823a17bce3e2d not found: ID does not exist" containerID="08c7d87bc390c31adc6d7412985f5bbfe535ea6dccb586f9be2823a17bce3e2d" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.915212 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08c7d87bc390c31adc6d7412985f5bbfe535ea6dccb586f9be2823a17bce3e2d"} err="failed to get container status \"08c7d87bc390c31adc6d7412985f5bbfe535ea6dccb586f9be2823a17bce3e2d\": rpc error: code = NotFound desc = could not find container \"08c7d87bc390c31adc6d7412985f5bbfe535ea6dccb586f9be2823a17bce3e2d\": container with ID starting with 08c7d87bc390c31adc6d7412985f5bbfe535ea6dccb586f9be2823a17bce3e2d not found: ID does not exist" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.915243 4781 scope.go:117] "RemoveContainer" containerID="3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34" Dec 02 09:25:18 crc kubenswrapper[4781]: E1202 09:25:18.915797 4781 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\": container with ID starting with 3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34 not found: ID does not exist" containerID="3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34" Dec 02 09:25:18 crc kubenswrapper[4781]: I1202 09:25:18.915851 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34"} err="failed to get container status \"3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\": rpc error: code = NotFound desc = could not find container \"3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34\": container with ID starting with 3f62de9d8a69a7c1a3024f32b3c940555bc4dcac007533d9b66a648ca86dbb34 not found: ID does not exist" Dec 02 09:25:19 crc kubenswrapper[4781]: I1202 09:25:19.511164 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 02 09:25:24 crc kubenswrapper[4781]: E1202 09:25:23.723558 4781 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:24 crc kubenswrapper[4781]: E1202 09:25:23.724271 4781 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:24 crc kubenswrapper[4781]: E1202 09:25:23.724909 4781 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:24 crc kubenswrapper[4781]: E1202 09:25:23.725586 4781 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:24 crc kubenswrapper[4781]: E1202 09:25:23.726004 4781 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:24 crc kubenswrapper[4781]: I1202 09:25:23.726035 4781 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 02 09:25:24 crc kubenswrapper[4781]: E1202 09:25:23.726254 4781 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="200ms" Dec 02 09:25:24 crc kubenswrapper[4781]: E1202 09:25:23.927443 4781 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="400ms" Dec 02 09:25:24 crc 
kubenswrapper[4781]: E1202 09:25:24.329690 4781 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="800ms" Dec 02 09:25:25 crc kubenswrapper[4781]: I1202 09:25:25.006824 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-844r4" Dec 02 09:25:25 crc kubenswrapper[4781]: I1202 09:25:25.007156 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-844r4" Dec 02 09:25:25 crc kubenswrapper[4781]: I1202 09:25:25.033787 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-t59n7" Dec 02 09:25:25 crc kubenswrapper[4781]: I1202 09:25:25.033829 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-t59n7" Dec 02 09:25:25 crc kubenswrapper[4781]: I1202 09:25:25.045683 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-844r4" Dec 02 09:25:25 crc kubenswrapper[4781]: I1202 09:25:25.046273 4781 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:25 crc kubenswrapper[4781]: I1202 09:25:25.046469 4781 status_manager.go:851] "Failed to get status for pod" podUID="b28a19d5-e49e-46f0-942d-dc9f96777c2d" pod="openshift-marketplace/certified-operators-844r4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-844r4\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:25 crc kubenswrapper[4781]: I1202 09:25:25.046620 4781 status_manager.go:851] "Failed to get status for pod" podUID="5df92e64-6fc4-4395-a919-b1c46c30d318" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:25 crc kubenswrapper[4781]: I1202 09:25:25.046763 4781 status_manager.go:851] "Failed to get status for pod" podUID="b7602882-7ecf-4ce4-9cfa-3a691b3f9270" pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xqr9t\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:25 crc kubenswrapper[4781]: I1202 09:25:25.046903 4781 status_manager.go:851] "Failed to get status for pod" podUID="f37a1838-a130-4cb4-807f-2214cbfefdbf" pod="openshift-marketplace/redhat-operators-4rfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4rfsl\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:25 crc kubenswrapper[4781]: I1202 09:25:25.047361 4781 status_manager.go:851] "Failed to get status for pod" podUID="0a0d43df-4480-4b62-bd3d-d129fbdcd722" pod="openshift-marketplace/certified-operators-t59n7" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t59n7\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:25 crc kubenswrapper[4781]: I1202 09:25:25.047764 4781 status_manager.go:851] "Failed to get status for pod" podUID="c4734fd7-42d6-4b87-9160-5a3471f91d03" pod="openshift-marketplace/redhat-marketplace-8pgk8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8pgk8\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:25 crc kubenswrapper[4781]: I1202 09:25:25.047981 4781 status_manager.go:851] "Failed to get status for pod" podUID="5e5610cf-1c3f-4010-a4ee-4820372400c1" pod="openshift-marketplace/redhat-marketplace-8d48v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8d48v\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:25 crc kubenswrapper[4781]: I1202 09:25:25.074618 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-t59n7" Dec 02 09:25:25 crc kubenswrapper[4781]: I1202 09:25:25.075177 4781 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:25 crc kubenswrapper[4781]: I1202 09:25:25.075561 4781 status_manager.go:851] "Failed to get status for pod" podUID="b28a19d5-e49e-46f0-942d-dc9f96777c2d" pod="openshift-marketplace/certified-operators-844r4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-844r4\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:25 crc kubenswrapper[4781]: I1202 09:25:25.075791 4781 status_manager.go:851] "Failed to get status for pod" podUID="5df92e64-6fc4-4395-a919-b1c46c30d318" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:25 crc kubenswrapper[4781]: I1202 09:25:25.076061 4781 status_manager.go:851] "Failed to get status for pod" podUID="b7602882-7ecf-4ce4-9cfa-3a691b3f9270" pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xqr9t\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:25 crc kubenswrapper[4781]: I1202 09:25:25.076281 4781 status_manager.go:851] "Failed to get status for pod" podUID="f37a1838-a130-4cb4-807f-2214cbfefdbf" pod="openshift-marketplace/redhat-operators-4rfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4rfsl\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:25 crc kubenswrapper[4781]: I1202 09:25:25.076618 4781 status_manager.go:851] "Failed to get status for pod" podUID="0a0d43df-4480-4b62-bd3d-d129fbdcd722" pod="openshift-marketplace/certified-operators-t59n7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t59n7\": dial tcp 38.102.83.194:6443: connect: connection 
refused" Dec 02 09:25:25 crc kubenswrapper[4781]: I1202 09:25:25.076812 4781 status_manager.go:851] "Failed to get status for pod" podUID="c4734fd7-42d6-4b87-9160-5a3471f91d03" pod="openshift-marketplace/redhat-marketplace-8pgk8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8pgk8\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:25 crc kubenswrapper[4781]: I1202 09:25:25.078377 4781 status_manager.go:851] "Failed to get status for pod" podUID="5e5610cf-1c3f-4010-a4ee-4820372400c1" pod="openshift-marketplace/redhat-marketplace-8d48v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8d48v\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:25 crc kubenswrapper[4781]: E1202 09:25:25.130758 4781 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="1.6s" Dec 02 09:25:25 crc kubenswrapper[4781]: E1202 09:25:25.202472 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:25:25Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:25:25Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:25:25Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T09:25:25Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:199fc40b10ddf12906810b78738c7baeb4fa3fe1305e603b2b85148deb79be5f\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:bc4faffed52e146b74d33262b7def61a5bca011efe5d9d0a77124902092bafca\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1607793480},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:61d26ea7a26593e06b479225c8fc57ff8d82b6d27076fb80529751fe70b54a64\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:bb24465fd94d9154042e9a420d8ddc1a6ec16f1e59158f9b8c99df7f98dd3029\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1203933014},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:4e5ccf665445579fd47e4fd366d96171d7423e5843d80f82b5bef20ff3d786e0\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:bc8bff7f691d6f08178b6ec9
28e2e47d115faf7669f7721682b6ce7f7c7267bd\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1201255754},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:e8990432556acad31519b1a73ec32f32d27c2034cf9e5cc4db8980efc7331594\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:ebe9f523f5c211a3a0f2570331dddcd5be15b12c1fecd9b8b121f881bfaad029\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1129027903},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b
162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c4
2c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:25 crc kubenswrapper[4781]: E1202 09:25:25.202969 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:25 crc kubenswrapper[4781]: E1202 09:25:25.203337 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:25 crc kubenswrapper[4781]: E1202 09:25:25.203690 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:25 crc kubenswrapper[4781]: E1202 09:25:25.203951 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:25 crc kubenswrapper[4781]: E1202 09:25:25.203975 4781 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 09:25:25 crc kubenswrapper[4781]: I1202 09:25:25.859671 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-844r4" Dec 02 09:25:25 crc kubenswrapper[4781]: I1202 09:25:25.860377 4781 status_manager.go:851] "Failed to get status for pod" podUID="b28a19d5-e49e-46f0-942d-dc9f96777c2d" pod="openshift-marketplace/certified-operators-844r4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-844r4\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:25 crc kubenswrapper[4781]: I1202 09:25:25.860882 4781 status_manager.go:851] "Failed to get status for pod" podUID="5df92e64-6fc4-4395-a919-b1c46c30d318" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:25 crc kubenswrapper[4781]: I1202 09:25:25.861283 4781 status_manager.go:851] "Failed to get status for pod" 
podUID="b7602882-7ecf-4ce4-9cfa-3a691b3f9270" pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xqr9t\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:25 crc kubenswrapper[4781]: I1202 09:25:25.861578 4781 status_manager.go:851] "Failed to get status for pod" podUID="f37a1838-a130-4cb4-807f-2214cbfefdbf" pod="openshift-marketplace/redhat-operators-4rfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4rfsl\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:25 crc kubenswrapper[4781]: I1202 09:25:25.861839 4781 status_manager.go:851] "Failed to get status for pod" podUID="0a0d43df-4480-4b62-bd3d-d129fbdcd722" pod="openshift-marketplace/certified-operators-t59n7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t59n7\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:25 crc kubenswrapper[4781]: I1202 09:25:25.862204 4781 status_manager.go:851] "Failed to get status for pod" podUID="c4734fd7-42d6-4b87-9160-5a3471f91d03" pod="openshift-marketplace/redhat-marketplace-8pgk8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8pgk8\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:25 crc kubenswrapper[4781]: I1202 09:25:25.862500 4781 status_manager.go:851] "Failed to get status for pod" podUID="5e5610cf-1c3f-4010-a4ee-4820372400c1" pod="openshift-marketplace/redhat-marketplace-8d48v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8d48v\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:25 crc kubenswrapper[4781]: I1202 09:25:25.862768 4781 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:25 crc kubenswrapper[4781]: I1202 09:25:25.866097 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-t59n7" Dec 02 09:25:25 crc kubenswrapper[4781]: I1202 09:25:25.866527 4781 status_manager.go:851] "Failed to get status for pod" podUID="0a0d43df-4480-4b62-bd3d-d129fbdcd722" pod="openshift-marketplace/certified-operators-t59n7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t59n7\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:25 crc kubenswrapper[4781]: I1202 09:25:25.866766 4781 status_manager.go:851] "Failed to get status for pod" podUID="c4734fd7-42d6-4b87-9160-5a3471f91d03" pod="openshift-marketplace/redhat-marketplace-8pgk8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8pgk8\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:25 crc kubenswrapper[4781]: I1202 09:25:25.867035 4781 status_manager.go:851] "Failed to get status for pod" podUID="5e5610cf-1c3f-4010-a4ee-4820372400c1" pod="openshift-marketplace/redhat-marketplace-8d48v" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8d48v\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:25 crc kubenswrapper[4781]: I1202 09:25:25.867285 4781 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:25 crc kubenswrapper[4781]: I1202 09:25:25.867520 4781 status_manager.go:851] "Failed to get status for pod" podUID="b28a19d5-e49e-46f0-942d-dc9f96777c2d" pod="openshift-marketplace/certified-operators-844r4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-844r4\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:25 crc kubenswrapper[4781]: I1202 09:25:25.867785 4781 status_manager.go:851] "Failed to get status for pod" podUID="b7602882-7ecf-4ce4-9cfa-3a691b3f9270" pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xqr9t\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:25 crc kubenswrapper[4781]: I1202 09:25:25.868109 4781 status_manager.go:851] "Failed to get status for pod" podUID="5df92e64-6fc4-4395-a919-b1c46c30d318" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:25 crc kubenswrapper[4781]: I1202 09:25:25.868331 4781 status_manager.go:851] "Failed to get status for pod" podUID="f37a1838-a130-4cb4-807f-2214cbfefdbf" pod="openshift-marketplace/redhat-operators-4rfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4rfsl\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:26 crc kubenswrapper[4781]: I1202 09:25:26.365051 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8pgk8" Dec 02 09:25:26 crc kubenswrapper[4781]: I1202 09:25:26.365099 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8pgk8" Dec 02 09:25:26 crc kubenswrapper[4781]: I1202 09:25:26.412659 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8pgk8" Dec 02 09:25:26 crc kubenswrapper[4781]: I1202 09:25:26.413083 4781 status_manager.go:851] "Failed to get status for pod" podUID="b28a19d5-e49e-46f0-942d-dc9f96777c2d" pod="openshift-marketplace/certified-operators-844r4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-844r4\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:26 crc kubenswrapper[4781]: I1202 09:25:26.413309 4781 status_manager.go:851] "Failed to get status for pod" podUID="5df92e64-6fc4-4395-a919-b1c46c30d318" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:26 crc 
kubenswrapper[4781]: I1202 09:25:26.413515 4781 status_manager.go:851] "Failed to get status for pod" podUID="b7602882-7ecf-4ce4-9cfa-3a691b3f9270" pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xqr9t\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:26 crc kubenswrapper[4781]: I1202 09:25:26.413736 4781 status_manager.go:851] "Failed to get status for pod" podUID="f37a1838-a130-4cb4-807f-2214cbfefdbf" pod="openshift-marketplace/redhat-operators-4rfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4rfsl\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:26 crc kubenswrapper[4781]: I1202 09:25:26.413967 4781 status_manager.go:851] "Failed to get status for pod" podUID="0a0d43df-4480-4b62-bd3d-d129fbdcd722" pod="openshift-marketplace/certified-operators-t59n7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t59n7\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:26 crc kubenswrapper[4781]: I1202 09:25:26.414160 4781 status_manager.go:851] "Failed to get status for pod" podUID="5e5610cf-1c3f-4010-a4ee-4820372400c1" pod="openshift-marketplace/redhat-marketplace-8d48v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8d48v\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:26 crc kubenswrapper[4781]: I1202 09:25:26.414352 4781 status_manager.go:851] "Failed to get status for pod" podUID="c4734fd7-42d6-4b87-9160-5a3471f91d03" pod="openshift-marketplace/redhat-marketplace-8pgk8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8pgk8\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:26 crc kubenswrapper[4781]: I1202 09:25:26.414557 4781 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:26 crc kubenswrapper[4781]: I1202 09:25:26.499358 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 09:25:26 crc kubenswrapper[4781]: I1202 09:25:26.500026 4781 status_manager.go:851] "Failed to get status for pod" podUID="c4734fd7-42d6-4b87-9160-5a3471f91d03" pod="openshift-marketplace/redhat-marketplace-8pgk8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8pgk8\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:26 crc kubenswrapper[4781]: I1202 09:25:26.500359 4781 status_manager.go:851] "Failed to get status for pod" podUID="5e5610cf-1c3f-4010-a4ee-4820372400c1" pod="openshift-marketplace/redhat-marketplace-8d48v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8d48v\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:26 crc kubenswrapper[4781]: I1202 09:25:26.500600 4781 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:26 crc kubenswrapper[4781]: I1202 09:25:26.500862 4781 status_manager.go:851] "Failed to get status for pod" podUID="b28a19d5-e49e-46f0-942d-dc9f96777c2d" pod="openshift-marketplace/certified-operators-844r4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-844r4\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:26 crc kubenswrapper[4781]: I1202 09:25:26.501154 4781 status_manager.go:851] "Failed to get status for pod" podUID="b7602882-7ecf-4ce4-9cfa-3a691b3f9270" pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xqr9t\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:26 crc kubenswrapper[4781]: I1202 09:25:26.501431 4781 status_manager.go:851] "Failed to get status for pod" podUID="5df92e64-6fc4-4395-a919-b1c46c30d318" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:26 crc kubenswrapper[4781]: I1202 09:25:26.501699 4781 status_manager.go:851] "Failed to get status for pod" podUID="f37a1838-a130-4cb4-807f-2214cbfefdbf" pod="openshift-marketplace/redhat-operators-4rfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4rfsl\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:26 crc kubenswrapper[4781]: I1202 09:25:26.501913 4781 status_manager.go:851] "Failed to get status for pod" podUID="0a0d43df-4480-4b62-bd3d-d129fbdcd722" pod="openshift-marketplace/certified-operators-t59n7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t59n7\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:26 crc kubenswrapper[4781]: I1202 09:25:26.511816 4781 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ba607959-82d4-440b-b830-95eb9584db6f" Dec 02 09:25:26 crc kubenswrapper[4781]: I1202 09:25:26.511838 4781 
mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ba607959-82d4-440b-b830-95eb9584db6f" Dec 02 09:25:26 crc kubenswrapper[4781]: E1202 09:25:26.512128 4781 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 09:25:26 crc kubenswrapper[4781]: I1202 09:25:26.512625 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 09:25:26 crc kubenswrapper[4781]: W1202 09:25:26.533145 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-4ba74a69ea9529bd88d83dbf474b0d144bda450b239674571df54456d4e5a1d5 WatchSource:0}: Error finding container 4ba74a69ea9529bd88d83dbf474b0d144bda450b239674571df54456d4e5a1d5: Status 404 returned error can't find the container with id 4ba74a69ea9529bd88d83dbf474b0d144bda450b239674571df54456d4e5a1d5 Dec 02 09:25:26 crc kubenswrapper[4781]: E1202 09:25:26.732424 4781 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="3.2s" Dec 02 09:25:26 crc kubenswrapper[4781]: I1202 09:25:26.831208 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4ba74a69ea9529bd88d83dbf474b0d144bda450b239674571df54456d4e5a1d5"} Dec 02 09:25:26 crc kubenswrapper[4781]: I1202 09:25:26.890706 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8pgk8" Dec 02 09:25:26 crc kubenswrapper[4781]: I1202 09:25:26.891481 4781 status_manager.go:851] "Failed to get status for pod" podUID="b28a19d5-e49e-46f0-942d-dc9f96777c2d" pod="openshift-marketplace/certified-operators-844r4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-844r4\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:26 crc kubenswrapper[4781]: I1202 09:25:26.891890 4781 status_manager.go:851] "Failed to get status for pod" podUID="5df92e64-6fc4-4395-a919-b1c46c30d318" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:26 crc kubenswrapper[4781]: I1202 09:25:26.892179 4781 status_manager.go:851] "Failed to get status for pod" podUID="b7602882-7ecf-4ce4-9cfa-3a691b3f9270" pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xqr9t\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:26 crc kubenswrapper[4781]: I1202 09:25:26.892488 4781 status_manager.go:851] "Failed to get status for pod" podUID="f37a1838-a130-4cb4-807f-2214cbfefdbf" pod="openshift-marketplace/redhat-operators-4rfsl" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4rfsl\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:26 crc kubenswrapper[4781]: I1202 09:25:26.892976 4781 status_manager.go:851] "Failed to get status for pod" podUID="0a0d43df-4480-4b62-bd3d-d129fbdcd722" pod="openshift-marketplace/certified-operators-t59n7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t59n7\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:26 crc kubenswrapper[4781]: I1202 09:25:26.893261 4781 status_manager.go:851] "Failed to get status for pod" podUID="c4734fd7-42d6-4b87-9160-5a3471f91d03" pod="openshift-marketplace/redhat-marketplace-8pgk8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8pgk8\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:26 crc kubenswrapper[4781]: I1202 09:25:26.893554 4781 status_manager.go:851] "Failed to get status for pod" podUID="5e5610cf-1c3f-4010-a4ee-4820372400c1" pod="openshift-marketplace/redhat-marketplace-8d48v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8d48v\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:26 crc kubenswrapper[4781]: I1202 09:25:26.893898 4781 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:27 crc kubenswrapper[4781]: E1202 09:25:27.184755 4781 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.194:6443: connect: connection refused" event="&Event{ObjectMeta:{redhat-marketplace-8pgk8.187d5bc1aa064974 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-marketplace-8pgk8,UID:c4734fd7-42d6-4b87-9160-5a3471f91d03,APIVersion:v1,ResourceVersion:28288,FieldPath:spec.initContainers{extract-content},},Reason:Pulled,Message:Successfully pulled image \"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\" in 25.829s (25.829s including waiting). Image size: 1129027903 bytes.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-02 09:25:16.338456948 +0000 UTC m=+279.162330817,LastTimestamp:2025-12-02 09:25:16.338456948 +0000 UTC m=+279.162330817,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 02 09:25:27 crc kubenswrapper[4781]: I1202 09:25:27.498914 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:27 crc kubenswrapper[4781]: I1202 09:25:27.514451 4781 status_manager.go:851] "Failed to get status for pod" podUID="b28a19d5-e49e-46f0-942d-dc9f96777c2d" pod="openshift-marketplace/certified-operators-844r4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-844r4\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:27 crc kubenswrapper[4781]: I1202 09:25:27.514667 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:27 crc kubenswrapper[4781]: I1202 09:25:27.514998 4781 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:27 crc kubenswrapper[4781]: I1202 09:25:27.515506 4781 status_manager.go:851] "Failed to get status for pod" podUID="5df92e64-6fc4-4395-a919-b1c46c30d318" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:27 crc kubenswrapper[4781]: I1202 09:25:27.516094 4781 status_manager.go:851] "Failed to get status for pod" podUID="b7602882-7ecf-4ce4-9cfa-3a691b3f9270" pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xqr9t\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:27 crc kubenswrapper[4781]: I1202 09:25:27.516541 4781 status_manager.go:851] "Failed to get status for pod" podUID="f37a1838-a130-4cb4-807f-2214cbfefdbf" pod="openshift-marketplace/redhat-operators-4rfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4rfsl\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:27 crc kubenswrapper[4781]: I1202 09:25:27.516973 4781 status_manager.go:851] "Failed to get status for pod" podUID="0a0d43df-4480-4b62-bd3d-d129fbdcd722" pod="openshift-marketplace/certified-operators-t59n7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t59n7\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:27 crc kubenswrapper[4781]: I1202 09:25:27.517391 4781 status_manager.go:851] "Failed to get status for pod" podUID="5e5610cf-1c3f-4010-a4ee-4820372400c1" pod="openshift-marketplace/redhat-marketplace-8d48v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8d48v\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:27 crc kubenswrapper[4781]: I1202 09:25:27.517850 4781 status_manager.go:851] "Failed to get status for pod" podUID="c4734fd7-42d6-4b87-9160-5a3471f91d03" pod="openshift-marketplace/redhat-marketplace-8pgk8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8pgk8\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:27 crc kubenswrapper[4781]: I1202 09:25:27.518382 4781 status_manager.go:851] 
"Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:27 crc kubenswrapper[4781]: I1202 09:25:27.792673 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4rfsl" Dec 02 09:25:27 crc kubenswrapper[4781]: I1202 09:25:27.793223 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4rfsl" Dec 02 09:25:27 crc kubenswrapper[4781]: I1202 09:25:27.843650 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4rfsl" Dec 02 09:25:27 crc kubenswrapper[4781]: I1202 09:25:27.844744 4781 status_manager.go:851] "Failed to get status for pod" podUID="0a0d43df-4480-4b62-bd3d-d129fbdcd722" pod="openshift-marketplace/certified-operators-t59n7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t59n7\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:27 crc kubenswrapper[4781]: I1202 09:25:27.845140 4781 status_manager.go:851] "Failed to get status for pod" podUID="c4734fd7-42d6-4b87-9160-5a3471f91d03" pod="openshift-marketplace/redhat-marketplace-8pgk8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8pgk8\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:27 crc kubenswrapper[4781]: I1202 09:25:27.845632 4781 status_manager.go:851] "Failed to get status for pod" podUID="5e5610cf-1c3f-4010-a4ee-4820372400c1" pod="openshift-marketplace/redhat-marketplace-8d48v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8d48v\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:27 crc kubenswrapper[4781]: I1202 09:25:27.845899 4781 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:27 crc kubenswrapper[4781]: I1202 09:25:27.846225 4781 status_manager.go:851] "Failed to get status for pod" podUID="b28a19d5-e49e-46f0-942d-dc9f96777c2d" pod="openshift-marketplace/certified-operators-844r4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-844r4\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:27 crc kubenswrapper[4781]: I1202 09:25:27.846731 4781 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:27 crc kubenswrapper[4781]: I1202 09:25:27.847185 4781 status_manager.go:851] "Failed to get status for pod" podUID="b7602882-7ecf-4ce4-9cfa-3a691b3f9270" pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xqr9t\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:27 crc kubenswrapper[4781]: I1202 09:25:27.847530 4781 status_manager.go:851] "Failed to get status for pod" podUID="5df92e64-6fc4-4395-a919-b1c46c30d318" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:27 crc kubenswrapper[4781]: I1202 09:25:27.847870 4781 status_manager.go:851] "Failed to get status for pod" podUID="f37a1838-a130-4cb4-807f-2214cbfefdbf" pod="openshift-marketplace/redhat-operators-4rfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4rfsl\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:27 crc kubenswrapper[4781]: I1202 09:25:27.891612 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4rfsl" Dec 02 09:25:27 crc kubenswrapper[4781]: I1202 09:25:27.892054 4781 status_manager.go:851] "Failed to get status for pod" podUID="0a0d43df-4480-4b62-bd3d-d129fbdcd722" pod="openshift-marketplace/certified-operators-t59n7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t59n7\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:27 crc kubenswrapper[4781]: I1202 09:25:27.892333 4781 status_manager.go:851] "Failed to get status for pod" podUID="c4734fd7-42d6-4b87-9160-5a3471f91d03" pod="openshift-marketplace/redhat-marketplace-8pgk8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8pgk8\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:27 crc kubenswrapper[4781]: I1202 09:25:27.892555 4781 status_manager.go:851] "Failed to get status for pod" podUID="5e5610cf-1c3f-4010-a4ee-4820372400c1" pod="openshift-marketplace/redhat-marketplace-8d48v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8d48v\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:27 crc kubenswrapper[4781]: I1202 09:25:27.892715 4781 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:27 crc kubenswrapper[4781]: I1202 09:25:27.892869 4781 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:27 crc kubenswrapper[4781]: I1202 09:25:27.893083 4781 status_manager.go:851] "Failed to get status for pod" podUID="b28a19d5-e49e-46f0-942d-dc9f96777c2d" pod="openshift-marketplace/certified-operators-844r4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-844r4\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:27 crc 
kubenswrapper[4781]: I1202 09:25:27.893261 4781 status_manager.go:851] "Failed to get status for pod" podUID="5df92e64-6fc4-4395-a919-b1c46c30d318" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:27 crc kubenswrapper[4781]: I1202 09:25:27.893446 4781 status_manager.go:851] "Failed to get status for pod" podUID="b7602882-7ecf-4ce4-9cfa-3a691b3f9270" pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xqr9t\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:27 crc kubenswrapper[4781]: I1202 09:25:27.893617 4781 status_manager.go:851] "Failed to get status for pod" podUID="f37a1838-a130-4cb4-807f-2214cbfefdbf" pod="openshift-marketplace/redhat-operators-4rfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4rfsl\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:29 crc kubenswrapper[4781]: E1202 09:25:29.033876 4781 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 02 09:25:29 crc kubenswrapper[4781]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-86f4ddc759-7xxp5_openshift-authentication_7b66cd08-6bc4-48ce-9002-eb87e6ae27eb_0(3df7add7c86f1e82a5f73c115f085dd32d258be3d8070399ccdb649e0f691971): error adding pod openshift-authentication_oauth-openshift-86f4ddc759-7xxp5 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"3df7add7c86f1e82a5f73c115f085dd32d258be3d8070399ccdb649e0f691971" Netns:"/var/run/netns/1aa5ceee-25bc-4727-bcae-c02adad79d22" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-86f4ddc759-7xxp5;K8S_POD_INFRA_CONTAINER_ID=3df7add7c86f1e82a5f73c115f085dd32d258be3d8070399ccdb649e0f691971;K8S_POD_UID=7b66cd08-6bc4-48ce-9002-eb87e6ae27eb" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-86f4ddc759-7xxp5] networking: Multus: [openshift-authentication/oauth-openshift-86f4ddc759-7xxp5/7b66cd08-6bc4-48ce-9002-eb87e6ae27eb]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-86f4ddc759-7xxp5 in out of cluster comm: SetNetworkStatus: failed to update the pod oauth-openshift-86f4ddc759-7xxp5 in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-86f4ddc759-7xxp5?timeout=1m0s": dial tcp 38.102.83.194:6443: connect: connection refused Dec 02 09:25:29 crc kubenswrapper[4781]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 02 09:25:29 crc kubenswrapper[4781]: > Dec 02 09:25:29 crc kubenswrapper[4781]: E1202 09:25:29.034416 4781 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 02 09:25:29 crc kubenswrapper[4781]: rpc 
error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-86f4ddc759-7xxp5_openshift-authentication_7b66cd08-6bc4-48ce-9002-eb87e6ae27eb_0(3df7add7c86f1e82a5f73c115f085dd32d258be3d8070399ccdb649e0f691971): error adding pod openshift-authentication_oauth-openshift-86f4ddc759-7xxp5 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"3df7add7c86f1e82a5f73c115f085dd32d258be3d8070399ccdb649e0f691971" Netns:"/var/run/netns/1aa5ceee-25bc-4727-bcae-c02adad79d22" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-86f4ddc759-7xxp5;K8S_POD_INFRA_CONTAINER_ID=3df7add7c86f1e82a5f73c115f085dd32d258be3d8070399ccdb649e0f691971;K8S_POD_UID=7b66cd08-6bc4-48ce-9002-eb87e6ae27eb" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-86f4ddc759-7xxp5] networking: Multus: [openshift-authentication/oauth-openshift-86f4ddc759-7xxp5/7b66cd08-6bc4-48ce-9002-eb87e6ae27eb]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-86f4ddc759-7xxp5 in out of cluster comm: SetNetworkStatus: failed to update the pod oauth-openshift-86f4ddc759-7xxp5 in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-86f4ddc759-7xxp5?timeout=1m0s": dial tcp 38.102.83.194:6443: connect: connection refused Dec 02 09:25:29 crc kubenswrapper[4781]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 02 09:25:29 crc kubenswrapper[4781]: > pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:29 crc kubenswrapper[4781]: E1202 09:25:29.034454 4781 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Dec 02 09:25:29 crc kubenswrapper[4781]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-86f4ddc759-7xxp5_openshift-authentication_7b66cd08-6bc4-48ce-9002-eb87e6ae27eb_0(3df7add7c86f1e82a5f73c115f085dd32d258be3d8070399ccdb649e0f691971): error adding pod openshift-authentication_oauth-openshift-86f4ddc759-7xxp5 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"3df7add7c86f1e82a5f73c115f085dd32d258be3d8070399ccdb649e0f691971" Netns:"/var/run/netns/1aa5ceee-25bc-4727-bcae-c02adad79d22" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-86f4ddc759-7xxp5;K8S_POD_INFRA_CONTAINER_ID=3df7add7c86f1e82a5f73c115f085dd32d258be3d8070399ccdb649e0f691971;K8S_POD_UID=7b66cd08-6bc4-48ce-9002-eb87e6ae27eb" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-86f4ddc759-7xxp5] networking: Multus: [openshift-authentication/oauth-openshift-86f4ddc759-7xxp5/7b66cd08-6bc4-48ce-9002-eb87e6ae27eb]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-86f4ddc759-7xxp5 in out of cluster comm: SetNetworkStatus: failed to update the pod 
oauth-openshift-86f4ddc759-7xxp5 in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-86f4ddc759-7xxp5?timeout=1m0s": dial tcp 38.102.83.194:6443: connect: connection refused Dec 02 09:25:29 crc kubenswrapper[4781]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 02 09:25:29 crc kubenswrapper[4781]: > pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:29 crc kubenswrapper[4781]: E1202 09:25:29.034551 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"oauth-openshift-86f4ddc759-7xxp5_openshift-authentication(7b66cd08-6bc4-48ce-9002-eb87e6ae27eb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"oauth-openshift-86f4ddc759-7xxp5_openshift-authentication(7b66cd08-6bc4-48ce-9002-eb87e6ae27eb)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-86f4ddc759-7xxp5_openshift-authentication_7b66cd08-6bc4-48ce-9002-eb87e6ae27eb_0(3df7add7c86f1e82a5f73c115f085dd32d258be3d8070399ccdb649e0f691971): error adding pod openshift-authentication_oauth-openshift-86f4ddc759-7xxp5 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"3df7add7c86f1e82a5f73c115f085dd32d258be3d8070399ccdb649e0f691971\\\" Netns:\\\"/var/run/netns/1aa5ceee-25bc-4727-bcae-c02adad79d22\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-86f4ddc759-7xxp5;K8S_POD_INFRA_CONTAINER_ID=3df7add7c86f1e82a5f73c115f085dd32d258be3d8070399ccdb649e0f691971;K8S_POD_UID=7b66cd08-6bc4-48ce-9002-eb87e6ae27eb\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-86f4ddc759-7xxp5] networking: Multus: [openshift-authentication/oauth-openshift-86f4ddc759-7xxp5/7b66cd08-6bc4-48ce-9002-eb87e6ae27eb]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-86f4ddc759-7xxp5 in out of cluster comm: SetNetworkStatus: failed to update the pod oauth-openshift-86f4ddc759-7xxp5 in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-86f4ddc759-7xxp5?timeout=1m0s\\\": dial tcp 38.102.83.194:6443: connect: connection refused\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" podUID="7b66cd08-6bc4-48ce-9002-eb87e6ae27eb" Dec 02 09:25:29 crc kubenswrapper[4781]: I1202 09:25:29.160678 4781 patch_prober.go:28] 
interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 02 09:25:29 crc kubenswrapper[4781]: I1202 09:25:29.160751 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 02 09:25:29 crc kubenswrapper[4781]: E1202 09:25:29.934113 4781 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="6.4s" Dec 02 09:25:31 crc kubenswrapper[4781]: I1202 09:25:31.058159 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d11453ae50b7856124894ed02e2a395d796321e91d24f505e7f5284c16136fcb"} Dec 02 09:25:32 crc kubenswrapper[4781]: I1202 09:25:32.065441 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 02 09:25:32 crc kubenswrapper[4781]: I1202 09:25:32.065578 4781 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="99c09074e0829703890129b7052956f56e1a93cad8d1bcf9e480b3a7277ae930" exitCode=1 Dec 02 09:25:32 crc kubenswrapper[4781]: I1202 09:25:32.065807 4781 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ba607959-82d4-440b-b830-95eb9584db6f" Dec 02 09:25:32 crc kubenswrapper[4781]: I1202 09:25:32.065819 4781 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ba607959-82d4-440b-b830-95eb9584db6f" Dec 02 09:25:32 crc kubenswrapper[4781]: I1202 09:25:32.066178 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"99c09074e0829703890129b7052956f56e1a93cad8d1bcf9e480b3a7277ae930"} Dec 02 09:25:32 crc kubenswrapper[4781]: I1202 09:25:32.066637 4781 scope.go:117] "RemoveContainer" containerID="99c09074e0829703890129b7052956f56e1a93cad8d1bcf9e480b3a7277ae930" Dec 02 09:25:32 crc kubenswrapper[4781]: E1202 09:25:32.066898 4781 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 09:25:32 crc kubenswrapper[4781]: I1202 09:25:32.067875 4781 status_manager.go:851] "Failed to get status for pod" podUID="f37a1838-a130-4cb4-807f-2214cbfefdbf" pod="openshift-marketplace/redhat-operators-4rfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4rfsl\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:32 crc kubenswrapper[4781]: I1202 
09:25:32.068315 4781 status_manager.go:851] "Failed to get status for pod" podUID="0a0d43df-4480-4b62-bd3d-d129fbdcd722" pod="openshift-marketplace/certified-operators-t59n7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t59n7\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:32 crc kubenswrapper[4781]: I1202 09:25:32.068572 4781 status_manager.go:851] "Failed to get status for pod" podUID="c4734fd7-42d6-4b87-9160-5a3471f91d03" pod="openshift-marketplace/redhat-marketplace-8pgk8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8pgk8\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:32 crc kubenswrapper[4781]: I1202 09:25:32.068799 4781 status_manager.go:851] "Failed to get status for pod" podUID="5e5610cf-1c3f-4010-a4ee-4820372400c1" pod="openshift-marketplace/redhat-marketplace-8d48v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8d48v\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:32 crc kubenswrapper[4781]: I1202 09:25:32.069068 4781 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:32 crc kubenswrapper[4781]: I1202 09:25:32.069291 4781 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:32 crc kubenswrapper[4781]: I1202 09:25:32.069535 4781 status_manager.go:851] "Failed to get status for pod" podUID="b28a19d5-e49e-46f0-942d-dc9f96777c2d" pod="openshift-marketplace/certified-operators-844r4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-844r4\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:32 crc kubenswrapper[4781]: I1202 09:25:32.069754 4781 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:32 crc kubenswrapper[4781]: I1202 09:25:32.070008 4781 status_manager.go:851] "Failed to get status for pod" podUID="b7602882-7ecf-4ce4-9cfa-3a691b3f9270" pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xqr9t\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:32 crc kubenswrapper[4781]: I1202 09:25:32.070214 4781 status_manager.go:851] "Failed to get status for pod" podUID="5df92e64-6fc4-4395-a919-b1c46c30d318" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: 
connection refused" Dec 02 09:25:32 crc kubenswrapper[4781]: I1202 09:25:32.071262 4781 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:32 crc kubenswrapper[4781]: I1202 09:25:32.071759 4781 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:32 crc kubenswrapper[4781]: I1202 09:25:32.071987 4781 status_manager.go:851] "Failed to get status for pod" podUID="b28a19d5-e49e-46f0-942d-dc9f96777c2d" pod="openshift-marketplace/certified-operators-844r4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-844r4\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:32 crc kubenswrapper[4781]: I1202 09:25:32.072193 4781 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:32 crc kubenswrapper[4781]: I1202 09:25:32.072376 4781 status_manager.go:851] "Failed to get status for pod" podUID="5df92e64-6fc4-4395-a919-b1c46c30d318" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:32 crc kubenswrapper[4781]: I1202 09:25:32.072566 4781 status_manager.go:851] "Failed to get status for pod" podUID="b7602882-7ecf-4ce4-9cfa-3a691b3f9270" pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xqr9t\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:32 crc kubenswrapper[4781]: I1202 09:25:32.072749 4781 status_manager.go:851] "Failed to get status for pod" podUID="f37a1838-a130-4cb4-807f-2214cbfefdbf" pod="openshift-marketplace/redhat-operators-4rfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4rfsl\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:32 crc kubenswrapper[4781]: I1202 09:25:32.072949 4781 status_manager.go:851] "Failed to get status for pod" podUID="0a0d43df-4480-4b62-bd3d-d129fbdcd722" pod="openshift-marketplace/certified-operators-t59n7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t59n7\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:32 crc kubenswrapper[4781]: I1202 09:25:32.073216 4781 status_manager.go:851] "Failed to get status for pod" podUID="5e5610cf-1c3f-4010-a4ee-4820372400c1" pod="openshift-marketplace/redhat-marketplace-8d48v" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8d48v\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:32 crc kubenswrapper[4781]: I1202 09:25:32.073555 4781 status_manager.go:851] "Failed to get status for pod" podUID="c4734fd7-42d6-4b87-9160-5a3471f91d03" pod="openshift-marketplace/redhat-marketplace-8pgk8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8pgk8\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:33 crc kubenswrapper[4781]: I1202 09:25:33.072905 4781 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="d11453ae50b7856124894ed02e2a395d796321e91d24f505e7f5284c16136fcb" exitCode=0 Dec 02 09:25:33 crc kubenswrapper[4781]: I1202 09:25:33.072968 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"d11453ae50b7856124894ed02e2a395d796321e91d24f505e7f5284c16136fcb"} Dec 02 09:25:33 crc kubenswrapper[4781]: I1202 09:25:33.073681 4781 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ba607959-82d4-440b-b830-95eb9584db6f" Dec 02 09:25:33 crc kubenswrapper[4781]: I1202 09:25:33.073696 4781 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ba607959-82d4-440b-b830-95eb9584db6f" Dec 02 09:25:33 crc kubenswrapper[4781]: I1202 09:25:33.074170 4781 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:33 crc kubenswrapper[4781]: E1202 09:25:33.074203 4781 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 09:25:33 crc kubenswrapper[4781]: I1202 09:25:33.074458 4781 status_manager.go:851] "Failed to get status for pod" podUID="b28a19d5-e49e-46f0-942d-dc9f96777c2d" pod="openshift-marketplace/certified-operators-844r4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-844r4\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:33 crc kubenswrapper[4781]: I1202 09:25:33.075088 4781 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:33 crc kubenswrapper[4781]: I1202 09:25:33.075410 4781 status_manager.go:851] "Failed to get status for pod" podUID="b7602882-7ecf-4ce4-9cfa-3a691b3f9270" pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xqr9t\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:33 crc kubenswrapper[4781]: 
I1202 09:25:33.075623 4781 status_manager.go:851] "Failed to get status for pod" podUID="5df92e64-6fc4-4395-a919-b1c46c30d318" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:33 crc kubenswrapper[4781]: I1202 09:25:33.075886 4781 status_manager.go:851] "Failed to get status for pod" podUID="f37a1838-a130-4cb4-807f-2214cbfefdbf" pod="openshift-marketplace/redhat-operators-4rfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4rfsl\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:33 crc kubenswrapper[4781]: I1202 09:25:33.076190 4781 status_manager.go:851] "Failed to get status for pod" podUID="0a0d43df-4480-4b62-bd3d-d129fbdcd722" pod="openshift-marketplace/certified-operators-t59n7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t59n7\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:33 crc kubenswrapper[4781]: I1202 09:25:33.076514 4781 status_manager.go:851] "Failed to get status for pod" podUID="c4734fd7-42d6-4b87-9160-5a3471f91d03" pod="openshift-marketplace/redhat-marketplace-8pgk8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8pgk8\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:33 crc kubenswrapper[4781]: I1202 09:25:33.076892 4781 status_manager.go:851] "Failed to get status for pod" podUID="5e5610cf-1c3f-4010-a4ee-4820372400c1" pod="openshift-marketplace/redhat-marketplace-8d48v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8d48v\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:33 crc kubenswrapper[4781]: I1202 09:25:33.077068 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 02 09:25:33 crc kubenswrapper[4781]: I1202 09:25:33.077110 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1979738e802d22d335479ecb0d841194ceb358aceb8ef1783763cbd908ef8491"} Dec 02 09:25:33 crc kubenswrapper[4781]: I1202 09:25:33.077110 4781 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:33 crc kubenswrapper[4781]: I1202 09:25:33.776002 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 09:25:34 crc kubenswrapper[4781]: I1202 09:25:34.083806 4781 status_manager.go:851] "Failed to get status for pod" podUID="0a0d43df-4480-4b62-bd3d-d129fbdcd722" pod="openshift-marketplace/certified-operators-t59n7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t59n7\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:34 
crc kubenswrapper[4781]: I1202 09:25:34.084366 4781 status_manager.go:851] "Failed to get status for pod" podUID="c4734fd7-42d6-4b87-9160-5a3471f91d03" pod="openshift-marketplace/redhat-marketplace-8pgk8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8pgk8\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:34 crc kubenswrapper[4781]: I1202 09:25:34.084774 4781 status_manager.go:851] "Failed to get status for pod" podUID="5e5610cf-1c3f-4010-a4ee-4820372400c1" pod="openshift-marketplace/redhat-marketplace-8d48v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8d48v\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:34 crc kubenswrapper[4781]: I1202 09:25:34.085110 4781 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:34 crc kubenswrapper[4781]: I1202 09:25:34.085325 4781 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:34 crc kubenswrapper[4781]: I1202 09:25:34.085505 4781 status_manager.go:851] "Failed to get status for pod" podUID="b28a19d5-e49e-46f0-942d-dc9f96777c2d" pod="openshift-marketplace/certified-operators-844r4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-844r4\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:34 crc kubenswrapper[4781]: I1202 09:25:34.086000 4781 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:34 crc kubenswrapper[4781]: I1202 09:25:34.086338 4781 status_manager.go:851] "Failed to get status for pod" podUID="b7602882-7ecf-4ce4-9cfa-3a691b3f9270" pod="openshift-authentication/oauth-openshift-558db77b4-xqr9t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xqr9t\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:34 crc kubenswrapper[4781]: I1202 09:25:34.086638 4781 status_manager.go:851] "Failed to get status for pod" podUID="5df92e64-6fc4-4395-a919-b1c46c30d318" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Dec 02 09:25:34 crc kubenswrapper[4781]: I1202 09:25:34.086885 4781 status_manager.go:851] "Failed to get status for pod" podUID="f37a1838-a130-4cb4-807f-2214cbfefdbf" pod="openshift-marketplace/redhat-operators-4rfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4rfsl\": dial tcp 
38.102.83.194:6443: connect: connection refused" Dec 02 09:25:35 crc kubenswrapper[4781]: I1202 09:25:35.091651 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"54e90ee7e166bc5e6a020d781c3053d914df0628b786e4ce935dee2afb0b17fe"} Dec 02 09:25:35 crc kubenswrapper[4781]: I1202 09:25:35.091946 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3c60b454c2f130e525aaaec094d04002898beaca3f8982bf1cb148e73ca4e622"} Dec 02 09:25:36 crc kubenswrapper[4781]: I1202 09:25:36.099346 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"37b3ed25478aad768f6030f46084e42487a3edb30190789b659e692a774e42c9"} Dec 02 09:25:36 crc kubenswrapper[4781]: I1202 09:25:36.100599 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1aacf3e779054767f6c2cc71088ef613dc9c6d52b02cf6c98c2cb7194d52c835"} Dec 02 09:25:36 crc kubenswrapper[4781]: I1202 09:25:36.142586 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 09:25:36 crc kubenswrapper[4781]: I1202 09:25:36.147079 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 09:25:37 crc kubenswrapper[4781]: I1202 09:25:37.106769 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 09:25:37 crc kubenswrapper[4781]: I1202 09:25:37.317112 4781 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Dec 02 09:25:38 crc kubenswrapper[4781]: I1202 09:25:38.116955 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"dfade0f372b8d1881d957cd4d1d22f068a1521752cbdb15cde9acdc2861b485f"} Dec 02 09:25:38 crc kubenswrapper[4781]: I1202 09:25:38.117106 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 09:25:38 crc kubenswrapper[4781]: I1202 09:25:38.117209 4781 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ba607959-82d4-440b-b830-95eb9584db6f" Dec 02 09:25:38 crc kubenswrapper[4781]: I1202 09:25:38.117231 4781 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ba607959-82d4-440b-b830-95eb9584db6f" Dec 02 09:25:38 crc kubenswrapper[4781]: I1202 09:25:38.123465 4781 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 09:25:39 crc kubenswrapper[4781]: I1202 09:25:39.121916 4781 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ba607959-82d4-440b-b830-95eb9584db6f" Dec 02 09:25:39 crc kubenswrapper[4781]: I1202 09:25:39.121988 4781 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ba607959-82d4-440b-b830-95eb9584db6f" Dec 02 09:25:41 crc kubenswrapper[4781]: I1202 09:25:41.263696 4781 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="0c5b2e20-f592-4d80-9f02-ec47f8ccb91f" Dec 02 09:25:44 crc kubenswrapper[4781]: I1202 09:25:44.499270 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:44 crc kubenswrapper[4781]: I1202 09:25:44.500172 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:45 crc kubenswrapper[4781]: I1202 09:25:45.155407 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" event={"ID":"7b66cd08-6bc4-48ce-9002-eb87e6ae27eb","Type":"ContainerStarted","Data":"bfee1b0d41eb22f640cb84c102fe00f5f0beee3eed36bbe8a3dcf5b9cc4bd2d9"} Dec 02 09:25:45 crc kubenswrapper[4781]: I1202 09:25:45.155697 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" event={"ID":"7b66cd08-6bc4-48ce-9002-eb87e6ae27eb","Type":"ContainerStarted","Data":"168b699bb739acd092d31e23b79093329caafee4a3911080d3ba4e77ff795ca2"} Dec 02 09:25:46 crc kubenswrapper[4781]: I1202 09:25:46.161132 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-86f4ddc759-7xxp5_7b66cd08-6bc4-48ce-9002-eb87e6ae27eb/oauth-openshift/0.log" Dec 02 09:25:46 crc kubenswrapper[4781]: I1202 09:25:46.161431 4781 generic.go:334] "Generic (PLEG): container finished" podID="7b66cd08-6bc4-48ce-9002-eb87e6ae27eb" containerID="bfee1b0d41eb22f640cb84c102fe00f5f0beee3eed36bbe8a3dcf5b9cc4bd2d9" exitCode=255 Dec 02 09:25:46 crc kubenswrapper[4781]: I1202 09:25:46.161462 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" event={"ID":"7b66cd08-6bc4-48ce-9002-eb87e6ae27eb","Type":"ContainerDied","Data":"bfee1b0d41eb22f640cb84c102fe00f5f0beee3eed36bbe8a3dcf5b9cc4bd2d9"} Dec 02 09:25:46 crc kubenswrapper[4781]: I1202 09:25:46.161857 4781 scope.go:117] "RemoveContainer" containerID="bfee1b0d41eb22f640cb84c102fe00f5f0beee3eed36bbe8a3dcf5b9cc4bd2d9" Dec 02 09:25:47 crc kubenswrapper[4781]: I1202 09:25:47.172855 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-86f4ddc759-7xxp5_7b66cd08-6bc4-48ce-9002-eb87e6ae27eb/oauth-openshift/0.log" Dec 02 09:25:47 crc kubenswrapper[4781]: I1202 09:25:47.173531 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" event={"ID":"7b66cd08-6bc4-48ce-9002-eb87e6ae27eb","Type":"ContainerStarted","Data":"f0753cc6a646710f13d715b139f1f82a3b1ffb329b90856118103df28a591829"} Dec 02 09:25:47 crc kubenswrapper[4781]: I1202 09:25:47.174093 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:47 crc kubenswrapper[4781]: I1202 09:25:47.176024 4781 patch_prober.go:28] interesting pod/oauth-openshift-86f4ddc759-7xxp5 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.56:6443/healthz\": dial tcp 
10.217.0.56:6443: connect: connection refused" start-of-body= Dec 02 09:25:47 crc kubenswrapper[4781]: I1202 09:25:47.176104 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" podUID="7b66cd08-6bc4-48ce-9002-eb87e6ae27eb" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.56:6443/healthz\": dial tcp 10.217.0.56:6443: connect: connection refused" Dec 02 09:25:48 crc kubenswrapper[4781]: I1202 09:25:48.180957 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-86f4ddc759-7xxp5_7b66cd08-6bc4-48ce-9002-eb87e6ae27eb/oauth-openshift/1.log" Dec 02 09:25:48 crc kubenswrapper[4781]: I1202 09:25:48.181640 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-86f4ddc759-7xxp5_7b66cd08-6bc4-48ce-9002-eb87e6ae27eb/oauth-openshift/0.log" Dec 02 09:25:48 crc kubenswrapper[4781]: I1202 09:25:48.181678 4781 generic.go:334] "Generic (PLEG): container finished" podID="7b66cd08-6bc4-48ce-9002-eb87e6ae27eb" containerID="f0753cc6a646710f13d715b139f1f82a3b1ffb329b90856118103df28a591829" exitCode=255 Dec 02 09:25:48 crc kubenswrapper[4781]: I1202 09:25:48.181703 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" event={"ID":"7b66cd08-6bc4-48ce-9002-eb87e6ae27eb","Type":"ContainerDied","Data":"f0753cc6a646710f13d715b139f1f82a3b1ffb329b90856118103df28a591829"} Dec 02 09:25:48 crc kubenswrapper[4781]: I1202 09:25:48.181733 4781 scope.go:117] "RemoveContainer" containerID="bfee1b0d41eb22f640cb84c102fe00f5f0beee3eed36bbe8a3dcf5b9cc4bd2d9" Dec 02 09:25:48 crc kubenswrapper[4781]: I1202 09:25:48.182360 4781 scope.go:117] "RemoveContainer" containerID="f0753cc6a646710f13d715b139f1f82a3b1ffb329b90856118103df28a591829" Dec 02 09:25:48 crc kubenswrapper[4781]: E1202 09:25:48.182618 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-86f4ddc759-7xxp5_openshift-authentication(7b66cd08-6bc4-48ce-9002-eb87e6ae27eb)\"" pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" podUID="7b66cd08-6bc4-48ce-9002-eb87e6ae27eb" Dec 02 09:25:49 crc kubenswrapper[4781]: I1202 09:25:49.164031 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 09:25:49 crc kubenswrapper[4781]: I1202 09:25:49.189837 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-86f4ddc759-7xxp5_7b66cd08-6bc4-48ce-9002-eb87e6ae27eb/oauth-openshift/1.log" Dec 02 09:25:49 crc kubenswrapper[4781]: I1202 09:25:49.190884 4781 scope.go:117] "RemoveContainer" containerID="f0753cc6a646710f13d715b139f1f82a3b1ffb329b90856118103df28a591829" Dec 02 09:25:49 crc kubenswrapper[4781]: E1202 09:25:49.191222 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-86f4ddc759-7xxp5_openshift-authentication(7b66cd08-6bc4-48ce-9002-eb87e6ae27eb)\"" pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" podUID="7b66cd08-6bc4-48ce-9002-eb87e6ae27eb" Dec 02 09:25:50 crc kubenswrapper[4781]: I1202 09:25:50.254248 4781 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 02 09:25:51 crc kubenswrapper[4781]: I1202 09:25:51.755237 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 02 09:25:51 crc kubenswrapper[4781]: I1202 09:25:51.939593 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 02 09:25:52 crc kubenswrapper[4781]: I1202 09:25:52.111210 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 02 09:25:52 crc kubenswrapper[4781]: I1202 09:25:52.130984 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 02 09:25:52 crc kubenswrapper[4781]: I1202 09:25:52.755176 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 02 09:25:52 crc kubenswrapper[4781]: I1202 09:25:52.911183 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 02 09:25:53 crc kubenswrapper[4781]: I1202 09:25:53.419466 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 02 09:25:53 crc kubenswrapper[4781]: I1202 09:25:53.419869 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 02 09:25:53 crc kubenswrapper[4781]: I1202 09:25:53.495010 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 02 09:25:53 crc kubenswrapper[4781]: I1202 09:25:53.584723 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 02 09:25:53 crc kubenswrapper[4781]: I1202 09:25:53.591575 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 02 09:25:53 crc kubenswrapper[4781]: I1202 09:25:53.731546 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 02 09:25:53 crc kubenswrapper[4781]: I1202 09:25:53.888662 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 02 09:25:53 crc kubenswrapper[4781]: I1202 09:25:53.916257 4781 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 02 09:25:54 crc kubenswrapper[4781]: I1202 09:25:54.165248 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 02 09:25:54 crc kubenswrapper[4781]: I1202 09:25:54.201615 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 02 09:25:54 crc kubenswrapper[4781]: I1202 09:25:54.271578 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 02 09:25:54 crc kubenswrapper[4781]: I1202 09:25:54.361022 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 02 09:25:54 crc kubenswrapper[4781]: I1202 09:25:54.363335 4781 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 02 09:25:54 crc kubenswrapper[4781]: I1202 09:25:54.433557 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 02 09:25:54 crc kubenswrapper[4781]: I1202 09:25:54.465419 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 02 09:25:54 crc kubenswrapper[4781]: I1202 09:25:54.547263 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 02 09:25:54 crc kubenswrapper[4781]: I1202 09:25:54.628673 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 02 09:25:54 crc kubenswrapper[4781]: I1202 09:25:54.675132 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 02 09:25:54 crc kubenswrapper[4781]: I1202 09:25:54.677531 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 02 09:25:54 crc kubenswrapper[4781]: I1202 09:25:54.731118 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 02 09:25:54 crc kubenswrapper[4781]: I1202 09:25:54.846604 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 02 09:25:54 crc kubenswrapper[4781]: I1202 09:25:54.850763 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 02 09:25:55 crc kubenswrapper[4781]: I1202 09:25:55.036395 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 02 09:25:55 crc kubenswrapper[4781]: I1202 09:25:55.036788 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 02 09:25:55 crc kubenswrapper[4781]: I1202 09:25:55.137979 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 02 09:25:55 crc kubenswrapper[4781]: I1202 09:25:55.193784 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:55 crc kubenswrapper[4781]: I1202 09:25:55.195422 4781 scope.go:117] "RemoveContainer" containerID="f0753cc6a646710f13d715b139f1f82a3b1ffb329b90856118103df28a591829" Dec 02 09:25:55 crc kubenswrapper[4781]: E1202 09:25:55.195646 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-86f4ddc759-7xxp5_openshift-authentication(7b66cd08-6bc4-48ce-9002-eb87e6ae27eb)\"" pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" podUID="7b66cd08-6bc4-48ce-9002-eb87e6ae27eb" Dec 02 09:25:55 crc kubenswrapper[4781]: I1202 09:25:55.254342 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 02 09:25:55 crc kubenswrapper[4781]: I1202 09:25:55.308033 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 02 09:25:55 crc 
kubenswrapper[4781]: I1202 09:25:55.344396 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 02 09:25:55 crc kubenswrapper[4781]: I1202 09:25:55.394179 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 02 09:25:55 crc kubenswrapper[4781]: I1202 09:25:55.419050 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 02 09:25:55 crc kubenswrapper[4781]: I1202 09:25:55.586644 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 02 09:25:55 crc kubenswrapper[4781]: I1202 09:25:55.624862 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 02 09:25:55 crc kubenswrapper[4781]: I1202 09:25:55.662162 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 02 09:25:55 crc kubenswrapper[4781]: I1202 09:25:55.731065 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 02 09:25:55 crc kubenswrapper[4781]: I1202 09:25:55.882607 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 02 09:25:55 crc kubenswrapper[4781]: I1202 09:25:55.894154 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 02 09:25:55 crc kubenswrapper[4781]: I1202 09:25:55.964418 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 02 09:25:56 crc kubenswrapper[4781]: I1202 09:25:56.006535 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 02 09:25:56 crc kubenswrapper[4781]: I1202 09:25:56.090676 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 02 09:25:56 crc kubenswrapper[4781]: I1202 09:25:56.093474 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 02 09:25:56 crc kubenswrapper[4781]: I1202 09:25:56.167730 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 02 09:25:56 crc kubenswrapper[4781]: I1202 09:25:56.227973 4781 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 02 09:25:56 crc kubenswrapper[4781]: I1202 09:25:56.355580 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 02 09:25:56 crc kubenswrapper[4781]: I1202 09:25:56.522733 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 02 09:25:56 crc kubenswrapper[4781]: I1202 09:25:56.650455 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 02 09:25:56 crc kubenswrapper[4781]: I1202 09:25:56.726490 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 02 09:25:56 crc kubenswrapper[4781]: I1202 09:25:56.727276 4781 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-oauth-apiserver"/"serving-cert" Dec 02 09:25:56 crc kubenswrapper[4781]: I1202 09:25:56.727637 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 02 09:25:56 crc kubenswrapper[4781]: I1202 09:25:56.737719 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 02 09:25:56 crc kubenswrapper[4781]: I1202 09:25:56.802080 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 02 09:25:57 crc kubenswrapper[4781]: I1202 09:25:57.072658 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 02 09:25:57 crc kubenswrapper[4781]: I1202 09:25:57.233725 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 02 09:25:57 crc kubenswrapper[4781]: I1202 09:25:57.272197 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 02 09:25:57 crc kubenswrapper[4781]: I1202 09:25:57.292996 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 02 09:25:57 crc kubenswrapper[4781]: I1202 09:25:57.305140 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 02 09:25:57 crc kubenswrapper[4781]: I1202 09:25:57.366701 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 02 09:25:57 crc kubenswrapper[4781]: I1202 09:25:57.367553 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 02 09:25:57 crc kubenswrapper[4781]: I1202 09:25:57.477435 4781 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 02 09:25:57 crc kubenswrapper[4781]: I1202 09:25:57.477688 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=42.477662027 podStartE2EDuration="42.477662027s" podCreationTimestamp="2025-12-02 09:25:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:25:41.2273845 +0000 UTC m=+304.051258379" watchObservedRunningTime="2025-12-02 09:25:57.477662027 +0000 UTC m=+320.301535906" Dec 02 09:25:57 crc kubenswrapper[4781]: I1202 09:25:57.477903 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-844r4" podStartSLOduration=45.84748804 podStartE2EDuration="2m53.477896172s" podCreationTimestamp="2025-12-02 09:23:04 +0000 UTC" firstStartedPulling="2025-12-02 09:23:08.748844204 +0000 UTC m=+151.572718083" lastFinishedPulling="2025-12-02 09:25:16.379252336 +0000 UTC m=+279.203126215" observedRunningTime="2025-12-02 09:25:41.26051305 +0000 UTC m=+304.084386939" watchObservedRunningTime="2025-12-02 09:25:57.477896172 +0000 UTC m=+320.301770051" Dec 02 09:25:57 crc kubenswrapper[4781]: I1202 09:25:57.478019 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-t59n7" 
podStartSLOduration=44.086395141 podStartE2EDuration="2m53.478014995s" podCreationTimestamp="2025-12-02 09:23:04 +0000 UTC" firstStartedPulling="2025-12-02 09:23:08.732735453 +0000 UTC m=+151.556609332" lastFinishedPulling="2025-12-02 09:25:18.124355307 +0000 UTC m=+280.948229186" observedRunningTime="2025-12-02 09:25:41.173712926 +0000 UTC m=+303.997586805" watchObservedRunningTime="2025-12-02 09:25:57.478014995 +0000 UTC m=+320.301888874" Dec 02 09:25:57 crc kubenswrapper[4781]: I1202 09:25:57.479040 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4rfsl" podStartSLOduration=42.06894378 podStartE2EDuration="2m50.479036189s" podCreationTimestamp="2025-12-02 09:23:07 +0000 UTC" firstStartedPulling="2025-12-02 09:23:09.742178349 +0000 UTC m=+152.566052228" lastFinishedPulling="2025-12-02 09:25:18.152270748 +0000 UTC m=+280.976144637" observedRunningTime="2025-12-02 09:25:41.306859275 +0000 UTC m=+304.130733154" watchObservedRunningTime="2025-12-02 09:25:57.479036189 +0000 UTC m=+320.302910078" Dec 02 09:25:57 crc kubenswrapper[4781]: I1202 09:25:57.479595 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8pgk8" podStartSLOduration=42.656635098 podStartE2EDuration="2m52.479591721s" podCreationTimestamp="2025-12-02 09:23:05 +0000 UTC" firstStartedPulling="2025-12-02 09:23:08.729155492 +0000 UTC m=+151.553029371" lastFinishedPulling="2025-12-02 09:25:18.552112105 +0000 UTC m=+281.375985994" observedRunningTime="2025-12-02 09:25:41.194403392 +0000 UTC m=+304.018277271" watchObservedRunningTime="2025-12-02 09:25:57.479591721 +0000 UTC m=+320.303465590" Dec 02 09:25:57 crc kubenswrapper[4781]: I1202 09:25:57.482838 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8d48v","openshift-authentication/oauth-openshift-558db77b4-xqr9t","openshift-kube-apiserver/kube-apiserver-crc"] Dec 02 09:25:57 crc kubenswrapper[4781]: I1202 09:25:57.482890 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 02 09:25:57 crc kubenswrapper[4781]: I1202 09:25:57.482908 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-86f4ddc759-7xxp5"] Dec 02 09:25:57 crc kubenswrapper[4781]: I1202 09:25:57.483423 4781 scope.go:117] "RemoveContainer" containerID="f0753cc6a646710f13d715b139f1f82a3b1ffb329b90856118103df28a591829" Dec 02 09:25:57 crc kubenswrapper[4781]: I1202 09:25:57.486769 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 09:25:57 crc kubenswrapper[4781]: I1202 09:25:57.499876 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=19.499856707 podStartE2EDuration="19.499856707s" podCreationTimestamp="2025-12-02 09:25:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:25:57.498003964 +0000 UTC m=+320.321877843" watchObservedRunningTime="2025-12-02 09:25:57.499856707 +0000 UTC m=+320.323730586" Dec 02 09:25:57 crc kubenswrapper[4781]: I1202 09:25:57.507001 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e5610cf-1c3f-4010-a4ee-4820372400c1" path="/var/lib/kubelet/pods/5e5610cf-1c3f-4010-a4ee-4820372400c1/volumes" Dec 02 09:25:57 crc 
kubenswrapper[4781]: I1202 09:25:57.507670 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7602882-7ecf-4ce4-9cfa-3a691b3f9270" path="/var/lib/kubelet/pods/b7602882-7ecf-4ce4-9cfa-3a691b3f9270/volumes" Dec 02 09:25:57 crc kubenswrapper[4781]: I1202 09:25:57.548592 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 02 09:25:57 crc kubenswrapper[4781]: I1202 09:25:57.675349 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 02 09:25:57 crc kubenswrapper[4781]: I1202 09:25:57.697890 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 02 09:25:57 crc kubenswrapper[4781]: I1202 09:25:57.712537 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 02 09:25:57 crc kubenswrapper[4781]: I1202 09:25:57.896005 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 02 09:25:57 crc kubenswrapper[4781]: I1202 09:25:57.935018 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 02 09:25:57 crc kubenswrapper[4781]: I1202 09:25:57.970383 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 02 09:25:57 crc kubenswrapper[4781]: I1202 09:25:57.971569 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 02 09:25:58 crc kubenswrapper[4781]: I1202 09:25:58.174436 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 02 09:25:58 crc kubenswrapper[4781]: I1202 09:25:58.176120 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 02 09:25:58 crc kubenswrapper[4781]: I1202 09:25:58.192888 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 02 09:25:58 crc kubenswrapper[4781]: I1202 09:25:58.194369 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 02 09:25:58 crc kubenswrapper[4781]: I1202 09:25:58.245018 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-86f4ddc759-7xxp5_7b66cd08-6bc4-48ce-9002-eb87e6ae27eb/oauth-openshift/1.log" Dec 02 09:25:58 crc kubenswrapper[4781]: I1202 09:25:58.245702 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" event={"ID":"7b66cd08-6bc4-48ce-9002-eb87e6ae27eb","Type":"ContainerStarted","Data":"e6b3d3391ecb8272caff28a991b410bb211ab66847313da178ba369f96048c64"} Dec 02 09:25:58 crc kubenswrapper[4781]: I1202 09:25:58.246464 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:25:58 crc kubenswrapper[4781]: I1202 09:25:58.271991 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" podStartSLOduration=78.271974385 podStartE2EDuration="1m18.271974385s" podCreationTimestamp="2025-12-02 
09:24:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:25:47.202742904 +0000 UTC m=+310.026616863" watchObservedRunningTime="2025-12-02 09:25:58.271974385 +0000 UTC m=+321.095848264" Dec 02 09:25:58 crc kubenswrapper[4781]: I1202 09:25:58.286469 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 02 09:25:58 crc kubenswrapper[4781]: I1202 09:25:58.287908 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 02 09:25:58 crc kubenswrapper[4781]: I1202 09:25:58.316155 4781 patch_prober.go:28] interesting pod/oauth-openshift-86f4ddc759-7xxp5 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.56:6443/healthz\": read tcp 10.217.0.2:52824->10.217.0.56:6443: read: connection reset by peer" start-of-body= Dec 02 09:25:58 crc kubenswrapper[4781]: I1202 09:25:58.316225 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" podUID="7b66cd08-6bc4-48ce-9002-eb87e6ae27eb" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.56:6443/healthz\": read tcp 10.217.0.2:52824->10.217.0.56:6443: read: connection reset by peer" Dec 02 09:25:58 crc kubenswrapper[4781]: I1202 09:25:58.380868 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 02 09:25:58 crc kubenswrapper[4781]: I1202 09:25:58.396557 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 02 09:25:58 crc kubenswrapper[4781]: I1202 09:25:58.438775 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 02 09:25:58 crc kubenswrapper[4781]: I1202 09:25:58.517026 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 02 09:25:58 crc kubenswrapper[4781]: I1202 09:25:58.535862 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 02 09:25:58 crc kubenswrapper[4781]: I1202 09:25:58.655026 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 02 09:25:58 crc kubenswrapper[4781]: I1202 09:25:58.724811 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 02 09:25:58 crc kubenswrapper[4781]: I1202 09:25:58.755830 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 02 09:25:58 crc kubenswrapper[4781]: I1202 09:25:58.853121 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 02 09:25:58 crc kubenswrapper[4781]: I1202 09:25:58.927511 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 02 09:25:58 crc kubenswrapper[4781]: I1202 09:25:58.953412 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 02 09:25:59 crc kubenswrapper[4781]: I1202 
09:25:59.006650 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 02 09:25:59 crc kubenswrapper[4781]: I1202 09:25:59.008193 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 02 09:25:59 crc kubenswrapper[4781]: I1202 09:25:59.078602 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 02 09:25:59 crc kubenswrapper[4781]: I1202 09:25:59.091230 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 02 09:25:59 crc kubenswrapper[4781]: I1202 09:25:59.189323 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 02 09:25:59 crc kubenswrapper[4781]: I1202 09:25:59.226062 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 02 09:25:59 crc kubenswrapper[4781]: I1202 09:25:59.254805 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-86f4ddc759-7xxp5_7b66cd08-6bc4-48ce-9002-eb87e6ae27eb/oauth-openshift/2.log" Dec 02 09:25:59 crc kubenswrapper[4781]: I1202 09:25:59.255708 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-86f4ddc759-7xxp5_7b66cd08-6bc4-48ce-9002-eb87e6ae27eb/oauth-openshift/1.log" Dec 02 09:25:59 crc kubenswrapper[4781]: I1202 09:25:59.255767 4781 generic.go:334] "Generic (PLEG): container finished" podID="7b66cd08-6bc4-48ce-9002-eb87e6ae27eb" containerID="e6b3d3391ecb8272caff28a991b410bb211ab66847313da178ba369f96048c64" exitCode=255 Dec 02 09:25:59 crc kubenswrapper[4781]: I1202 09:25:59.255814 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" event={"ID":"7b66cd08-6bc4-48ce-9002-eb87e6ae27eb","Type":"ContainerDied","Data":"e6b3d3391ecb8272caff28a991b410bb211ab66847313da178ba369f96048c64"} Dec 02 09:25:59 crc kubenswrapper[4781]: I1202 09:25:59.255875 4781 scope.go:117] "RemoveContainer" containerID="f0753cc6a646710f13d715b139f1f82a3b1ffb329b90856118103df28a591829" Dec 02 09:25:59 crc kubenswrapper[4781]: I1202 09:25:59.256443 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 02 09:25:59 crc kubenswrapper[4781]: I1202 09:25:59.257798 4781 scope.go:117] "RemoveContainer" containerID="e6b3d3391ecb8272caff28a991b410bb211ab66847313da178ba369f96048c64" Dec 02 09:25:59 crc kubenswrapper[4781]: E1202 09:25:59.259356 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 20s restarting failed container=oauth-openshift pod=oauth-openshift-86f4ddc759-7xxp5_openshift-authentication(7b66cd08-6bc4-48ce-9002-eb87e6ae27eb)\"" pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" podUID="7b66cd08-6bc4-48ce-9002-eb87e6ae27eb" Dec 02 09:25:59 crc kubenswrapper[4781]: I1202 09:25:59.434892 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 02 09:25:59 crc kubenswrapper[4781]: I1202 09:25:59.435059 4781 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 02 09:25:59 crc kubenswrapper[4781]: I1202 09:25:59.457655 4781 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 02 09:25:59 crc kubenswrapper[4781]: I1202 09:25:59.499459 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 02 09:25:59 crc kubenswrapper[4781]: I1202 09:25:59.509876 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 02 09:25:59 crc kubenswrapper[4781]: I1202 09:25:59.558119 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 02 09:25:59 crc kubenswrapper[4781]: I1202 09:25:59.626823 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 02 09:25:59 crc kubenswrapper[4781]: I1202 09:25:59.717119 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 02 09:25:59 crc kubenswrapper[4781]: I1202 09:25:59.746669 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 02 09:25:59 crc kubenswrapper[4781]: I1202 09:25:59.751127 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 02 09:25:59 crc kubenswrapper[4781]: I1202 09:25:59.779563 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 02 09:25:59 crc kubenswrapper[4781]: I1202 09:25:59.938243 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 02 09:25:59 crc kubenswrapper[4781]: I1202 09:25:59.945255 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 02 09:25:59 crc kubenswrapper[4781]: I1202 09:25:59.950880 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 02 09:26:00 crc kubenswrapper[4781]: I1202 09:26:00.046651 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 02 09:26:00 crc kubenswrapper[4781]: I1202 09:26:00.068172 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 02 09:26:00 crc kubenswrapper[4781]: I1202 09:26:00.125256 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 02 09:26:00 crc kubenswrapper[4781]: I1202 09:26:00.209231 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 02 09:26:00 crc kubenswrapper[4781]: I1202 09:26:00.262372 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-86f4ddc759-7xxp5_7b66cd08-6bc4-48ce-9002-eb87e6ae27eb/oauth-openshift/2.log" Dec 02 09:26:00 crc kubenswrapper[4781]: I1202 09:26:00.263079 4781 scope.go:117] "RemoveContainer" containerID="e6b3d3391ecb8272caff28a991b410bb211ab66847313da178ba369f96048c64" Dec 02 09:26:00 crc kubenswrapper[4781]: E1202 09:26:00.263372 4781 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 20s restarting failed container=oauth-openshift pod=oauth-openshift-86f4ddc759-7xxp5_openshift-authentication(7b66cd08-6bc4-48ce-9002-eb87e6ae27eb)\"" pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" podUID="7b66cd08-6bc4-48ce-9002-eb87e6ae27eb" Dec 02 09:26:00 crc kubenswrapper[4781]: I1202 09:26:00.323636 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 02 09:26:00 crc kubenswrapper[4781]: I1202 09:26:00.394498 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 02 09:26:00 crc kubenswrapper[4781]: I1202 09:26:00.447466 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 02 09:26:00 crc kubenswrapper[4781]: I1202 09:26:00.475512 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 02 09:26:00 crc kubenswrapper[4781]: I1202 09:26:00.494478 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 02 09:26:00 crc kubenswrapper[4781]: I1202 09:26:00.513392 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 02 09:26:00 crc kubenswrapper[4781]: I1202 09:26:00.631208 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 02 09:26:00 crc kubenswrapper[4781]: I1202 09:26:00.718795 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 02 09:26:00 crc kubenswrapper[4781]: I1202 09:26:00.764506 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 02 09:26:00 crc kubenswrapper[4781]: I1202 09:26:00.990634 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 02 09:26:01 crc kubenswrapper[4781]: I1202 09:26:01.019615 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 02 09:26:01 crc kubenswrapper[4781]: I1202 09:26:01.078686 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 02 09:26:01 crc kubenswrapper[4781]: I1202 09:26:01.114283 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 02 09:26:01 crc kubenswrapper[4781]: I1202 09:26:01.163981 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 02 09:26:01 crc kubenswrapper[4781]: I1202 09:26:01.233776 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 02 09:26:01 crc kubenswrapper[4781]: I1202 09:26:01.295292 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 02 09:26:01 crc kubenswrapper[4781]: I1202 09:26:01.352884 4781 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 02 09:26:01 crc kubenswrapper[4781]: I1202 09:26:01.458566 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 02 09:26:01 crc kubenswrapper[4781]: I1202 09:26:01.498538 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 02 09:26:01 crc kubenswrapper[4781]: I1202 09:26:01.511606 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 02 09:26:01 crc kubenswrapper[4781]: I1202 09:26:01.513349 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 09:26:01 crc kubenswrapper[4781]: I1202 09:26:01.513657 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 09:26:01 crc kubenswrapper[4781]: I1202 09:26:01.521899 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 09:26:01 crc kubenswrapper[4781]: I1202 09:26:01.712297 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 02 09:26:01 crc kubenswrapper[4781]: I1202 09:26:01.830822 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 02 09:26:01 crc kubenswrapper[4781]: I1202 09:26:01.857220 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 02 09:26:01 crc kubenswrapper[4781]: I1202 09:26:01.895010 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 02 09:26:02 crc kubenswrapper[4781]: I1202 09:26:02.129629 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 02 09:26:02 crc kubenswrapper[4781]: I1202 09:26:02.283195 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 09:26:02 crc kubenswrapper[4781]: I1202 09:26:02.367294 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 02 09:26:02 crc kubenswrapper[4781]: I1202 09:26:02.424363 4781 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 02 09:26:02 crc kubenswrapper[4781]: I1202 09:26:02.425067 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://a0c690fa3b5f0b02ca2432837f85440390764d9c722105890e90188c5b4a9baa" gracePeriod=5 Dec 02 09:26:02 crc kubenswrapper[4781]: I1202 09:26:02.430196 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 02 09:26:02 crc kubenswrapper[4781]: I1202 09:26:02.552400 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 02 09:26:02 crc kubenswrapper[4781]: I1202 09:26:02.652657 4781 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 02 09:26:02 crc kubenswrapper[4781]: I1202 09:26:02.810896 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 02 09:26:02 crc kubenswrapper[4781]: I1202 09:26:02.859375 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 02 09:26:02 crc kubenswrapper[4781]: I1202 09:26:02.906483 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 02 09:26:02 crc kubenswrapper[4781]: I1202 09:26:02.956321 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 02 09:26:02 crc kubenswrapper[4781]: I1202 09:26:02.967787 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 02 09:26:02 crc kubenswrapper[4781]: I1202 09:26:02.989753 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 02 09:26:03 crc kubenswrapper[4781]: I1202 09:26:03.010909 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 02 09:26:03 crc kubenswrapper[4781]: I1202 09:26:03.016253 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 02 09:26:03 crc kubenswrapper[4781]: I1202 09:26:03.028340 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 02 09:26:03 crc kubenswrapper[4781]: I1202 09:26:03.104805 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 02 09:26:03 crc kubenswrapper[4781]: I1202 09:26:03.303144 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 02 09:26:03 crc kubenswrapper[4781]: I1202 09:26:03.351285 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 02 09:26:03 crc kubenswrapper[4781]: I1202 09:26:03.762152 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 02 09:26:03 crc kubenswrapper[4781]: I1202 09:26:03.924087 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 02 09:26:03 crc kubenswrapper[4781]: I1202 09:26:03.975343 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 02 09:26:04 crc kubenswrapper[4781]: I1202 09:26:04.053624 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 02 09:26:04 crc kubenswrapper[4781]: I1202 09:26:04.098831 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 02 09:26:04 crc kubenswrapper[4781]: I1202 09:26:04.114546 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 02 09:26:04 crc kubenswrapper[4781]: I1202 09:26:04.298463 4781 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console"/"trusted-ca-bundle" Dec 02 09:26:04 crc kubenswrapper[4781]: I1202 09:26:04.782394 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 02 09:26:04 crc kubenswrapper[4781]: I1202 09:26:04.898468 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 02 09:26:04 crc kubenswrapper[4781]: I1202 09:26:04.934810 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 02 09:26:04 crc kubenswrapper[4781]: I1202 09:26:04.954793 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 02 09:26:05 crc kubenswrapper[4781]: I1202 09:26:05.015860 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 02 09:26:05 crc kubenswrapper[4781]: I1202 09:26:05.193946 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:26:05 crc kubenswrapper[4781]: I1202 09:26:05.194525 4781 scope.go:117] "RemoveContainer" containerID="e6b3d3391ecb8272caff28a991b410bb211ab66847313da178ba369f96048c64" Dec 02 09:26:05 crc kubenswrapper[4781]: E1202 09:26:05.194703 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 20s restarting failed container=oauth-openshift pod=oauth-openshift-86f4ddc759-7xxp5_openshift-authentication(7b66cd08-6bc4-48ce-9002-eb87e6ae27eb)\"" pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" podUID="7b66cd08-6bc4-48ce-9002-eb87e6ae27eb" Dec 02 09:26:05 crc kubenswrapper[4781]: I1202 09:26:05.219321 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 02 09:26:05 crc kubenswrapper[4781]: I1202 09:26:05.236757 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 02 09:26:05 crc kubenswrapper[4781]: I1202 09:26:05.371380 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 02 09:26:05 crc kubenswrapper[4781]: I1202 09:26:05.414878 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 02 09:26:05 crc kubenswrapper[4781]: I1202 09:26:05.486881 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 02 09:26:05 crc kubenswrapper[4781]: I1202 09:26:05.525508 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 02 09:26:05 crc kubenswrapper[4781]: I1202 09:26:05.607043 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 02 09:26:05 crc kubenswrapper[4781]: I1202 09:26:05.691202 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 02 09:26:05 crc kubenswrapper[4781]: I1202 09:26:05.691729 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 02 09:26:05 crc kubenswrapper[4781]: I1202 
09:26:05.868700 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 02 09:26:06 crc kubenswrapper[4781]: I1202 09:26:06.135531 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 02 09:26:06 crc kubenswrapper[4781]: I1202 09:26:06.382587 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 02 09:26:06 crc kubenswrapper[4781]: I1202 09:26:06.519610 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 02 09:26:06 crc kubenswrapper[4781]: I1202 09:26:06.894177 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 02 09:26:06 crc kubenswrapper[4781]: I1202 09:26:06.935849 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 02 09:26:07 crc kubenswrapper[4781]: I1202 09:26:07.197148 4781 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 02 09:26:08 crc kubenswrapper[4781]: I1202 09:26:08.210481 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 02 09:26:08 crc kubenswrapper[4781]: I1202 09:26:08.210793 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 09:26:08 crc kubenswrapper[4781]: I1202 09:26:08.309562 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 02 09:26:08 crc kubenswrapper[4781]: I1202 09:26:08.309603 4781 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="a0c690fa3b5f0b02ca2432837f85440390764d9c722105890e90188c5b4a9baa" exitCode=137 Dec 02 09:26:08 crc kubenswrapper[4781]: I1202 09:26:08.309641 4781 scope.go:117] "RemoveContainer" containerID="a0c690fa3b5f0b02ca2432837f85440390764d9c722105890e90188c5b4a9baa" Dec 02 09:26:08 crc kubenswrapper[4781]: I1202 09:26:08.309690 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 09:26:08 crc kubenswrapper[4781]: I1202 09:26:08.323104 4781 scope.go:117] "RemoveContainer" containerID="a0c690fa3b5f0b02ca2432837f85440390764d9c722105890e90188c5b4a9baa" Dec 02 09:26:08 crc kubenswrapper[4781]: E1202 09:26:08.323660 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0c690fa3b5f0b02ca2432837f85440390764d9c722105890e90188c5b4a9baa\": container with ID starting with a0c690fa3b5f0b02ca2432837f85440390764d9c722105890e90188c5b4a9baa not found: ID does not exist" containerID="a0c690fa3b5f0b02ca2432837f85440390764d9c722105890e90188c5b4a9baa" Dec 02 09:26:08 crc kubenswrapper[4781]: I1202 09:26:08.323696 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0c690fa3b5f0b02ca2432837f85440390764d9c722105890e90188c5b4a9baa"} err="failed to get container status \"a0c690fa3b5f0b02ca2432837f85440390764d9c722105890e90188c5b4a9baa\": rpc error: code = NotFound desc = could not find container \"a0c690fa3b5f0b02ca2432837f85440390764d9c722105890e90188c5b4a9baa\": container with ID starting with a0c690fa3b5f0b02ca2432837f85440390764d9c722105890e90188c5b4a9baa not found: ID does not exist" Dec 02 09:26:08 crc kubenswrapper[4781]: I1202 09:26:08.406602 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 02 09:26:08 crc kubenswrapper[4781]: I1202 09:26:08.406705 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 02 09:26:08 crc kubenswrapper[4781]: I1202 09:26:08.406782 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 09:26:08 crc kubenswrapper[4781]: I1202 09:26:08.406836 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 02 09:26:08 crc kubenswrapper[4781]: I1202 09:26:08.406866 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 02 09:26:08 crc kubenswrapper[4781]: I1202 09:26:08.406904 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 02 09:26:08 crc kubenswrapper[4781]: I1202 09:26:08.406954 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 09:26:08 crc kubenswrapper[4781]: I1202 09:26:08.407017 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 09:26:08 crc kubenswrapper[4781]: I1202 09:26:08.407055 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 09:26:08 crc kubenswrapper[4781]: I1202 09:26:08.407247 4781 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 02 09:26:08 crc kubenswrapper[4781]: I1202 09:26:08.407268 4781 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 02 09:26:08 crc kubenswrapper[4781]: I1202 09:26:08.407285 4781 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 02 09:26:08 crc kubenswrapper[4781]: I1202 09:26:08.407302 4781 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 02 09:26:08 crc kubenswrapper[4781]: I1202 09:26:08.413757 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 09:26:08 crc kubenswrapper[4781]: I1202 09:26:08.508061 4781 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 02 09:26:09 crc kubenswrapper[4781]: I1202 09:26:09.505711 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 02 09:26:09 crc kubenswrapper[4781]: I1202 09:26:09.506291 4781 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Dec 02 09:26:09 crc kubenswrapper[4781]: I1202 09:26:09.515715 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 02 09:26:09 crc kubenswrapper[4781]: I1202 09:26:09.515754 4781 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="8b3b3cc8-900a-4df9-9597-61911f32c13d" Dec 02 09:26:09 crc kubenswrapper[4781]: I1202 09:26:09.518964 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 02 09:26:09 crc kubenswrapper[4781]: I1202 09:26:09.518999 4781 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="8b3b3cc8-900a-4df9-9597-61911f32c13d" Dec 02 09:26:15 crc kubenswrapper[4781]: I1202 09:26:15.168459 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 02 09:26:16 crc kubenswrapper[4781]: I1202 09:26:16.343388 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 02 09:26:17 crc kubenswrapper[4781]: I1202 09:26:17.157620 4781 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 02 09:26:17 crc kubenswrapper[4781]: I1202 09:26:17.293556 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 02 09:26:17 crc kubenswrapper[4781]: I1202 09:26:17.504844 4781 scope.go:117] "RemoveContainer" containerID="e6b3d3391ecb8272caff28a991b410bb211ab66847313da178ba369f96048c64" Dec 02 09:26:17 crc kubenswrapper[4781]: E1202 09:26:17.505145 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 20s restarting failed container=oauth-openshift pod=oauth-openshift-86f4ddc759-7xxp5_openshift-authentication(7b66cd08-6bc4-48ce-9002-eb87e6ae27eb)\"" pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" podUID="7b66cd08-6bc4-48ce-9002-eb87e6ae27eb" Dec 02 09:26:18 crc kubenswrapper[4781]: I1202 09:26:18.876522 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 02 09:26:20 crc kubenswrapper[4781]: I1202 09:26:20.129823 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 02 09:26:21 crc kubenswrapper[4781]: I1202 09:26:21.444164 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 02 09:26:22 crc kubenswrapper[4781]: I1202 09:26:22.309106 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 02 09:26:22 crc kubenswrapper[4781]: I1202 09:26:22.337152 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 02 09:26:22 crc kubenswrapper[4781]: I1202 09:26:22.345609 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 02 09:26:22 crc kubenswrapper[4781]: I1202 09:26:22.375023 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 02 09:26:22 crc kubenswrapper[4781]: I1202 09:26:22.656631 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 02 09:26:22 crc kubenswrapper[4781]: I1202 09:26:22.952360 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 02 09:26:23 crc kubenswrapper[4781]: I1202 09:26:23.392514 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 02 09:26:23 crc kubenswrapper[4781]: I1202 09:26:23.403421 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 02 09:26:23 crc kubenswrapper[4781]: I1202 09:26:23.493991 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 02 09:26:23 crc kubenswrapper[4781]: I1202 09:26:23.939196 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 02 09:26:24 crc kubenswrapper[4781]: I1202 09:26:24.360351 4781 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 02 09:26:24 crc kubenswrapper[4781]: I1202 09:26:24.881761 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 02 09:26:25 crc kubenswrapper[4781]: I1202 09:26:25.775831 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 02 09:26:25 crc kubenswrapper[4781]: I1202 09:26:25.814808 4781 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 02 09:26:25 crc kubenswrapper[4781]: I1202 09:26:25.900206 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 02 09:26:25 crc kubenswrapper[4781]: I1202 09:26:25.913690 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 02 09:26:26 crc kubenswrapper[4781]: I1202 09:26:26.788578 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 02 09:26:27 crc kubenswrapper[4781]: I1202 09:26:27.379781 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 02 09:26:27 crc kubenswrapper[4781]: I1202 09:26:27.645848 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 02 09:26:27 crc kubenswrapper[4781]: I1202 09:26:27.715623 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 02 09:26:27 crc kubenswrapper[4781]: I1202 09:26:27.751611 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 02 09:26:28 crc kubenswrapper[4781]: I1202 09:26:28.179725 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 02 09:26:28 crc kubenswrapper[4781]: I1202 09:26:28.472009 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 02 09:26:28 crc kubenswrapper[4781]: I1202 09:26:28.499765 4781 scope.go:117] "RemoveContainer" containerID="e6b3d3391ecb8272caff28a991b410bb211ab66847313da178ba369f96048c64" Dec 02 09:26:28 crc kubenswrapper[4781]: I1202 09:26:28.511851 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 02 09:26:28 crc kubenswrapper[4781]: I1202 09:26:28.534694 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 02 09:26:28 crc kubenswrapper[4781]: I1202 09:26:28.668811 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 02 09:26:28 crc kubenswrapper[4781]: I1202 09:26:28.680184 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 02 09:26:30 crc kubenswrapper[4781]: I1202 09:26:30.222815 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 02 09:26:30 crc kubenswrapper[4781]: I1202 09:26:30.314671 4781 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 02 09:26:30 crc kubenswrapper[4781]: I1202 09:26:30.533440 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 02 09:26:30 crc kubenswrapper[4781]: I1202 09:26:30.546286 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 02 09:26:31 crc kubenswrapper[4781]: I1202 09:26:31.440561 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-86f4ddc759-7xxp5_7b66cd08-6bc4-48ce-9002-eb87e6ae27eb/oauth-openshift/2.log" Dec 02 09:26:31 crc kubenswrapper[4781]: I1202 09:26:31.440611 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" event={"ID":"7b66cd08-6bc4-48ce-9002-eb87e6ae27eb","Type":"ContainerStarted","Data":"3a77251971f34330794f4997c99bfdc04c896514ac1374f54fa9a7907edaac53"} Dec 02 09:26:31 crc kubenswrapper[4781]: I1202 09:26:31.441113 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:26:31 crc kubenswrapper[4781]: I1202 09:26:31.446298 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-86f4ddc759-7xxp5" Dec 02 09:26:31 crc kubenswrapper[4781]: I1202 09:26:31.592801 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 02 09:26:31 crc kubenswrapper[4781]: I1202 09:26:31.859074 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 02 09:26:32 crc kubenswrapper[4781]: I1202 09:26:32.269679 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 02 09:26:32 crc kubenswrapper[4781]: I1202 09:26:32.344072 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 02 09:26:32 crc kubenswrapper[4781]: I1202 09:26:32.430609 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 02 09:26:32 crc kubenswrapper[4781]: I1202 09:26:32.839042 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 02 09:26:33 crc kubenswrapper[4781]: I1202 09:26:33.056665 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 02 09:26:34 crc kubenswrapper[4781]: I1202 09:26:34.305031 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 02 09:26:34 crc kubenswrapper[4781]: I1202 09:26:34.599673 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 02 09:26:35 crc kubenswrapper[4781]: I1202 09:26:35.480167 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 02 09:26:36 crc kubenswrapper[4781]: I1202 09:26:36.662138 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 02 09:26:37 crc kubenswrapper[4781]: I1202 09:26:37.490344 4781 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 02 09:26:37 crc kubenswrapper[4781]: I1202 09:26:37.673855 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 02 09:26:37 crc kubenswrapper[4781]: I1202 09:26:37.784068 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 02 09:26:38 crc kubenswrapper[4781]: I1202 09:26:38.328976 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 02 09:26:38 crc kubenswrapper[4781]: I1202 09:26:38.518332 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 02 09:26:38 crc kubenswrapper[4781]: I1202 09:26:38.931562 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 02 09:26:39 crc kubenswrapper[4781]: I1202 09:26:39.475090 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 02 09:26:40 crc kubenswrapper[4781]: I1202 09:26:40.064202 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 02 09:26:40 crc kubenswrapper[4781]: I1202 09:26:40.773214 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 02 09:26:41 crc kubenswrapper[4781]: I1202 09:26:41.654380 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 02 09:26:41 crc kubenswrapper[4781]: I1202 09:26:41.875845 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 02 09:26:42 crc kubenswrapper[4781]: I1202 09:26:42.268713 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 02 09:26:42 crc kubenswrapper[4781]: I1202 09:26:42.420897 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 02 09:26:43 crc kubenswrapper[4781]: I1202 09:26:43.769330 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 02 09:26:44 crc kubenswrapper[4781]: I1202 09:26:44.088394 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 02 09:26:44 crc kubenswrapper[4781]: I1202 09:26:44.101435 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 02 09:26:44 crc kubenswrapper[4781]: I1202 09:26:44.620681 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 02 09:26:45 crc kubenswrapper[4781]: I1202 09:26:45.769268 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 02 09:26:47 crc kubenswrapper[4781]: I1202 09:26:47.933592 4781 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 02 09:26:54 crc kubenswrapper[4781]: I1202 09:26:54.578341 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6wrt7"] Dec 02 09:26:54 crc kubenswrapper[4781]: I1202 09:26:54.579172 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-6wrt7" podUID="4297bd58-caf6-4962-b775-7f454787fa91" containerName="controller-manager" containerID="cri-o://7f69e33611d273b03b3a67cee2000c7947df603600fd959b892c32e21bdcd19b" gracePeriod=30 Dec 02 09:26:54 crc kubenswrapper[4781]: I1202 09:26:54.669791 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-w2stl"] Dec 02 09:26:54 crc kubenswrapper[4781]: I1202 09:26:54.670106 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w2stl" podUID="2be1529f-3b01-4174-b029-7312871f5b97" containerName="route-controller-manager" containerID="cri-o://7b71c0945ab9a655dd3c4bd20c949888c381ff7c91c43ee5673af236b0846e66" gracePeriod=30 Dec 02 09:26:54 crc kubenswrapper[4781]: I1202 09:26:54.685697 4781 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-6wrt7 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Dec 02 09:26:54 crc kubenswrapper[4781]: I1202 09:26:54.685782 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-6wrt7" podUID="4297bd58-caf6-4962-b775-7f454787fa91" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" Dec 02 09:26:54 crc kubenswrapper[4781]: I1202 09:26:54.687326 4781 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-w2stl container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Dec 02 09:26:54 crc kubenswrapper[4781]: I1202 09:26:54.687365 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w2stl" podUID="2be1529f-3b01-4174-b029-7312871f5b97" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Dec 02 09:26:55 crc kubenswrapper[4781]: I1202 09:26:55.504700 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w2stl" Dec 02 09:26:55 crc kubenswrapper[4781]: I1202 09:26:55.579094 4781 generic.go:334] "Generic (PLEG): container finished" podID="4297bd58-caf6-4962-b775-7f454787fa91" containerID="7f69e33611d273b03b3a67cee2000c7947df603600fd959b892c32e21bdcd19b" exitCode=0 Dec 02 09:26:55 crc kubenswrapper[4781]: I1202 09:26:55.579179 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6wrt7" event={"ID":"4297bd58-caf6-4962-b775-7f454787fa91","Type":"ContainerDied","Data":"7f69e33611d273b03b3a67cee2000c7947df603600fd959b892c32e21bdcd19b"} Dec 02 09:26:55 crc kubenswrapper[4781]: I1202 09:26:55.581021 4781 generic.go:334] "Generic (PLEG): container finished" podID="2be1529f-3b01-4174-b029-7312871f5b97" containerID="7b71c0945ab9a655dd3c4bd20c949888c381ff7c91c43ee5673af236b0846e66" exitCode=0 Dec 02 09:26:55 crc kubenswrapper[4781]: I1202 09:26:55.581055 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w2stl" event={"ID":"2be1529f-3b01-4174-b029-7312871f5b97","Type":"ContainerDied","Data":"7b71c0945ab9a655dd3c4bd20c949888c381ff7c91c43ee5673af236b0846e66"} Dec 02 09:26:55 crc kubenswrapper[4781]: I1202 09:26:55.581107 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w2stl" Dec 02 09:26:55 crc kubenswrapper[4781]: I1202 09:26:55.581129 4781 scope.go:117] "RemoveContainer" containerID="7b71c0945ab9a655dd3c4bd20c949888c381ff7c91c43ee5673af236b0846e66" Dec 02 09:26:55 crc kubenswrapper[4781]: I1202 09:26:55.581073 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w2stl" event={"ID":"2be1529f-3b01-4174-b029-7312871f5b97","Type":"ContainerDied","Data":"0236e756ac76500dfc2bad37fbc0485919411dbf55d389469b8ecbd72ea56aa1"} Dec 02 09:26:55 crc kubenswrapper[4781]: I1202 09:26:55.595686 4781 scope.go:117] "RemoveContainer" containerID="7b71c0945ab9a655dd3c4bd20c949888c381ff7c91c43ee5673af236b0846e66" Dec 02 09:26:55 crc kubenswrapper[4781]: E1202 09:26:55.596512 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b71c0945ab9a655dd3c4bd20c949888c381ff7c91c43ee5673af236b0846e66\": container with ID starting with 7b71c0945ab9a655dd3c4bd20c949888c381ff7c91c43ee5673af236b0846e66 not found: ID does not exist" containerID="7b71c0945ab9a655dd3c4bd20c949888c381ff7c91c43ee5673af236b0846e66" Dec 02 09:26:55 crc kubenswrapper[4781]: I1202 09:26:55.596552 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b71c0945ab9a655dd3c4bd20c949888c381ff7c91c43ee5673af236b0846e66"} err="failed to get container status \"7b71c0945ab9a655dd3c4bd20c949888c381ff7c91c43ee5673af236b0846e66\": rpc error: code = NotFound desc = could not find container \"7b71c0945ab9a655dd3c4bd20c949888c381ff7c91c43ee5673af236b0846e66\": container with ID starting with 7b71c0945ab9a655dd3c4bd20c949888c381ff7c91c43ee5673af236b0846e66 not found: ID does not exist" Dec 02 09:26:55 crc kubenswrapper[4781]: I1202 09:26:55.617412 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2be1529f-3b01-4174-b029-7312871f5b97-serving-cert\") 
pod \"2be1529f-3b01-4174-b029-7312871f5b97\" (UID: \"2be1529f-3b01-4174-b029-7312871f5b97\") " Dec 02 09:26:55 crc kubenswrapper[4781]: I1202 09:26:55.617507 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2be1529f-3b01-4174-b029-7312871f5b97-client-ca\") pod \"2be1529f-3b01-4174-b029-7312871f5b97\" (UID: \"2be1529f-3b01-4174-b029-7312871f5b97\") " Dec 02 09:26:55 crc kubenswrapper[4781]: I1202 09:26:55.617534 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2g2wz\" (UniqueName: \"kubernetes.io/projected/2be1529f-3b01-4174-b029-7312871f5b97-kube-api-access-2g2wz\") pod \"2be1529f-3b01-4174-b029-7312871f5b97\" (UID: \"2be1529f-3b01-4174-b029-7312871f5b97\") " Dec 02 09:26:55 crc kubenswrapper[4781]: I1202 09:26:55.617564 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2be1529f-3b01-4174-b029-7312871f5b97-config\") pod \"2be1529f-3b01-4174-b029-7312871f5b97\" (UID: \"2be1529f-3b01-4174-b029-7312871f5b97\") " Dec 02 09:26:55 crc kubenswrapper[4781]: I1202 09:26:55.618981 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2be1529f-3b01-4174-b029-7312871f5b97-client-ca" (OuterVolumeSpecName: "client-ca") pod "2be1529f-3b01-4174-b029-7312871f5b97" (UID: "2be1529f-3b01-4174-b029-7312871f5b97"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:26:55 crc kubenswrapper[4781]: I1202 09:26:55.619195 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2be1529f-3b01-4174-b029-7312871f5b97-config" (OuterVolumeSpecName: "config") pod "2be1529f-3b01-4174-b029-7312871f5b97" (UID: "2be1529f-3b01-4174-b029-7312871f5b97"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:26:55 crc kubenswrapper[4781]: I1202 09:26:55.627460 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2be1529f-3b01-4174-b029-7312871f5b97-kube-api-access-2g2wz" (OuterVolumeSpecName: "kube-api-access-2g2wz") pod "2be1529f-3b01-4174-b029-7312871f5b97" (UID: "2be1529f-3b01-4174-b029-7312871f5b97"). InnerVolumeSpecName "kube-api-access-2g2wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:26:55 crc kubenswrapper[4781]: I1202 09:26:55.627487 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2be1529f-3b01-4174-b029-7312871f5b97-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2be1529f-3b01-4174-b029-7312871f5b97" (UID: "2be1529f-3b01-4174-b029-7312871f5b97"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:26:55 crc kubenswrapper[4781]: I1202 09:26:55.719456 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2be1529f-3b01-4174-b029-7312871f5b97-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 09:26:55 crc kubenswrapper[4781]: I1202 09:26:55.719492 4781 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2be1529f-3b01-4174-b029-7312871f5b97-client-ca\") on node \"crc\" DevicePath \"\"" Dec 02 09:26:55 crc kubenswrapper[4781]: I1202 09:26:55.719502 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2g2wz\" (UniqueName: \"kubernetes.io/projected/2be1529f-3b01-4174-b029-7312871f5b97-kube-api-access-2g2wz\") on node \"crc\" DevicePath \"\"" Dec 02 09:26:55 crc kubenswrapper[4781]: I1202 09:26:55.719513 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2be1529f-3b01-4174-b029-7312871f5b97-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:26:55 crc kubenswrapper[4781]: I1202 09:26:55.907019 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-w2stl"] Dec 02 09:26:55 crc kubenswrapper[4781]: I1202 09:26:55.912901 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-w2stl"] Dec 02 09:26:56 crc kubenswrapper[4781]: I1202 09:26:56.114990 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77fd659786-gqxw8"] Dec 02 09:26:56 crc kubenswrapper[4781]: E1202 09:26:56.115198 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e5610cf-1c3f-4010-a4ee-4820372400c1" containerName="extract-utilities" Dec 02 09:26:56 crc kubenswrapper[4781]: I1202 09:26:56.115213 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e5610cf-1c3f-4010-a4ee-4820372400c1" containerName="extract-utilities" Dec 02 09:26:56 crc kubenswrapper[4781]: E1202 09:26:56.115228 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5df92e64-6fc4-4395-a919-b1c46c30d318" containerName="installer" Dec 02 09:26:56 crc kubenswrapper[4781]: I1202 09:26:56.115234 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="5df92e64-6fc4-4395-a919-b1c46c30d318" containerName="installer" Dec 02 09:26:56 crc kubenswrapper[4781]: E1202 09:26:56.115245 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e5610cf-1c3f-4010-a4ee-4820372400c1" containerName="extract-content" Dec 02 09:26:56 crc kubenswrapper[4781]: I1202 09:26:56.115252 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e5610cf-1c3f-4010-a4ee-4820372400c1" containerName="extract-content" Dec 02 09:26:56 crc kubenswrapper[4781]: E1202 09:26:56.115263 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2be1529f-3b01-4174-b029-7312871f5b97" containerName="route-controller-manager" Dec 02 09:26:56 crc kubenswrapper[4781]: I1202 09:26:56.115268 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2be1529f-3b01-4174-b029-7312871f5b97" containerName="route-controller-manager" Dec 02 09:26:56 crc kubenswrapper[4781]: E1202 09:26:56.115282 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 02 09:26:56 crc kubenswrapper[4781]: I1202 
09:26:56.115290 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 02 09:26:56 crc kubenswrapper[4781]: E1202 09:26:56.115300 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e5610cf-1c3f-4010-a4ee-4820372400c1" containerName="registry-server" Dec 02 09:26:56 crc kubenswrapper[4781]: I1202 09:26:56.115306 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e5610cf-1c3f-4010-a4ee-4820372400c1" containerName="registry-server" Dec 02 09:26:56 crc kubenswrapper[4781]: I1202 09:26:56.115391 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="2be1529f-3b01-4174-b029-7312871f5b97" containerName="route-controller-manager" Dec 02 09:26:56 crc kubenswrapper[4781]: I1202 09:26:56.115402 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 02 09:26:56 crc kubenswrapper[4781]: I1202 09:26:56.115412 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="5df92e64-6fc4-4395-a919-b1c46c30d318" containerName="installer" Dec 02 09:26:56 crc kubenswrapper[4781]: I1202 09:26:56.115419 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e5610cf-1c3f-4010-a4ee-4820372400c1" containerName="registry-server" Dec 02 09:26:56 crc kubenswrapper[4781]: I1202 09:26:56.115781 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77fd659786-gqxw8" Dec 02 09:26:56 crc kubenswrapper[4781]: I1202 09:26:56.117653 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 02 09:26:56 crc kubenswrapper[4781]: I1202 09:26:56.118748 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 02 09:26:56 crc kubenswrapper[4781]: I1202 09:26:56.118984 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 02 09:26:56 crc kubenswrapper[4781]: I1202 09:26:56.119293 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 02 09:26:56 crc kubenswrapper[4781]: I1202 09:26:56.119469 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 02 09:26:56 crc kubenswrapper[4781]: I1202 09:26:56.119683 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 02 09:26:56 crc kubenswrapper[4781]: I1202 09:26:56.123724 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77fd659786-gqxw8"] Dec 02 09:26:56 crc kubenswrapper[4781]: I1202 09:26:56.224842 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwszz\" (UniqueName: \"kubernetes.io/projected/430c3ad6-95a3-4493-a048-07d499e6f057-kube-api-access-cwszz\") pod \"route-controller-manager-77fd659786-gqxw8\" (UID: \"430c3ad6-95a3-4493-a048-07d499e6f057\") " pod="openshift-route-controller-manager/route-controller-manager-77fd659786-gqxw8" Dec 02 09:26:56 crc kubenswrapper[4781]: I1202 09:26:56.224906 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/430c3ad6-95a3-4493-a048-07d499e6f057-config\") pod \"route-controller-manager-77fd659786-gqxw8\" (UID: \"430c3ad6-95a3-4493-a048-07d499e6f057\") " pod="openshift-route-controller-manager/route-controller-manager-77fd659786-gqxw8" Dec 02 09:26:56 crc kubenswrapper[4781]: I1202 09:26:56.224953 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/430c3ad6-95a3-4493-a048-07d499e6f057-serving-cert\") pod \"route-controller-manager-77fd659786-gqxw8\" (UID: \"430c3ad6-95a3-4493-a048-07d499e6f057\") " pod="openshift-route-controller-manager/route-controller-manager-77fd659786-gqxw8" Dec 02 09:26:56 crc kubenswrapper[4781]: I1202 09:26:56.225027 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/430c3ad6-95a3-4493-a048-07d499e6f057-client-ca\") pod \"route-controller-manager-77fd659786-gqxw8\" (UID: \"430c3ad6-95a3-4493-a048-07d499e6f057\") " pod="openshift-route-controller-manager/route-controller-manager-77fd659786-gqxw8" Dec 02 09:26:56 crc kubenswrapper[4781]: I1202 09:26:56.326492 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwszz\" (UniqueName: \"kubernetes.io/projected/430c3ad6-95a3-4493-a048-07d499e6f057-kube-api-access-cwszz\") pod \"route-controller-manager-77fd659786-gqxw8\" (UID: \"430c3ad6-95a3-4493-a048-07d499e6f057\") " pod="openshift-route-controller-manager/route-controller-manager-77fd659786-gqxw8" Dec 02 09:26:56 crc kubenswrapper[4781]: I1202 09:26:56.326550 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/430c3ad6-95a3-4493-a048-07d499e6f057-config\") pod \"route-controller-manager-77fd659786-gqxw8\" (UID: \"430c3ad6-95a3-4493-a048-07d499e6f057\") " pod="openshift-route-controller-manager/route-controller-manager-77fd659786-gqxw8" Dec 02 09:26:56 crc kubenswrapper[4781]: I1202 09:26:56.326574 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/430c3ad6-95a3-4493-a048-07d499e6f057-serving-cert\") pod \"route-controller-manager-77fd659786-gqxw8\" (UID: \"430c3ad6-95a3-4493-a048-07d499e6f057\") " pod="openshift-route-controller-manager/route-controller-manager-77fd659786-gqxw8" Dec 02 09:26:56 crc kubenswrapper[4781]: I1202 09:26:56.326617 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/430c3ad6-95a3-4493-a048-07d499e6f057-client-ca\") pod \"route-controller-manager-77fd659786-gqxw8\" (UID: \"430c3ad6-95a3-4493-a048-07d499e6f057\") " pod="openshift-route-controller-manager/route-controller-manager-77fd659786-gqxw8" Dec 02 09:26:56 crc kubenswrapper[4781]: I1202 09:26:56.327525 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/430c3ad6-95a3-4493-a048-07d499e6f057-client-ca\") pod \"route-controller-manager-77fd659786-gqxw8\" (UID: \"430c3ad6-95a3-4493-a048-07d499e6f057\") " pod="openshift-route-controller-manager/route-controller-manager-77fd659786-gqxw8" Dec 02 09:26:56 crc kubenswrapper[4781]: I1202 09:26:56.327747 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/430c3ad6-95a3-4493-a048-07d499e6f057-config\") pod \"route-controller-manager-77fd659786-gqxw8\" (UID: \"430c3ad6-95a3-4493-a048-07d499e6f057\") " pod="openshift-route-controller-manager/route-controller-manager-77fd659786-gqxw8" Dec 02 09:26:56 crc kubenswrapper[4781]: I1202 09:26:56.329838 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/430c3ad6-95a3-4493-a048-07d499e6f057-serving-cert\") pod \"route-controller-manager-77fd659786-gqxw8\" (UID: \"430c3ad6-95a3-4493-a048-07d499e6f057\") " pod="openshift-route-controller-manager/route-controller-manager-77fd659786-gqxw8" Dec 02 09:26:56 crc kubenswrapper[4781]: I1202 09:26:56.347005 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwszz\" (UniqueName: \"kubernetes.io/projected/430c3ad6-95a3-4493-a048-07d499e6f057-kube-api-access-cwszz\") pod \"route-controller-manager-77fd659786-gqxw8\" (UID: \"430c3ad6-95a3-4493-a048-07d499e6f057\") " pod="openshift-route-controller-manager/route-controller-manager-77fd659786-gqxw8" Dec 02 09:26:56 crc kubenswrapper[4781]: I1202 09:26:56.430533 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77fd659786-gqxw8" Dec 02 09:26:56 crc kubenswrapper[4781]: I1202 09:26:56.663998 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77fd659786-gqxw8"] Dec 02 09:26:57 crc kubenswrapper[4781]: I1202 09:26:57.512076 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2be1529f-3b01-4174-b029-7312871f5b97" path="/var/lib/kubelet/pods/2be1529f-3b01-4174-b029-7312871f5b97/volumes" Dec 02 09:26:57 crc kubenswrapper[4781]: I1202 09:26:57.598031 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-77fd659786-gqxw8" event={"ID":"430c3ad6-95a3-4493-a048-07d499e6f057","Type":"ContainerStarted","Data":"0e618e569252fbac385fde74e1cf84250183fc8c7e971f77de871c44cccd2b93"} Dec 02 09:26:57 crc kubenswrapper[4781]: I1202 09:26:57.598080 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-77fd659786-gqxw8" event={"ID":"430c3ad6-95a3-4493-a048-07d499e6f057","Type":"ContainerStarted","Data":"1b8780b5c07f836330256957da93d97fab606aa48d5d85a8c7b512d478ab3b6c"} Dec 02 09:26:57 crc kubenswrapper[4781]: I1202 09:26:57.821882 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6wrt7" Dec 02 09:26:57 crc kubenswrapper[4781]: I1202 09:26:57.849487 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7bc5fc4fbc-lgt4p"] Dec 02 09:26:57 crc kubenswrapper[4781]: E1202 09:26:57.849685 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4297bd58-caf6-4962-b775-7f454787fa91" containerName="controller-manager" Dec 02 09:26:57 crc kubenswrapper[4781]: I1202 09:26:57.849695 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="4297bd58-caf6-4962-b775-7f454787fa91" containerName="controller-manager" Dec 02 09:26:57 crc kubenswrapper[4781]: I1202 09:26:57.849799 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="4297bd58-caf6-4962-b775-7f454787fa91" containerName="controller-manager" Dec 02 09:26:57 crc kubenswrapper[4781]: I1202 09:26:57.850138 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7bc5fc4fbc-lgt4p" Dec 02 09:26:57 crc kubenswrapper[4781]: I1202 09:26:57.870345 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7bc5fc4fbc-lgt4p"] Dec 02 09:26:57 crc kubenswrapper[4781]: I1202 09:26:57.950668 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnw8t\" (UniqueName: \"kubernetes.io/projected/4297bd58-caf6-4962-b775-7f454787fa91-kube-api-access-xnw8t\") pod \"4297bd58-caf6-4962-b775-7f454787fa91\" (UID: \"4297bd58-caf6-4962-b775-7f454787fa91\") " Dec 02 09:26:57 crc kubenswrapper[4781]: I1202 09:26:57.950707 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4297bd58-caf6-4962-b775-7f454787fa91-proxy-ca-bundles\") pod \"4297bd58-caf6-4962-b775-7f454787fa91\" (UID: \"4297bd58-caf6-4962-b775-7f454787fa91\") " Dec 02 09:26:57 crc kubenswrapper[4781]: I1202 09:26:57.950759 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4297bd58-caf6-4962-b775-7f454787fa91-client-ca\") pod \"4297bd58-caf6-4962-b775-7f454787fa91\" (UID: \"4297bd58-caf6-4962-b775-7f454787fa91\") " Dec 02 09:26:57 crc kubenswrapper[4781]: I1202 09:26:57.950798 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4297bd58-caf6-4962-b775-7f454787fa91-config\") pod \"4297bd58-caf6-4962-b775-7f454787fa91\" (UID: \"4297bd58-caf6-4962-b775-7f454787fa91\") " Dec 02 09:26:57 crc kubenswrapper[4781]: I1202 09:26:57.950851 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4297bd58-caf6-4962-b775-7f454787fa91-serving-cert\") pod \"4297bd58-caf6-4962-b775-7f454787fa91\" (UID: \"4297bd58-caf6-4962-b775-7f454787fa91\") " Dec 02 09:26:57 crc kubenswrapper[4781]: I1202 09:26:57.951433 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4297bd58-caf6-4962-b775-7f454787fa91-client-ca" (OuterVolumeSpecName: "client-ca") pod "4297bd58-caf6-4962-b775-7f454787fa91" (UID: "4297bd58-caf6-4962-b775-7f454787fa91"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:26:57 crc kubenswrapper[4781]: I1202 09:26:57.951544 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4297bd58-caf6-4962-b775-7f454787fa91-config" (OuterVolumeSpecName: "config") pod "4297bd58-caf6-4962-b775-7f454787fa91" (UID: "4297bd58-caf6-4962-b775-7f454787fa91"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:26:57 crc kubenswrapper[4781]: I1202 09:26:57.951700 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6a48c88-37b6-4b79-a98d-2ad5c1322833-serving-cert\") pod \"controller-manager-7bc5fc4fbc-lgt4p\" (UID: \"e6a48c88-37b6-4b79-a98d-2ad5c1322833\") " pod="openshift-controller-manager/controller-manager-7bc5fc4fbc-lgt4p" Dec 02 09:26:57 crc kubenswrapper[4781]: I1202 09:26:57.951769 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e6a48c88-37b6-4b79-a98d-2ad5c1322833-proxy-ca-bundles\") pod \"controller-manager-7bc5fc4fbc-lgt4p\" (UID: \"e6a48c88-37b6-4b79-a98d-2ad5c1322833\") " pod="openshift-controller-manager/controller-manager-7bc5fc4fbc-lgt4p" Dec 02 09:26:57 crc kubenswrapper[4781]: I1202 09:26:57.951800 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6a48c88-37b6-4b79-a98d-2ad5c1322833-config\") pod \"controller-manager-7bc5fc4fbc-lgt4p\" (UID: \"e6a48c88-37b6-4b79-a98d-2ad5c1322833\") " pod="openshift-controller-manager/controller-manager-7bc5fc4fbc-lgt4p" Dec 02 09:26:57 crc kubenswrapper[4781]: I1202 09:26:57.951822 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrzr8\" (UniqueName: \"kubernetes.io/projected/e6a48c88-37b6-4b79-a98d-2ad5c1322833-kube-api-access-hrzr8\") pod \"controller-manager-7bc5fc4fbc-lgt4p\" (UID: \"e6a48c88-37b6-4b79-a98d-2ad5c1322833\") " pod="openshift-controller-manager/controller-manager-7bc5fc4fbc-lgt4p" Dec 02 09:26:57 crc kubenswrapper[4781]: I1202 09:26:57.951847 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e6a48c88-37b6-4b79-a98d-2ad5c1322833-client-ca\") pod \"controller-manager-7bc5fc4fbc-lgt4p\" (UID: \"e6a48c88-37b6-4b79-a98d-2ad5c1322833\") " pod="openshift-controller-manager/controller-manager-7bc5fc4fbc-lgt4p" Dec 02 09:26:57 crc kubenswrapper[4781]: I1202 09:26:57.951909 4781 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4297bd58-caf6-4962-b775-7f454787fa91-client-ca\") on node \"crc\" DevicePath \"\"" Dec 02 09:26:57 crc kubenswrapper[4781]: I1202 09:26:57.951938 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4297bd58-caf6-4962-b775-7f454787fa91-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:26:57 crc kubenswrapper[4781]: I1202 09:26:57.952049 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4297bd58-caf6-4962-b775-7f454787fa91-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "4297bd58-caf6-4962-b775-7f454787fa91" (UID: "4297bd58-caf6-4962-b775-7f454787fa91"). 
InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:26:57 crc kubenswrapper[4781]: I1202 09:26:57.962175 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4297bd58-caf6-4962-b775-7f454787fa91-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4297bd58-caf6-4962-b775-7f454787fa91" (UID: "4297bd58-caf6-4962-b775-7f454787fa91"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:26:57 crc kubenswrapper[4781]: I1202 09:26:57.973581 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4297bd58-caf6-4962-b775-7f454787fa91-kube-api-access-xnw8t" (OuterVolumeSpecName: "kube-api-access-xnw8t") pod "4297bd58-caf6-4962-b775-7f454787fa91" (UID: "4297bd58-caf6-4962-b775-7f454787fa91"). InnerVolumeSpecName "kube-api-access-xnw8t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:26:58 crc kubenswrapper[4781]: I1202 09:26:58.052729 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6a48c88-37b6-4b79-a98d-2ad5c1322833-config\") pod \"controller-manager-7bc5fc4fbc-lgt4p\" (UID: \"e6a48c88-37b6-4b79-a98d-2ad5c1322833\") " pod="openshift-controller-manager/controller-manager-7bc5fc4fbc-lgt4p" Dec 02 09:26:58 crc kubenswrapper[4781]: I1202 09:26:58.052780 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrzr8\" (UniqueName: \"kubernetes.io/projected/e6a48c88-37b6-4b79-a98d-2ad5c1322833-kube-api-access-hrzr8\") pod \"controller-manager-7bc5fc4fbc-lgt4p\" (UID: \"e6a48c88-37b6-4b79-a98d-2ad5c1322833\") " pod="openshift-controller-manager/controller-manager-7bc5fc4fbc-lgt4p" Dec 02 09:26:58 crc kubenswrapper[4781]: I1202 09:26:58.052812 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e6a48c88-37b6-4b79-a98d-2ad5c1322833-client-ca\") pod \"controller-manager-7bc5fc4fbc-lgt4p\" (UID: \"e6a48c88-37b6-4b79-a98d-2ad5c1322833\") " pod="openshift-controller-manager/controller-manager-7bc5fc4fbc-lgt4p" Dec 02 09:26:58 crc kubenswrapper[4781]: I1202 09:26:58.052850 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6a48c88-37b6-4b79-a98d-2ad5c1322833-serving-cert\") pod \"controller-manager-7bc5fc4fbc-lgt4p\" (UID: \"e6a48c88-37b6-4b79-a98d-2ad5c1322833\") " pod="openshift-controller-manager/controller-manager-7bc5fc4fbc-lgt4p" Dec 02 09:26:58 crc kubenswrapper[4781]: I1202 09:26:58.052901 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e6a48c88-37b6-4b79-a98d-2ad5c1322833-proxy-ca-bundles\") pod \"controller-manager-7bc5fc4fbc-lgt4p\" (UID: \"e6a48c88-37b6-4b79-a98d-2ad5c1322833\") " pod="openshift-controller-manager/controller-manager-7bc5fc4fbc-lgt4p" Dec 02 09:26:58 crc kubenswrapper[4781]: I1202 09:26:58.052958 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnw8t\" (UniqueName: \"kubernetes.io/projected/4297bd58-caf6-4962-b775-7f454787fa91-kube-api-access-xnw8t\") on node \"crc\" DevicePath \"\"" Dec 02 09:26:58 crc kubenswrapper[4781]: I1202 09:26:58.052969 4781 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/4297bd58-caf6-4962-b775-7f454787fa91-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 02 09:26:58 crc kubenswrapper[4781]: I1202 09:26:58.052978 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4297bd58-caf6-4962-b775-7f454787fa91-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 09:26:58 crc kubenswrapper[4781]: I1202 09:26:58.053991 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e6a48c88-37b6-4b79-a98d-2ad5c1322833-client-ca\") pod \"controller-manager-7bc5fc4fbc-lgt4p\" (UID: \"e6a48c88-37b6-4b79-a98d-2ad5c1322833\") " pod="openshift-controller-manager/controller-manager-7bc5fc4fbc-lgt4p" Dec 02 09:26:58 crc kubenswrapper[4781]: I1202 09:26:58.054233 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e6a48c88-37b6-4b79-a98d-2ad5c1322833-proxy-ca-bundles\") pod \"controller-manager-7bc5fc4fbc-lgt4p\" (UID: \"e6a48c88-37b6-4b79-a98d-2ad5c1322833\") " pod="openshift-controller-manager/controller-manager-7bc5fc4fbc-lgt4p" Dec 02 09:26:58 crc kubenswrapper[4781]: I1202 09:26:58.055015 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6a48c88-37b6-4b79-a98d-2ad5c1322833-config\") pod \"controller-manager-7bc5fc4fbc-lgt4p\" (UID: \"e6a48c88-37b6-4b79-a98d-2ad5c1322833\") " pod="openshift-controller-manager/controller-manager-7bc5fc4fbc-lgt4p" Dec 02 09:26:58 crc kubenswrapper[4781]: I1202 09:26:58.056811 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6a48c88-37b6-4b79-a98d-2ad5c1322833-serving-cert\") pod \"controller-manager-7bc5fc4fbc-lgt4p\" (UID: \"e6a48c88-37b6-4b79-a98d-2ad5c1322833\") " pod="openshift-controller-manager/controller-manager-7bc5fc4fbc-lgt4p" Dec 02 09:26:58 crc kubenswrapper[4781]: I1202 09:26:58.074476 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrzr8\" (UniqueName: \"kubernetes.io/projected/e6a48c88-37b6-4b79-a98d-2ad5c1322833-kube-api-access-hrzr8\") pod \"controller-manager-7bc5fc4fbc-lgt4p\" (UID: \"e6a48c88-37b6-4b79-a98d-2ad5c1322833\") " pod="openshift-controller-manager/controller-manager-7bc5fc4fbc-lgt4p" Dec 02 09:26:58 crc kubenswrapper[4781]: I1202 09:26:58.175781 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7bc5fc4fbc-lgt4p" Dec 02 09:26:58 crc kubenswrapper[4781]: I1202 09:26:58.416464 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7bc5fc4fbc-lgt4p"] Dec 02 09:26:58 crc kubenswrapper[4781]: W1202 09:26:58.421761 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6a48c88_37b6_4b79_a98d_2ad5c1322833.slice/crio-41d382349be550dcda4885a982ba3d277f57371e264a533a4b4ac5f4e1118cfb WatchSource:0}: Error finding container 41d382349be550dcda4885a982ba3d277f57371e264a533a4b4ac5f4e1118cfb: Status 404 returned error can't find the container with id 41d382349be550dcda4885a982ba3d277f57371e264a533a4b4ac5f4e1118cfb Dec 02 09:26:58 crc kubenswrapper[4781]: I1202 09:26:58.604217 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bc5fc4fbc-lgt4p" event={"ID":"e6a48c88-37b6-4b79-a98d-2ad5c1322833","Type":"ContainerStarted","Data":"41d382349be550dcda4885a982ba3d277f57371e264a533a4b4ac5f4e1118cfb"} Dec 02 09:26:58 crc kubenswrapper[4781]: I1202 09:26:58.606017 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6wrt7" event={"ID":"4297bd58-caf6-4962-b775-7f454787fa91","Type":"ContainerDied","Data":"8110030450aa8b62bc6b932412e5a5f6c1844c01337d710f47b7983a0bbf3ddf"} Dec 02 09:26:58 crc kubenswrapper[4781]: I1202 09:26:58.606057 4781 scope.go:117] "RemoveContainer" containerID="7f69e33611d273b03b3a67cee2000c7947df603600fd959b892c32e21bdcd19b" Dec 02 09:26:58 crc kubenswrapper[4781]: I1202 09:26:58.606028 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6wrt7" Dec 02 09:26:58 crc kubenswrapper[4781]: I1202 09:26:58.606236 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-77fd659786-gqxw8" Dec 02 09:26:58 crc kubenswrapper[4781]: I1202 09:26:58.611313 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-77fd659786-gqxw8" Dec 02 09:26:58 crc kubenswrapper[4781]: I1202 09:26:58.624484 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-77fd659786-gqxw8" podStartSLOduration=2.624463396 podStartE2EDuration="2.624463396s" podCreationTimestamp="2025-12-02 09:26:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:26:58.622315377 +0000 UTC m=+381.446189256" watchObservedRunningTime="2025-12-02 09:26:58.624463396 +0000 UTC m=+381.448337275" Dec 02 09:26:58 crc kubenswrapper[4781]: I1202 09:26:58.639171 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6wrt7"] Dec 02 09:26:58 crc kubenswrapper[4781]: I1202 09:26:58.642759 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6wrt7"] Dec 02 09:26:59 crc kubenswrapper[4781]: I1202 09:26:59.507315 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4297bd58-caf6-4962-b775-7f454787fa91" path="/var/lib/kubelet/pods/4297bd58-caf6-4962-b775-7f454787fa91/volumes" Dec 02 09:26:59 crc kubenswrapper[4781]: I1202 09:26:59.613225 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bc5fc4fbc-lgt4p" event={"ID":"e6a48c88-37b6-4b79-a98d-2ad5c1322833","Type":"ContainerStarted","Data":"5871fc04e35b21d12531da6936b136ec3ab7096e42976fda07a9e988f88a07fe"} Dec 02 09:26:59 crc kubenswrapper[4781]: I1202 09:26:59.613296 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7bc5fc4fbc-lgt4p" Dec 02 09:26:59 crc kubenswrapper[4781]: I1202 09:26:59.618000 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7bc5fc4fbc-lgt4p" Dec 02 09:26:59 crc kubenswrapper[4781]: I1202 09:26:59.630635 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7bc5fc4fbc-lgt4p" podStartSLOduration=3.630612705 podStartE2EDuration="3.630612705s" podCreationTimestamp="2025-12-02 09:26:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:26:59.629488624 +0000 UTC m=+382.453362503" watchObservedRunningTime="2025-12-02 09:26:59.630612705 +0000 UTC m=+382.454486584" Dec 02 09:27:00 crc kubenswrapper[4781]: I1202 09:27:00.412488 4781 patch_prober.go:28] interesting pod/machine-config-daemon-pzntm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 09:27:00 crc kubenswrapper[4781]: I1202 09:27:00.412775 4781 prober.go:107] "Probe 
failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 09:27:15 crc kubenswrapper[4781]: I1202 09:27:15.884519 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7bc5fc4fbc-lgt4p"] Dec 02 09:27:15 crc kubenswrapper[4781]: I1202 09:27:15.885355 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7bc5fc4fbc-lgt4p" podUID="e6a48c88-37b6-4b79-a98d-2ad5c1322833" containerName="controller-manager" containerID="cri-o://5871fc04e35b21d12531da6936b136ec3ab7096e42976fda07a9e988f88a07fe" gracePeriod=30 Dec 02 09:27:15 crc kubenswrapper[4781]: I1202 09:27:15.916536 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77fd659786-gqxw8"] Dec 02 09:27:15 crc kubenswrapper[4781]: I1202 09:27:15.916832 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-77fd659786-gqxw8" podUID="430c3ad6-95a3-4493-a048-07d499e6f057" containerName="route-controller-manager" containerID="cri-o://0e618e569252fbac385fde74e1cf84250183fc8c7e971f77de871c44cccd2b93" gracePeriod=30 Dec 02 09:27:16 crc kubenswrapper[4781]: I1202 09:27:16.431604 4781 patch_prober.go:28] interesting pod/route-controller-manager-77fd659786-gqxw8 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.57:8443/healthz\": dial tcp 10.217.0.57:8443: connect: connection refused" start-of-body= Dec 02 09:27:16 crc kubenswrapper[4781]: I1202 09:27:16.431665 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-77fd659786-gqxw8" podUID="430c3ad6-95a3-4493-a048-07d499e6f057" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.57:8443/healthz\": dial tcp 10.217.0.57:8443: connect: connection refused" Dec 02 09:27:17 crc kubenswrapper[4781]: I1202 09:27:17.713192 4781 generic.go:334] "Generic (PLEG): container finished" podID="e6a48c88-37b6-4b79-a98d-2ad5c1322833" containerID="5871fc04e35b21d12531da6936b136ec3ab7096e42976fda07a9e988f88a07fe" exitCode=0 Dec 02 09:27:17 crc kubenswrapper[4781]: I1202 09:27:17.713251 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bc5fc4fbc-lgt4p" event={"ID":"e6a48c88-37b6-4b79-a98d-2ad5c1322833","Type":"ContainerDied","Data":"5871fc04e35b21d12531da6936b136ec3ab7096e42976fda07a9e988f88a07fe"} Dec 02 09:27:17 crc kubenswrapper[4781]: I1202 09:27:17.715854 4781 generic.go:334] "Generic (PLEG): container finished" podID="430c3ad6-95a3-4493-a048-07d499e6f057" containerID="0e618e569252fbac385fde74e1cf84250183fc8c7e971f77de871c44cccd2b93" exitCode=0 Dec 02 09:27:17 crc kubenswrapper[4781]: I1202 09:27:17.715886 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-77fd659786-gqxw8" event={"ID":"430c3ad6-95a3-4493-a048-07d499e6f057","Type":"ContainerDied","Data":"0e618e569252fbac385fde74e1cf84250183fc8c7e971f77de871c44cccd2b93"} Dec 02 09:27:17 crc kubenswrapper[4781]: I1202 
09:27:17.906990 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77fd659786-gqxw8" Dec 02 09:27:17 crc kubenswrapper[4781]: I1202 09:27:17.930964 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-785fdfb9bc-x98w2"] Dec 02 09:27:17 crc kubenswrapper[4781]: E1202 09:27:17.931256 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="430c3ad6-95a3-4493-a048-07d499e6f057" containerName="route-controller-manager" Dec 02 09:27:17 crc kubenswrapper[4781]: I1202 09:27:17.931279 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="430c3ad6-95a3-4493-a048-07d499e6f057" containerName="route-controller-manager" Dec 02 09:27:17 crc kubenswrapper[4781]: I1202 09:27:17.931437 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="430c3ad6-95a3-4493-a048-07d499e6f057" containerName="route-controller-manager" Dec 02 09:27:17 crc kubenswrapper[4781]: I1202 09:27:17.931910 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-785fdfb9bc-x98w2" Dec 02 09:27:17 crc kubenswrapper[4781]: I1202 09:27:17.944158 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-785fdfb9bc-x98w2"] Dec 02 09:27:17 crc kubenswrapper[4781]: I1202 09:27:17.957153 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7bc5fc4fbc-lgt4p" Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.044583 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/430c3ad6-95a3-4493-a048-07d499e6f057-config\") pod \"430c3ad6-95a3-4493-a048-07d499e6f057\" (UID: \"430c3ad6-95a3-4493-a048-07d499e6f057\") " Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.044988 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/430c3ad6-95a3-4493-a048-07d499e6f057-client-ca\") pod \"430c3ad6-95a3-4493-a048-07d499e6f057\" (UID: \"430c3ad6-95a3-4493-a048-07d499e6f057\") " Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.045115 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrzr8\" (UniqueName: \"kubernetes.io/projected/e6a48c88-37b6-4b79-a98d-2ad5c1322833-kube-api-access-hrzr8\") pod \"e6a48c88-37b6-4b79-a98d-2ad5c1322833\" (UID: \"e6a48c88-37b6-4b79-a98d-2ad5c1322833\") " Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.045187 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6a48c88-37b6-4b79-a98d-2ad5c1322833-config\") pod \"e6a48c88-37b6-4b79-a98d-2ad5c1322833\" (UID: \"e6a48c88-37b6-4b79-a98d-2ad5c1322833\") " Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.045237 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e6a48c88-37b6-4b79-a98d-2ad5c1322833-proxy-ca-bundles\") pod \"e6a48c88-37b6-4b79-a98d-2ad5c1322833\" (UID: \"e6a48c88-37b6-4b79-a98d-2ad5c1322833\") " Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.045263 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-cwszz\" (UniqueName: \"kubernetes.io/projected/430c3ad6-95a3-4493-a048-07d499e6f057-kube-api-access-cwszz\") pod \"430c3ad6-95a3-4493-a048-07d499e6f057\" (UID: \"430c3ad6-95a3-4493-a048-07d499e6f057\") " Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.045299 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/430c3ad6-95a3-4493-a048-07d499e6f057-serving-cert\") pod \"430c3ad6-95a3-4493-a048-07d499e6f057\" (UID: \"430c3ad6-95a3-4493-a048-07d499e6f057\") " Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.045415 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/430c3ad6-95a3-4493-a048-07d499e6f057-config" (OuterVolumeSpecName: "config") pod "430c3ad6-95a3-4493-a048-07d499e6f057" (UID: "430c3ad6-95a3-4493-a048-07d499e6f057"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.045493 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0637fa1d-347d-4772-b199-4542ce66909c-config\") pod \"route-controller-manager-785fdfb9bc-x98w2\" (UID: \"0637fa1d-347d-4772-b199-4542ce66909c\") " pod="openshift-route-controller-manager/route-controller-manager-785fdfb9bc-x98w2" Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.045525 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0637fa1d-347d-4772-b199-4542ce66909c-serving-cert\") pod \"route-controller-manager-785fdfb9bc-x98w2\" (UID: \"0637fa1d-347d-4772-b199-4542ce66909c\") " pod="openshift-route-controller-manager/route-controller-manager-785fdfb9bc-x98w2" Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.045675 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt2qt\" (UniqueName: \"kubernetes.io/projected/0637fa1d-347d-4772-b199-4542ce66909c-kube-api-access-wt2qt\") pod \"route-controller-manager-785fdfb9bc-x98w2\" (UID: \"0637fa1d-347d-4772-b199-4542ce66909c\") " pod="openshift-route-controller-manager/route-controller-manager-785fdfb9bc-x98w2" Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.045751 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0637fa1d-347d-4772-b199-4542ce66909c-client-ca\") pod \"route-controller-manager-785fdfb9bc-x98w2\" (UID: \"0637fa1d-347d-4772-b199-4542ce66909c\") " pod="openshift-route-controller-manager/route-controller-manager-785fdfb9bc-x98w2" Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.045880 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/430c3ad6-95a3-4493-a048-07d499e6f057-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.045896 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6a48c88-37b6-4b79-a98d-2ad5c1322833-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e6a48c88-37b6-4b79-a98d-2ad5c1322833" (UID: "e6a48c88-37b6-4b79-a98d-2ad5c1322833"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.045946 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6a48c88-37b6-4b79-a98d-2ad5c1322833-config" (OuterVolumeSpecName: "config") pod "e6a48c88-37b6-4b79-a98d-2ad5c1322833" (UID: "e6a48c88-37b6-4b79-a98d-2ad5c1322833"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.046382 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/430c3ad6-95a3-4493-a048-07d499e6f057-client-ca" (OuterVolumeSpecName: "client-ca") pod "430c3ad6-95a3-4493-a048-07d499e6f057" (UID: "430c3ad6-95a3-4493-a048-07d499e6f057"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.053835 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6a48c88-37b6-4b79-a98d-2ad5c1322833-kube-api-access-hrzr8" (OuterVolumeSpecName: "kube-api-access-hrzr8") pod "e6a48c88-37b6-4b79-a98d-2ad5c1322833" (UID: "e6a48c88-37b6-4b79-a98d-2ad5c1322833"). InnerVolumeSpecName "kube-api-access-hrzr8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.054008 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/430c3ad6-95a3-4493-a048-07d499e6f057-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "430c3ad6-95a3-4493-a048-07d499e6f057" (UID: "430c3ad6-95a3-4493-a048-07d499e6f057"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.054101 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/430c3ad6-95a3-4493-a048-07d499e6f057-kube-api-access-cwszz" (OuterVolumeSpecName: "kube-api-access-cwszz") pod "430c3ad6-95a3-4493-a048-07d499e6f057" (UID: "430c3ad6-95a3-4493-a048-07d499e6f057"). InnerVolumeSpecName "kube-api-access-cwszz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.146588 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6a48c88-37b6-4b79-a98d-2ad5c1322833-serving-cert\") pod \"e6a48c88-37b6-4b79-a98d-2ad5c1322833\" (UID: \"e6a48c88-37b6-4b79-a98d-2ad5c1322833\") " Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.146640 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e6a48c88-37b6-4b79-a98d-2ad5c1322833-client-ca\") pod \"e6a48c88-37b6-4b79-a98d-2ad5c1322833\" (UID: \"e6a48c88-37b6-4b79-a98d-2ad5c1322833\") " Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.146779 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt2qt\" (UniqueName: \"kubernetes.io/projected/0637fa1d-347d-4772-b199-4542ce66909c-kube-api-access-wt2qt\") pod \"route-controller-manager-785fdfb9bc-x98w2\" (UID: \"0637fa1d-347d-4772-b199-4542ce66909c\") " pod="openshift-route-controller-manager/route-controller-manager-785fdfb9bc-x98w2" Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.146830 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0637fa1d-347d-4772-b199-4542ce66909c-client-ca\") pod \"route-controller-manager-785fdfb9bc-x98w2\" (UID: \"0637fa1d-347d-4772-b199-4542ce66909c\") " pod="openshift-route-controller-manager/route-controller-manager-785fdfb9bc-x98w2" Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.146893 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0637fa1d-347d-4772-b199-4542ce66909c-config\") pod \"route-controller-manager-785fdfb9bc-x98w2\" (UID: \"0637fa1d-347d-4772-b199-4542ce66909c\") " pod="openshift-route-controller-manager/route-controller-manager-785fdfb9bc-x98w2" Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.146917 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0637fa1d-347d-4772-b199-4542ce66909c-serving-cert\") pod \"route-controller-manager-785fdfb9bc-x98w2\" (UID: \"0637fa1d-347d-4772-b199-4542ce66909c\") " pod="openshift-route-controller-manager/route-controller-manager-785fdfb9bc-x98w2" Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.147458 4781 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/430c3ad6-95a3-4493-a048-07d499e6f057-client-ca\") on node \"crc\" DevicePath \"\"" Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.147477 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrzr8\" (UniqueName: \"kubernetes.io/projected/e6a48c88-37b6-4b79-a98d-2ad5c1322833-kube-api-access-hrzr8\") on node \"crc\" DevicePath \"\"" Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.147487 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6a48c88-37b6-4b79-a98d-2ad5c1322833-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.147498 4781 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e6a48c88-37b6-4b79-a98d-2ad5c1322833-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" 
Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.147509 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwszz\" (UniqueName: \"kubernetes.io/projected/430c3ad6-95a3-4493-a048-07d499e6f057-kube-api-access-cwszz\") on node \"crc\" DevicePath \"\"" Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.147520 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/430c3ad6-95a3-4493-a048-07d499e6f057-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.147529 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6a48c88-37b6-4b79-a98d-2ad5c1322833-client-ca" (OuterVolumeSpecName: "client-ca") pod "e6a48c88-37b6-4b79-a98d-2ad5c1322833" (UID: "e6a48c88-37b6-4b79-a98d-2ad5c1322833"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.148073 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0637fa1d-347d-4772-b199-4542ce66909c-client-ca\") pod \"route-controller-manager-785fdfb9bc-x98w2\" (UID: \"0637fa1d-347d-4772-b199-4542ce66909c\") " pod="openshift-route-controller-manager/route-controller-manager-785fdfb9bc-x98w2" Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.148430 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0637fa1d-347d-4772-b199-4542ce66909c-config\") pod \"route-controller-manager-785fdfb9bc-x98w2\" (UID: \"0637fa1d-347d-4772-b199-4542ce66909c\") " pod="openshift-route-controller-manager/route-controller-manager-785fdfb9bc-x98w2" Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.149369 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6a48c88-37b6-4b79-a98d-2ad5c1322833-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e6a48c88-37b6-4b79-a98d-2ad5c1322833" (UID: "e6a48c88-37b6-4b79-a98d-2ad5c1322833"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.152912 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0637fa1d-347d-4772-b199-4542ce66909c-serving-cert\") pod \"route-controller-manager-785fdfb9bc-x98w2\" (UID: \"0637fa1d-347d-4772-b199-4542ce66909c\") " pod="openshift-route-controller-manager/route-controller-manager-785fdfb9bc-x98w2" Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.165882 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt2qt\" (UniqueName: \"kubernetes.io/projected/0637fa1d-347d-4772-b199-4542ce66909c-kube-api-access-wt2qt\") pod \"route-controller-manager-785fdfb9bc-x98w2\" (UID: \"0637fa1d-347d-4772-b199-4542ce66909c\") " pod="openshift-route-controller-manager/route-controller-manager-785fdfb9bc-x98w2" Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.247965 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6a48c88-37b6-4b79-a98d-2ad5c1322833-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.247999 4781 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e6a48c88-37b6-4b79-a98d-2ad5c1322833-client-ca\") on node \"crc\" DevicePath \"\"" Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.271646 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-785fdfb9bc-x98w2" Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.462650 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-785fdfb9bc-x98w2"] Dec 02 09:27:18 crc kubenswrapper[4781]: W1202 09:27:18.470499 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0637fa1d_347d_4772_b199_4542ce66909c.slice/crio-ee02049efbcd0ff004b7ba56cc6c7de1b63e774f5eb931c3e38f883b59df1aa2 WatchSource:0}: Error finding container ee02049efbcd0ff004b7ba56cc6c7de1b63e774f5eb931c3e38f883b59df1aa2: Status 404 returned error can't find the container with id ee02049efbcd0ff004b7ba56cc6c7de1b63e774f5eb931c3e38f883b59df1aa2 Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.721219 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bc5fc4fbc-lgt4p" event={"ID":"e6a48c88-37b6-4b79-a98d-2ad5c1322833","Type":"ContainerDied","Data":"41d382349be550dcda4885a982ba3d277f57371e264a533a4b4ac5f4e1118cfb"} Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.721248 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7bc5fc4fbc-lgt4p" Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.721481 4781 scope.go:117] "RemoveContainer" containerID="5871fc04e35b21d12531da6936b136ec3ab7096e42976fda07a9e988f88a07fe" Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.723736 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-77fd659786-gqxw8" event={"ID":"430c3ad6-95a3-4493-a048-07d499e6f057","Type":"ContainerDied","Data":"1b8780b5c07f836330256957da93d97fab606aa48d5d85a8c7b512d478ab3b6c"} Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.723792 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77fd659786-gqxw8" Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.729891 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-785fdfb9bc-x98w2" event={"ID":"0637fa1d-347d-4772-b199-4542ce66909c","Type":"ContainerStarted","Data":"73da9b5c5d0f3dea994abb2c4b0ece6f304d51b1816084ebdcee1a472e5aae6a"} Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.729940 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-785fdfb9bc-x98w2" event={"ID":"0637fa1d-347d-4772-b199-4542ce66909c","Type":"ContainerStarted","Data":"ee02049efbcd0ff004b7ba56cc6c7de1b63e774f5eb931c3e38f883b59df1aa2"} Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.731149 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-785fdfb9bc-x98w2" Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.739756 4781 scope.go:117] "RemoveContainer" containerID="0e618e569252fbac385fde74e1cf84250183fc8c7e971f77de871c44cccd2b93" Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.746761 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-785fdfb9bc-x98w2" podStartSLOduration=3.746725535 podStartE2EDuration="3.746725535s" podCreationTimestamp="2025-12-02 09:27:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:27:18.745743839 +0000 UTC m=+401.569617718" watchObservedRunningTime="2025-12-02 09:27:18.746725535 +0000 UTC m=+401.570599414" Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.762356 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77fd659786-gqxw8"] Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.770082 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77fd659786-gqxw8"] Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.779771 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7bc5fc4fbc-lgt4p"] Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.796220 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7bc5fc4fbc-lgt4p"] Dec 02 09:27:18 crc kubenswrapper[4781]: I1202 09:27:18.960179 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-785fdfb9bc-x98w2" Dec 02 09:27:19 crc kubenswrapper[4781]: I1202 09:27:19.505343 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="430c3ad6-95a3-4493-a048-07d499e6f057" path="/var/lib/kubelet/pods/430c3ad6-95a3-4493-a048-07d499e6f057/volumes" Dec 02 09:27:19 crc kubenswrapper[4781]: I1202 09:27:19.506374 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6a48c88-37b6-4b79-a98d-2ad5c1322833" path="/var/lib/kubelet/pods/e6a48c88-37b6-4b79-a98d-2ad5c1322833/volumes" Dec 02 09:27:20 crc kubenswrapper[4781]: I1202 09:27:20.667124 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5c8985447-lrspd"] Dec 02 09:27:20 crc kubenswrapper[4781]: E1202 09:27:20.667567 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6a48c88-37b6-4b79-a98d-2ad5c1322833" containerName="controller-manager" Dec 02 09:27:20 crc kubenswrapper[4781]: I1202 09:27:20.667580 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6a48c88-37b6-4b79-a98d-2ad5c1322833" containerName="controller-manager" Dec 02 09:27:20 crc kubenswrapper[4781]: I1202 09:27:20.667714 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6a48c88-37b6-4b79-a98d-2ad5c1322833" containerName="controller-manager" Dec 02 09:27:20 crc kubenswrapper[4781]: I1202 09:27:20.668156 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5c8985447-lrspd" Dec 02 09:27:20 crc kubenswrapper[4781]: I1202 09:27:20.671047 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 02 09:27:20 crc kubenswrapper[4781]: I1202 09:27:20.671340 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 02 09:27:20 crc kubenswrapper[4781]: I1202 09:27:20.671687 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 02 09:27:20 crc kubenswrapper[4781]: I1202 09:27:20.671966 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 02 09:27:20 crc kubenswrapper[4781]: I1202 09:27:20.672172 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 02 09:27:20 crc kubenswrapper[4781]: I1202 09:27:20.672298 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b6358d4-1fac-471b-b465-74f1531f87a1-serving-cert\") pod \"controller-manager-5c8985447-lrspd\" (UID: \"8b6358d4-1fac-471b-b465-74f1531f87a1\") " pod="openshift-controller-manager/controller-manager-5c8985447-lrspd" Dec 02 09:27:20 crc kubenswrapper[4781]: I1202 09:27:20.672403 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9wxp\" (UniqueName: \"kubernetes.io/projected/8b6358d4-1fac-471b-b465-74f1531f87a1-kube-api-access-w9wxp\") pod \"controller-manager-5c8985447-lrspd\" (UID: \"8b6358d4-1fac-471b-b465-74f1531f87a1\") " pod="openshift-controller-manager/controller-manager-5c8985447-lrspd" Dec 02 09:27:20 crc kubenswrapper[4781]: I1202 09:27:20.672516 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
Dec 02 09:27:20 crc kubenswrapper[4781]: I1202 09:27:20.672554 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8b6358d4-1fac-471b-b465-74f1531f87a1-proxy-ca-bundles\") pod \"controller-manager-5c8985447-lrspd\" (UID: \"8b6358d4-1fac-471b-b465-74f1531f87a1\") " pod="openshift-controller-manager/controller-manager-5c8985447-lrspd"
Dec 02 09:27:20 crc kubenswrapper[4781]: I1202 09:27:20.672578 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b6358d4-1fac-471b-b465-74f1531f87a1-config\") pod \"controller-manager-5c8985447-lrspd\" (UID: \"8b6358d4-1fac-471b-b465-74f1531f87a1\") " pod="openshift-controller-manager/controller-manager-5c8985447-lrspd"
Dec 02 09:27:20 crc kubenswrapper[4781]: I1202 09:27:20.673345 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Dec 02 09:27:20 crc kubenswrapper[4781]: I1202 09:27:20.681845 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Dec 02 09:27:20 crc kubenswrapper[4781]: I1202 09:27:20.681997 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5c8985447-lrspd"]
Dec 02 09:27:20 crc kubenswrapper[4781]: I1202 09:27:20.773243 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b6358d4-1fac-471b-b465-74f1531f87a1-serving-cert\") pod \"controller-manager-5c8985447-lrspd\" (UID: \"8b6358d4-1fac-471b-b465-74f1531f87a1\") " pod="openshift-controller-manager/controller-manager-5c8985447-lrspd"
Dec 02 09:27:20 crc kubenswrapper[4781]: I1202 09:27:20.773323 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9wxp\" (UniqueName: \"kubernetes.io/projected/8b6358d4-1fac-471b-b465-74f1531f87a1-kube-api-access-w9wxp\") pod \"controller-manager-5c8985447-lrspd\" (UID: \"8b6358d4-1fac-471b-b465-74f1531f87a1\") " pod="openshift-controller-manager/controller-manager-5c8985447-lrspd"
Dec 02 09:27:20 crc kubenswrapper[4781]: I1202 09:27:20.773369 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8b6358d4-1fac-471b-b465-74f1531f87a1-client-ca\") pod \"controller-manager-5c8985447-lrspd\" (UID: \"8b6358d4-1fac-471b-b465-74f1531f87a1\") " pod="openshift-controller-manager/controller-manager-5c8985447-lrspd"
Dec 02 09:27:20 crc kubenswrapper[4781]: I1202 09:27:20.773402 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8b6358d4-1fac-471b-b465-74f1531f87a1-proxy-ca-bundles\") pod \"controller-manager-5c8985447-lrspd\" (UID: \"8b6358d4-1fac-471b-b465-74f1531f87a1\") " pod="openshift-controller-manager/controller-manager-5c8985447-lrspd"
Dec 02 09:27:20 crc kubenswrapper[4781]: I1202 09:27:20.773430 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b6358d4-1fac-471b-b465-74f1531f87a1-config\") pod \"controller-manager-5c8985447-lrspd\" (UID: \"8b6358d4-1fac-471b-b465-74f1531f87a1\") " pod="openshift-controller-manager/controller-manager-5c8985447-lrspd"
Dec 02 09:27:20 crc kubenswrapper[4781]: I1202 09:27:20.774822 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b6358d4-1fac-471b-b465-74f1531f87a1-config\") pod \"controller-manager-5c8985447-lrspd\" (UID: \"8b6358d4-1fac-471b-b465-74f1531f87a1\") " pod="openshift-controller-manager/controller-manager-5c8985447-lrspd"
Dec 02 09:27:20 crc kubenswrapper[4781]: I1202 09:27:20.775019 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8b6358d4-1fac-471b-b465-74f1531f87a1-client-ca\") pod \"controller-manager-5c8985447-lrspd\" (UID: \"8b6358d4-1fac-471b-b465-74f1531f87a1\") " pod="openshift-controller-manager/controller-manager-5c8985447-lrspd"
Dec 02 09:27:20 crc kubenswrapper[4781]: I1202 09:27:20.775299 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8b6358d4-1fac-471b-b465-74f1531f87a1-proxy-ca-bundles\") pod \"controller-manager-5c8985447-lrspd\" (UID: \"8b6358d4-1fac-471b-b465-74f1531f87a1\") " pod="openshift-controller-manager/controller-manager-5c8985447-lrspd"
Dec 02 09:27:20 crc kubenswrapper[4781]: I1202 09:27:20.778882 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b6358d4-1fac-471b-b465-74f1531f87a1-serving-cert\") pod \"controller-manager-5c8985447-lrspd\" (UID: \"8b6358d4-1fac-471b-b465-74f1531f87a1\") " pod="openshift-controller-manager/controller-manager-5c8985447-lrspd"
Dec 02 09:27:20 crc kubenswrapper[4781]: I1202 09:27:20.793512 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9wxp\" (UniqueName: \"kubernetes.io/projected/8b6358d4-1fac-471b-b465-74f1531f87a1-kube-api-access-w9wxp\") pod \"controller-manager-5c8985447-lrspd\" (UID: \"8b6358d4-1fac-471b-b465-74f1531f87a1\") " pod="openshift-controller-manager/controller-manager-5c8985447-lrspd"
Dec 02 09:27:20 crc kubenswrapper[4781]: I1202 09:27:20.983765 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5c8985447-lrspd"
Dec 02 09:27:21 crc kubenswrapper[4781]: I1202 09:27:21.156007 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5c8985447-lrspd"]
Dec 02 09:27:21 crc kubenswrapper[4781]: I1202 09:27:21.747805 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c8985447-lrspd" event={"ID":"8b6358d4-1fac-471b-b465-74f1531f87a1","Type":"ContainerStarted","Data":"ee8e5a5fff3ebebc79f516f255f6a0b3e5a677f0c98b916c1a5325c19f9ec9a6"}
Dec 02 09:27:21 crc kubenswrapper[4781]: I1202 09:27:21.747853 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c8985447-lrspd" event={"ID":"8b6358d4-1fac-471b-b465-74f1531f87a1","Type":"ContainerStarted","Data":"3a352510e8e4e7a4a845d3dfdd70be5eb4f5649b29fc61b0ab78166f018e7ef5"}
Dec 02 09:27:21 crc kubenswrapper[4781]: I1202 09:27:21.748093 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5c8985447-lrspd"
Dec 02 09:27:21 crc kubenswrapper[4781]: I1202 09:27:21.754060 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5c8985447-lrspd"
Dec 02 09:27:21 crc kubenswrapper[4781]: I1202 09:27:21.765632 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5c8985447-lrspd" podStartSLOduration=6.765610375 podStartE2EDuration="6.765610375s" podCreationTimestamp="2025-12-02 09:27:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:27:21.765506742 +0000 UTC m=+404.589380621" watchObservedRunningTime="2025-12-02 09:27:21.765610375 +0000 UTC m=+404.589484254"
Dec 02 09:27:23 crc kubenswrapper[4781]: I1202 09:27:23.788726 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-g6kvc"]
Dec 02 09:27:23 crc kubenswrapper[4781]: I1202 09:27:23.789964 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-g6kvc"
Dec 02 09:27:23 crc kubenswrapper[4781]: I1202 09:27:23.839338 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-g6kvc"]
Dec 02 09:27:23 crc kubenswrapper[4781]: I1202 09:27:23.914495 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9e1dba2f-84f6-4e62-8ec3-e2bef189bb8f-ca-trust-extracted\") pod \"image-registry-66df7c8f76-g6kvc\" (UID: \"9e1dba2f-84f6-4e62-8ec3-e2bef189bb8f\") " pod="openshift-image-registry/image-registry-66df7c8f76-g6kvc"
Dec 02 09:27:23 crc kubenswrapper[4781]: I1202 09:27:23.914565 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5mvd\" (UniqueName: \"kubernetes.io/projected/9e1dba2f-84f6-4e62-8ec3-e2bef189bb8f-kube-api-access-f5mvd\") pod \"image-registry-66df7c8f76-g6kvc\" (UID: \"9e1dba2f-84f6-4e62-8ec3-e2bef189bb8f\") " pod="openshift-image-registry/image-registry-66df7c8f76-g6kvc"
Dec 02 09:27:23 crc kubenswrapper[4781]: I1202 09:27:23.914678 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-g6kvc\" (UID: \"9e1dba2f-84f6-4e62-8ec3-e2bef189bb8f\") " pod="openshift-image-registry/image-registry-66df7c8f76-g6kvc"
Dec 02 09:27:23 crc kubenswrapper[4781]: I1202 09:27:23.914713 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9e1dba2f-84f6-4e62-8ec3-e2bef189bb8f-trusted-ca\") pod \"image-registry-66df7c8f76-g6kvc\" (UID: \"9e1dba2f-84f6-4e62-8ec3-e2bef189bb8f\") " pod="openshift-image-registry/image-registry-66df7c8f76-g6kvc"
Dec 02 09:27:23 crc kubenswrapper[4781]: I1202 09:27:23.914737 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9e1dba2f-84f6-4e62-8ec3-e2bef189bb8f-registry-certificates\") pod \"image-registry-66df7c8f76-g6kvc\" (UID: \"9e1dba2f-84f6-4e62-8ec3-e2bef189bb8f\") " pod="openshift-image-registry/image-registry-66df7c8f76-g6kvc"
Dec 02 09:27:23 crc kubenswrapper[4781]: I1202 09:27:23.914777 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9e1dba2f-84f6-4e62-8ec3-e2bef189bb8f-bound-sa-token\") pod \"image-registry-66df7c8f76-g6kvc\" (UID: \"9e1dba2f-84f6-4e62-8ec3-e2bef189bb8f\") " pod="openshift-image-registry/image-registry-66df7c8f76-g6kvc"
Dec 02 09:27:23 crc kubenswrapper[4781]: I1202 09:27:23.914904 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9e1dba2f-84f6-4e62-8ec3-e2bef189bb8f-registry-tls\") pod \"image-registry-66df7c8f76-g6kvc\" (UID: \"9e1dba2f-84f6-4e62-8ec3-e2bef189bb8f\") " pod="openshift-image-registry/image-registry-66df7c8f76-g6kvc"
Dec 02 09:27:23 crc kubenswrapper[4781]: I1202 09:27:23.915016 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9e1dba2f-84f6-4e62-8ec3-e2bef189bb8f-installation-pull-secrets\") pod \"image-registry-66df7c8f76-g6kvc\" (UID: \"9e1dba2f-84f6-4e62-8ec3-e2bef189bb8f\") " pod="openshift-image-registry/image-registry-66df7c8f76-g6kvc"
Dec 02 09:27:23 crc kubenswrapper[4781]: I1202 09:27:23.947403 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-g6kvc\" (UID: \"9e1dba2f-84f6-4e62-8ec3-e2bef189bb8f\") " pod="openshift-image-registry/image-registry-66df7c8f76-g6kvc"
Dec 02 09:27:24 crc kubenswrapper[4781]: I1202 09:27:24.016500 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9e1dba2f-84f6-4e62-8ec3-e2bef189bb8f-installation-pull-secrets\") pod \"image-registry-66df7c8f76-g6kvc\" (UID: \"9e1dba2f-84f6-4e62-8ec3-e2bef189bb8f\") " pod="openshift-image-registry/image-registry-66df7c8f76-g6kvc"
Dec 02 09:27:24 crc kubenswrapper[4781]: I1202 09:27:24.016562 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9e1dba2f-84f6-4e62-8ec3-e2bef189bb8f-ca-trust-extracted\") pod \"image-registry-66df7c8f76-g6kvc\" (UID: \"9e1dba2f-84f6-4e62-8ec3-e2bef189bb8f\") " pod="openshift-image-registry/image-registry-66df7c8f76-g6kvc"
Dec 02 09:27:24 crc kubenswrapper[4781]: I1202 09:27:24.016589 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5mvd\" (UniqueName: \"kubernetes.io/projected/9e1dba2f-84f6-4e62-8ec3-e2bef189bb8f-kube-api-access-f5mvd\") pod \"image-registry-66df7c8f76-g6kvc\" (UID: \"9e1dba2f-84f6-4e62-8ec3-e2bef189bb8f\") " pod="openshift-image-registry/image-registry-66df7c8f76-g6kvc"
Dec 02 09:27:24 crc kubenswrapper[4781]: I1202 09:27:24.016637 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9e1dba2f-84f6-4e62-8ec3-e2bef189bb8f-trusted-ca\") pod \"image-registry-66df7c8f76-g6kvc\" (UID: \"9e1dba2f-84f6-4e62-8ec3-e2bef189bb8f\") " pod="openshift-image-registry/image-registry-66df7c8f76-g6kvc"
Dec 02 09:27:24 crc kubenswrapper[4781]: I1202 09:27:24.017016 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9e1dba2f-84f6-4e62-8ec3-e2bef189bb8f-registry-certificates\") pod \"image-registry-66df7c8f76-g6kvc\" (UID: \"9e1dba2f-84f6-4e62-8ec3-e2bef189bb8f\") " pod="openshift-image-registry/image-registry-66df7c8f76-g6kvc"
Dec 02 09:27:24 crc kubenswrapper[4781]: I1202 09:27:24.017043 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9e1dba2f-84f6-4e62-8ec3-e2bef189bb8f-bound-sa-token\") pod \"image-registry-66df7c8f76-g6kvc\" (UID: \"9e1dba2f-84f6-4e62-8ec3-e2bef189bb8f\") " pod="openshift-image-registry/image-registry-66df7c8f76-g6kvc"
Dec 02 09:27:24 crc kubenswrapper[4781]: I1202 09:27:24.017205 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9e1dba2f-84f6-4e62-8ec3-e2bef189bb8f-ca-trust-extracted\") pod \"image-registry-66df7c8f76-g6kvc\" (UID: \"9e1dba2f-84f6-4e62-8ec3-e2bef189bb8f\") " pod="openshift-image-registry/image-registry-66df7c8f76-g6kvc"
Dec 02 09:27:24 crc kubenswrapper[4781]: I1202 09:27:24.018136 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9e1dba2f-84f6-4e62-8ec3-e2bef189bb8f-registry-certificates\") pod \"image-registry-66df7c8f76-g6kvc\" (UID: \"9e1dba2f-84f6-4e62-8ec3-e2bef189bb8f\") " pod="openshift-image-registry/image-registry-66df7c8f76-g6kvc"
Dec 02 09:27:24 crc kubenswrapper[4781]: I1202 09:27:24.018203 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9e1dba2f-84f6-4e62-8ec3-e2bef189bb8f-registry-tls\") pod \"image-registry-66df7c8f76-g6kvc\" (UID: \"9e1dba2f-84f6-4e62-8ec3-e2bef189bb8f\") " pod="openshift-image-registry/image-registry-66df7c8f76-g6kvc"
Dec 02 09:27:24 crc kubenswrapper[4781]: I1202 09:27:24.018492 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9e1dba2f-84f6-4e62-8ec3-e2bef189bb8f-trusted-ca\") pod \"image-registry-66df7c8f76-g6kvc\" (UID: \"9e1dba2f-84f6-4e62-8ec3-e2bef189bb8f\") " pod="openshift-image-registry/image-registry-66df7c8f76-g6kvc"
Dec 02 09:27:24 crc kubenswrapper[4781]: I1202 09:27:24.022790 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9e1dba2f-84f6-4e62-8ec3-e2bef189bb8f-installation-pull-secrets\") pod \"image-registry-66df7c8f76-g6kvc\" (UID: \"9e1dba2f-84f6-4e62-8ec3-e2bef189bb8f\") " pod="openshift-image-registry/image-registry-66df7c8f76-g6kvc"
Dec 02 09:27:24 crc kubenswrapper[4781]: I1202 09:27:24.022882 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9e1dba2f-84f6-4e62-8ec3-e2bef189bb8f-registry-tls\") pod \"image-registry-66df7c8f76-g6kvc\" (UID: \"9e1dba2f-84f6-4e62-8ec3-e2bef189bb8f\") " pod="openshift-image-registry/image-registry-66df7c8f76-g6kvc"
Dec 02 09:27:24 crc kubenswrapper[4781]: I1202 09:27:24.032983 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5mvd\" (UniqueName: \"kubernetes.io/projected/9e1dba2f-84f6-4e62-8ec3-e2bef189bb8f-kube-api-access-f5mvd\") pod \"image-registry-66df7c8f76-g6kvc\" (UID: \"9e1dba2f-84f6-4e62-8ec3-e2bef189bb8f\") " pod="openshift-image-registry/image-registry-66df7c8f76-g6kvc"
Dec 02 09:27:24 crc kubenswrapper[4781]: I1202 09:27:24.035592 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9e1dba2f-84f6-4e62-8ec3-e2bef189bb8f-bound-sa-token\") pod \"image-registry-66df7c8f76-g6kvc\" (UID: \"9e1dba2f-84f6-4e62-8ec3-e2bef189bb8f\") " pod="openshift-image-registry/image-registry-66df7c8f76-g6kvc"
Dec 02 09:27:24 crc kubenswrapper[4781]: I1202 09:27:24.108629 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-g6kvc"
Dec 02 09:27:24 crc kubenswrapper[4781]: I1202 09:27:24.514894 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-g6kvc"]
Dec 02 09:27:24 crc kubenswrapper[4781]: W1202 09:27:24.522439 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e1dba2f_84f6_4e62_8ec3_e2bef189bb8f.slice/crio-7cdbe2e2b543b4711ea076f461800b2a44e7cb34a4ae8282266129f9d757324c WatchSource:0}: Error finding container 7cdbe2e2b543b4711ea076f461800b2a44e7cb34a4ae8282266129f9d757324c: Status 404 returned error can't find the container with id 7cdbe2e2b543b4711ea076f461800b2a44e7cb34a4ae8282266129f9d757324c
Dec 02 09:27:24 crc kubenswrapper[4781]: I1202 09:27:24.765220 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-g6kvc" event={"ID":"9e1dba2f-84f6-4e62-8ec3-e2bef189bb8f","Type":"ContainerStarted","Data":"7cdbe2e2b543b4711ea076f461800b2a44e7cb34a4ae8282266129f9d757324c"}
Dec 02 09:27:27 crc kubenswrapper[4781]: I1202 09:27:27.506426 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t59n7"]
Dec 02 09:27:27 crc kubenswrapper[4781]: I1202 09:27:27.506887 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-t59n7" podUID="0a0d43df-4480-4b62-bd3d-d129fbdcd722" containerName="registry-server" containerID="cri-o://c067d66a79d99aa5bd42a28b115e25c993de6df62c915afd46daccbbdf6f2ad9" gracePeriod=2
Dec 02 09:27:27 crc kubenswrapper[4781]: I1202 09:27:27.782030 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-g6kvc" event={"ID":"9e1dba2f-84f6-4e62-8ec3-e2bef189bb8f","Type":"ContainerStarted","Data":"29281c16f25405af600068ce76a7295a2ff8b2e4764270af02dffe40b02d81fd"}
Dec 02 09:27:28 crc kubenswrapper[4781]: I1202 09:27:28.803858 4781 generic.go:334] "Generic (PLEG): container finished" podID="0a0d43df-4480-4b62-bd3d-d129fbdcd722" containerID="c067d66a79d99aa5bd42a28b115e25c993de6df62c915afd46daccbbdf6f2ad9" exitCode=0
Dec 02 09:27:28 crc kubenswrapper[4781]: I1202 09:27:28.803908 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t59n7" event={"ID":"0a0d43df-4480-4b62-bd3d-d129fbdcd722","Type":"ContainerDied","Data":"c067d66a79d99aa5bd42a28b115e25c993de6df62c915afd46daccbbdf6f2ad9"}
Dec 02 09:27:28 crc kubenswrapper[4781]: I1202 09:27:28.828482 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-g6kvc" podStartSLOduration=5.828465589 podStartE2EDuration="5.828465589s" podCreationTimestamp="2025-12-02 09:27:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:27:28.826533256 +0000 UTC m=+411.650407145" watchObservedRunningTime="2025-12-02 09:27:28.828465589 +0000 UTC m=+411.652339468"
Dec 02 09:27:29 crc kubenswrapper[4781]: I1202 09:27:29.062314 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t59n7"
Dec 02 09:27:29 crc kubenswrapper[4781]: I1202 09:27:29.195940 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45gcq\" (UniqueName: \"kubernetes.io/projected/0a0d43df-4480-4b62-bd3d-d129fbdcd722-kube-api-access-45gcq\") pod \"0a0d43df-4480-4b62-bd3d-d129fbdcd722\" (UID: \"0a0d43df-4480-4b62-bd3d-d129fbdcd722\") "
Dec 02 09:27:29 crc kubenswrapper[4781]: I1202 09:27:29.196041 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a0d43df-4480-4b62-bd3d-d129fbdcd722-utilities\") pod \"0a0d43df-4480-4b62-bd3d-d129fbdcd722\" (UID: \"0a0d43df-4480-4b62-bd3d-d129fbdcd722\") "
Dec 02 09:27:29 crc kubenswrapper[4781]: I1202 09:27:29.196110 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a0d43df-4480-4b62-bd3d-d129fbdcd722-catalog-content\") pod \"0a0d43df-4480-4b62-bd3d-d129fbdcd722\" (UID: \"0a0d43df-4480-4b62-bd3d-d129fbdcd722\") "
Dec 02 09:27:29 crc kubenswrapper[4781]: I1202 09:27:29.197274 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a0d43df-4480-4b62-bd3d-d129fbdcd722-utilities" (OuterVolumeSpecName: "utilities") pod "0a0d43df-4480-4b62-bd3d-d129fbdcd722" (UID: "0a0d43df-4480-4b62-bd3d-d129fbdcd722"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 09:27:29 crc kubenswrapper[4781]: I1202 09:27:29.202212 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a0d43df-4480-4b62-bd3d-d129fbdcd722-kube-api-access-45gcq" (OuterVolumeSpecName: "kube-api-access-45gcq") pod "0a0d43df-4480-4b62-bd3d-d129fbdcd722" (UID: "0a0d43df-4480-4b62-bd3d-d129fbdcd722"). InnerVolumeSpecName "kube-api-access-45gcq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 09:27:29 crc kubenswrapper[4781]: I1202 09:27:29.244853 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a0d43df-4480-4b62-bd3d-d129fbdcd722-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0a0d43df-4480-4b62-bd3d-d129fbdcd722" (UID: "0a0d43df-4480-4b62-bd3d-d129fbdcd722"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 09:27:29 crc kubenswrapper[4781]: I1202 09:27:29.297519 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a0d43df-4480-4b62-bd3d-d129fbdcd722-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 02 09:27:29 crc kubenswrapper[4781]: I1202 09:27:29.297573 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45gcq\" (UniqueName: \"kubernetes.io/projected/0a0d43df-4480-4b62-bd3d-d129fbdcd722-kube-api-access-45gcq\") on node \"crc\" DevicePath \"\""
Dec 02 09:27:29 crc kubenswrapper[4781]: I1202 09:27:29.297595 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a0d43df-4480-4b62-bd3d-d129fbdcd722-utilities\") on node \"crc\" DevicePath \"\""
Dec 02 09:27:29 crc kubenswrapper[4781]: I1202 09:27:29.811405 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t59n7" event={"ID":"0a0d43df-4480-4b62-bd3d-d129fbdcd722","Type":"ContainerDied","Data":"55acb068bdb880988b70af6a22949feb297af148d64358c4ce1bc283e7a20d47"}
Dec 02 09:27:29 crc kubenswrapper[4781]: I1202 09:27:29.811467 4781 scope.go:117] "RemoveContainer" containerID="c067d66a79d99aa5bd42a28b115e25c993de6df62c915afd46daccbbdf6f2ad9"
Dec 02 09:27:29 crc kubenswrapper[4781]: I1202 09:27:29.811473 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t59n7"
Dec 02 09:27:29 crc kubenswrapper[4781]: I1202 09:27:29.828355 4781 scope.go:117] "RemoveContainer" containerID="12da46a80c1ab5d1ec3d83da4241389ab56aacfeabcef93a32503751f5f4a13d"
Dec 02 09:27:29 crc kubenswrapper[4781]: I1202 09:27:29.844826 4781 scope.go:117] "RemoveContainer" containerID="4b2f67bea61bb2389d0202658b9278243091d0e6578c1be014a78ce40b42cf4b"
Dec 02 09:27:29 crc kubenswrapper[4781]: I1202 09:27:29.846352 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t59n7"]
Dec 02 09:27:29 crc kubenswrapper[4781]: I1202 09:27:29.849294 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-t59n7"]
Dec 02 09:27:29 crc kubenswrapper[4781]: I1202 09:27:29.901755 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4rfsl"]
Dec 02 09:27:29 crc kubenswrapper[4781]: I1202 09:27:29.901996 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4rfsl" podUID="f37a1838-a130-4cb4-807f-2214cbfefdbf" containerName="registry-server" containerID="cri-o://364f1edb119e7c0e66eb6574c505409a5e4b5e9cd9107277ad4c026b1dcf7ac8" gracePeriod=2
Dec 02 09:27:30 crc kubenswrapper[4781]: I1202 09:27:30.320172 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4rfsl"
Dec 02 09:27:30 crc kubenswrapper[4781]: I1202 09:27:30.412496 4781 patch_prober.go:28] interesting pod/machine-config-daemon-pzntm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 09:27:30 crc kubenswrapper[4781]: I1202 09:27:30.412564 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 09:27:30 crc kubenswrapper[4781]: I1202 09:27:30.517129 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f37a1838-a130-4cb4-807f-2214cbfefdbf-utilities\") pod \"f37a1838-a130-4cb4-807f-2214cbfefdbf\" (UID: \"f37a1838-a130-4cb4-807f-2214cbfefdbf\") "
Dec 02 09:27:30 crc kubenswrapper[4781]: I1202 09:27:30.517194 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkkgc\" (UniqueName: \"kubernetes.io/projected/f37a1838-a130-4cb4-807f-2214cbfefdbf-kube-api-access-lkkgc\") pod \"f37a1838-a130-4cb4-807f-2214cbfefdbf\" (UID: \"f37a1838-a130-4cb4-807f-2214cbfefdbf\") "
Dec 02 09:27:30 crc kubenswrapper[4781]: I1202 09:27:30.517300 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f37a1838-a130-4cb4-807f-2214cbfefdbf-catalog-content\") pod \"f37a1838-a130-4cb4-807f-2214cbfefdbf\" (UID: \"f37a1838-a130-4cb4-807f-2214cbfefdbf\") "
Dec 02 09:27:30 crc kubenswrapper[4781]: I1202 09:27:30.518559 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f37a1838-a130-4cb4-807f-2214cbfefdbf-utilities" (OuterVolumeSpecName: "utilities") pod "f37a1838-a130-4cb4-807f-2214cbfefdbf" (UID: "f37a1838-a130-4cb4-807f-2214cbfefdbf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 09:27:30 crc kubenswrapper[4781]: I1202 09:27:30.523037 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f37a1838-a130-4cb4-807f-2214cbfefdbf-kube-api-access-lkkgc" (OuterVolumeSpecName: "kube-api-access-lkkgc") pod "f37a1838-a130-4cb4-807f-2214cbfefdbf" (UID: "f37a1838-a130-4cb4-807f-2214cbfefdbf"). InnerVolumeSpecName "kube-api-access-lkkgc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 09:27:30 crc kubenswrapper[4781]: I1202 09:27:30.618813 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f37a1838-a130-4cb4-807f-2214cbfefdbf-utilities\") on node \"crc\" DevicePath \"\""
Dec 02 09:27:30 crc kubenswrapper[4781]: I1202 09:27:30.618848 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkkgc\" (UniqueName: \"kubernetes.io/projected/f37a1838-a130-4cb4-807f-2214cbfefdbf-kube-api-access-lkkgc\") on node \"crc\" DevicePath \"\""
Dec 02 09:27:30 crc kubenswrapper[4781]: I1202 09:27:30.622587 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f37a1838-a130-4cb4-807f-2214cbfefdbf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f37a1838-a130-4cb4-807f-2214cbfefdbf" (UID: "f37a1838-a130-4cb4-807f-2214cbfefdbf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 09:27:30 crc kubenswrapper[4781]: I1202 09:27:30.719748 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f37a1838-a130-4cb4-807f-2214cbfefdbf-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 02 09:27:30 crc kubenswrapper[4781]: I1202 09:27:30.822759 4781 generic.go:334] "Generic (PLEG): container finished" podID="f37a1838-a130-4cb4-807f-2214cbfefdbf" containerID="364f1edb119e7c0e66eb6574c505409a5e4b5e9cd9107277ad4c026b1dcf7ac8" exitCode=0
Dec 02 09:27:30 crc kubenswrapper[4781]: I1202 09:27:30.822800 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4rfsl"
Dec 02 09:27:30 crc kubenswrapper[4781]: I1202 09:27:30.822811 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4rfsl" event={"ID":"f37a1838-a130-4cb4-807f-2214cbfefdbf","Type":"ContainerDied","Data":"364f1edb119e7c0e66eb6574c505409a5e4b5e9cd9107277ad4c026b1dcf7ac8"}
Dec 02 09:27:30 crc kubenswrapper[4781]: I1202 09:27:30.822848 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4rfsl" event={"ID":"f37a1838-a130-4cb4-807f-2214cbfefdbf","Type":"ContainerDied","Data":"8f7cb44349f736d9c10f68dfbea0a4e4d6cdb229a06a65775aa9b950ac5c75a6"}
Dec 02 09:27:30 crc kubenswrapper[4781]: I1202 09:27:30.822862 4781 scope.go:117] "RemoveContainer" containerID="364f1edb119e7c0e66eb6574c505409a5e4b5e9cd9107277ad4c026b1dcf7ac8"
Dec 02 09:27:30 crc kubenswrapper[4781]: I1202 09:27:30.842850 4781 scope.go:117] "RemoveContainer" containerID="efca497eaa2387782b2dcd379186015fe7ef578c950f2d847acc2e88bdda90f8"
Dec 02 09:27:30 crc kubenswrapper[4781]: I1202 09:27:30.846051 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4rfsl"]
Dec 02 09:27:30 crc kubenswrapper[4781]: I1202 09:27:30.851880 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4rfsl"]
Dec 02 09:27:30 crc kubenswrapper[4781]: I1202 09:27:30.870591 4781 scope.go:117] "RemoveContainer" containerID="1d8977f6fa5964fd9ed6bab3633602200974f6a082e5197d046c86e9771a0e9b"
Dec 02 09:27:30 crc kubenswrapper[4781]: I1202 09:27:30.882178 4781 scope.go:117] "RemoveContainer" containerID="364f1edb119e7c0e66eb6574c505409a5e4b5e9cd9107277ad4c026b1dcf7ac8"
Dec 02 09:27:30 crc kubenswrapper[4781]: E1202 09:27:30.882508 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"364f1edb119e7c0e66eb6574c505409a5e4b5e9cd9107277ad4c026b1dcf7ac8\": container with ID starting with 364f1edb119e7c0e66eb6574c505409a5e4b5e9cd9107277ad4c026b1dcf7ac8 not found: ID does not exist" containerID="364f1edb119e7c0e66eb6574c505409a5e4b5e9cd9107277ad4c026b1dcf7ac8"
Dec 02 09:27:30 crc kubenswrapper[4781]: I1202 09:27:30.882546 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"364f1edb119e7c0e66eb6574c505409a5e4b5e9cd9107277ad4c026b1dcf7ac8"} err="failed to get container status \"364f1edb119e7c0e66eb6574c505409a5e4b5e9cd9107277ad4c026b1dcf7ac8\": rpc error: code = NotFound desc = could not find container \"364f1edb119e7c0e66eb6574c505409a5e4b5e9cd9107277ad4c026b1dcf7ac8\": container with ID starting with 364f1edb119e7c0e66eb6574c505409a5e4b5e9cd9107277ad4c026b1dcf7ac8 not found: ID does not exist"
Dec 02 09:27:30 crc kubenswrapper[4781]: I1202 09:27:30.882571 4781 scope.go:117] "RemoveContainer" containerID="efca497eaa2387782b2dcd379186015fe7ef578c950f2d847acc2e88bdda90f8"
Dec 02 09:27:30 crc kubenswrapper[4781]: E1202 09:27:30.882973 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efca497eaa2387782b2dcd379186015fe7ef578c950f2d847acc2e88bdda90f8\": container with ID starting with efca497eaa2387782b2dcd379186015fe7ef578c950f2d847acc2e88bdda90f8 not found: ID does not exist" containerID="efca497eaa2387782b2dcd379186015fe7ef578c950f2d847acc2e88bdda90f8"
Dec 02 09:27:30 crc kubenswrapper[4781]: I1202 09:27:30.882999 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efca497eaa2387782b2dcd379186015fe7ef578c950f2d847acc2e88bdda90f8"} err="failed to get container status \"efca497eaa2387782b2dcd379186015fe7ef578c950f2d847acc2e88bdda90f8\": rpc error: code = NotFound desc = could not find container \"efca497eaa2387782b2dcd379186015fe7ef578c950f2d847acc2e88bdda90f8\": container with ID starting with efca497eaa2387782b2dcd379186015fe7ef578c950f2d847acc2e88bdda90f8 not found: ID does not exist"
Dec 02 09:27:30 crc kubenswrapper[4781]: I1202 09:27:30.883013 4781 scope.go:117] "RemoveContainer" containerID="1d8977f6fa5964fd9ed6bab3633602200974f6a082e5197d046c86e9771a0e9b"
Dec 02 09:27:30 crc kubenswrapper[4781]: E1202 09:27:30.883241 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d8977f6fa5964fd9ed6bab3633602200974f6a082e5197d046c86e9771a0e9b\": container with ID starting with 1d8977f6fa5964fd9ed6bab3633602200974f6a082e5197d046c86e9771a0e9b not found: ID does not exist" containerID="1d8977f6fa5964fd9ed6bab3633602200974f6a082e5197d046c86e9771a0e9b"
Dec 02 09:27:30 crc kubenswrapper[4781]: I1202 09:27:30.883269 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d8977f6fa5964fd9ed6bab3633602200974f6a082e5197d046c86e9771a0e9b"} err="failed to get container status \"1d8977f6fa5964fd9ed6bab3633602200974f6a082e5197d046c86e9771a0e9b\": rpc error: code = NotFound desc = could not find container \"1d8977f6fa5964fd9ed6bab3633602200974f6a082e5197d046c86e9771a0e9b\": container with ID starting with 1d8977f6fa5964fd9ed6bab3633602200974f6a082e5197d046c86e9771a0e9b not found: ID does not exist"
Dec 02 09:27:31 crc kubenswrapper[4781]: I1202 09:27:31.505755 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a0d43df-4480-4b62-bd3d-d129fbdcd722" path="/var/lib/kubelet/pods/0a0d43df-4480-4b62-bd3d-d129fbdcd722/volumes"
volumes dir" podUID="0a0d43df-4480-4b62-bd3d-d129fbdcd722" path="/var/lib/kubelet/pods/0a0d43df-4480-4b62-bd3d-d129fbdcd722/volumes" Dec 02 09:27:31 crc kubenswrapper[4781]: I1202 09:27:31.506440 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f37a1838-a130-4cb4-807f-2214cbfefdbf" path="/var/lib/kubelet/pods/f37a1838-a130-4cb4-807f-2214cbfefdbf/volumes" Dec 02 09:27:34 crc kubenswrapper[4781]: I1202 09:27:34.109686 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-g6kvc" Dec 02 09:27:34 crc kubenswrapper[4781]: I1202 09:27:34.573851 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-785fdfb9bc-x98w2"] Dec 02 09:27:34 crc kubenswrapper[4781]: I1202 09:27:34.574626 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-785fdfb9bc-x98w2" podUID="0637fa1d-347d-4772-b199-4542ce66909c" containerName="route-controller-manager" containerID="cri-o://73da9b5c5d0f3dea994abb2c4b0ece6f304d51b1816084ebdcee1a472e5aae6a" gracePeriod=30 Dec 02 09:27:35 crc kubenswrapper[4781]: I1202 09:27:35.859666 4781 generic.go:334] "Generic (PLEG): container finished" podID="0637fa1d-347d-4772-b199-4542ce66909c" containerID="73da9b5c5d0f3dea994abb2c4b0ece6f304d51b1816084ebdcee1a472e5aae6a" exitCode=0 Dec 02 09:27:35 crc kubenswrapper[4781]: I1202 09:27:35.859761 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-785fdfb9bc-x98w2" event={"ID":"0637fa1d-347d-4772-b199-4542ce66909c","Type":"ContainerDied","Data":"73da9b5c5d0f3dea994abb2c4b0ece6f304d51b1816084ebdcee1a472e5aae6a"} Dec 02 09:27:36 crc kubenswrapper[4781]: I1202 09:27:36.042789 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-785fdfb9bc-x98w2" Dec 02 09:27:36 crc kubenswrapper[4781]: I1202 09:27:36.069429 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77fd659786-z9p9f"] Dec 02 09:27:36 crc kubenswrapper[4781]: E1202 09:27:36.069681 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a0d43df-4480-4b62-bd3d-d129fbdcd722" containerName="extract-utilities" Dec 02 09:27:36 crc kubenswrapper[4781]: I1202 09:27:36.069695 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a0d43df-4480-4b62-bd3d-d129fbdcd722" containerName="extract-utilities" Dec 02 09:27:36 crc kubenswrapper[4781]: E1202 09:27:36.069708 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a0d43df-4480-4b62-bd3d-d129fbdcd722" containerName="extract-content" Dec 02 09:27:36 crc kubenswrapper[4781]: I1202 09:27:36.069716 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a0d43df-4480-4b62-bd3d-d129fbdcd722" containerName="extract-content" Dec 02 09:27:36 crc kubenswrapper[4781]: E1202 09:27:36.069726 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f37a1838-a130-4cb4-807f-2214cbfefdbf" containerName="registry-server" Dec 02 09:27:36 crc kubenswrapper[4781]: I1202 09:27:36.069733 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f37a1838-a130-4cb4-807f-2214cbfefdbf" containerName="registry-server" Dec 02 09:27:36 crc kubenswrapper[4781]: E1202 09:27:36.069748 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0637fa1d-347d-4772-b199-4542ce66909c" containerName="route-controller-manager" Dec 02 09:27:36 crc kubenswrapper[4781]: I1202 09:27:36.069753 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="0637fa1d-347d-4772-b199-4542ce66909c" containerName="route-controller-manager" Dec 02 09:27:36 crc kubenswrapper[4781]: E1202 09:27:36.069764 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a0d43df-4480-4b62-bd3d-d129fbdcd722" containerName="registry-server" Dec 02 09:27:36 crc kubenswrapper[4781]: I1202 09:27:36.069770 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a0d43df-4480-4b62-bd3d-d129fbdcd722" containerName="registry-server" Dec 02 09:27:36 crc kubenswrapper[4781]: E1202 09:27:36.069778 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f37a1838-a130-4cb4-807f-2214cbfefdbf" containerName="extract-utilities" Dec 02 09:27:36 crc kubenswrapper[4781]: I1202 09:27:36.069784 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f37a1838-a130-4cb4-807f-2214cbfefdbf" containerName="extract-utilities" Dec 02 09:27:36 crc kubenswrapper[4781]: E1202 09:27:36.069794 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f37a1838-a130-4cb4-807f-2214cbfefdbf" containerName="extract-content" Dec 02 09:27:36 crc kubenswrapper[4781]: I1202 09:27:36.069799 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f37a1838-a130-4cb4-807f-2214cbfefdbf" containerName="extract-content" Dec 02 09:27:36 crc kubenswrapper[4781]: I1202 09:27:36.069883 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a0d43df-4480-4b62-bd3d-d129fbdcd722" containerName="registry-server" Dec 02 09:27:36 crc kubenswrapper[4781]: I1202 09:27:36.069900 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="0637fa1d-347d-4772-b199-4542ce66909c" containerName="route-controller-manager" Dec 02 09:27:36 crc 
kubenswrapper[4781]: I1202 09:27:36.069908 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f37a1838-a130-4cb4-807f-2214cbfefdbf" containerName="registry-server" Dec 02 09:27:36 crc kubenswrapper[4781]: I1202 09:27:36.070283 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77fd659786-z9p9f" Dec 02 09:27:36 crc kubenswrapper[4781]: I1202 09:27:36.083229 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77fd659786-z9p9f"] Dec 02 09:27:36 crc kubenswrapper[4781]: I1202 09:27:36.190914 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0637fa1d-347d-4772-b199-4542ce66909c-client-ca\") pod \"0637fa1d-347d-4772-b199-4542ce66909c\" (UID: \"0637fa1d-347d-4772-b199-4542ce66909c\") " Dec 02 09:27:36 crc kubenswrapper[4781]: I1202 09:27:36.191036 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0637fa1d-347d-4772-b199-4542ce66909c-serving-cert\") pod \"0637fa1d-347d-4772-b199-4542ce66909c\" (UID: \"0637fa1d-347d-4772-b199-4542ce66909c\") " Dec 02 09:27:36 crc kubenswrapper[4781]: I1202 09:27:36.191115 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wt2qt\" (UniqueName: \"kubernetes.io/projected/0637fa1d-347d-4772-b199-4542ce66909c-kube-api-access-wt2qt\") pod \"0637fa1d-347d-4772-b199-4542ce66909c\" (UID: \"0637fa1d-347d-4772-b199-4542ce66909c\") " Dec 02 09:27:36 crc kubenswrapper[4781]: I1202 09:27:36.191147 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0637fa1d-347d-4772-b199-4542ce66909c-config\") pod \"0637fa1d-347d-4772-b199-4542ce66909c\" (UID: \"0637fa1d-347d-4772-b199-4542ce66909c\") " Dec 02 09:27:36 crc kubenswrapper[4781]: I1202 09:27:36.191337 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eaa197f5-e446-49ad-9b27-27ee170daa61-config\") pod \"route-controller-manager-77fd659786-z9p9f\" (UID: \"eaa197f5-e446-49ad-9b27-27ee170daa61\") " pod="openshift-route-controller-manager/route-controller-manager-77fd659786-z9p9f" Dec 02 09:27:36 crc kubenswrapper[4781]: I1202 09:27:36.191462 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqz6c\" (UniqueName: \"kubernetes.io/projected/eaa197f5-e446-49ad-9b27-27ee170daa61-kube-api-access-nqz6c\") pod \"route-controller-manager-77fd659786-z9p9f\" (UID: \"eaa197f5-e446-49ad-9b27-27ee170daa61\") " pod="openshift-route-controller-manager/route-controller-manager-77fd659786-z9p9f" Dec 02 09:27:36 crc kubenswrapper[4781]: I1202 09:27:36.191495 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eaa197f5-e446-49ad-9b27-27ee170daa61-serving-cert\") pod \"route-controller-manager-77fd659786-z9p9f\" (UID: \"eaa197f5-e446-49ad-9b27-27ee170daa61\") " pod="openshift-route-controller-manager/route-controller-manager-77fd659786-z9p9f" Dec 02 09:27:36 crc kubenswrapper[4781]: I1202 09:27:36.191534 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/eaa197f5-e446-49ad-9b27-27ee170daa61-client-ca\") pod \"route-controller-manager-77fd659786-z9p9f\" (UID: \"eaa197f5-e446-49ad-9b27-27ee170daa61\") " pod="openshift-route-controller-manager/route-controller-manager-77fd659786-z9p9f" Dec 02 09:27:36 crc kubenswrapper[4781]: I1202 09:27:36.192068 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0637fa1d-347d-4772-b199-4542ce66909c-config" (OuterVolumeSpecName: "config") pod "0637fa1d-347d-4772-b199-4542ce66909c" (UID: "0637fa1d-347d-4772-b199-4542ce66909c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:27:36 crc kubenswrapper[4781]: I1202 09:27:36.192118 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0637fa1d-347d-4772-b199-4542ce66909c-client-ca" (OuterVolumeSpecName: "client-ca") pod "0637fa1d-347d-4772-b199-4542ce66909c" (UID: "0637fa1d-347d-4772-b199-4542ce66909c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:27:36 crc kubenswrapper[4781]: I1202 09:27:36.195831 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0637fa1d-347d-4772-b199-4542ce66909c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0637fa1d-347d-4772-b199-4542ce66909c" (UID: "0637fa1d-347d-4772-b199-4542ce66909c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:27:36 crc kubenswrapper[4781]: I1202 09:27:36.195849 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0637fa1d-347d-4772-b199-4542ce66909c-kube-api-access-wt2qt" (OuterVolumeSpecName: "kube-api-access-wt2qt") pod "0637fa1d-347d-4772-b199-4542ce66909c" (UID: "0637fa1d-347d-4772-b199-4542ce66909c"). InnerVolumeSpecName "kube-api-access-wt2qt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:27:36 crc kubenswrapper[4781]: I1202 09:27:36.293012 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqz6c\" (UniqueName: \"kubernetes.io/projected/eaa197f5-e446-49ad-9b27-27ee170daa61-kube-api-access-nqz6c\") pod \"route-controller-manager-77fd659786-z9p9f\" (UID: \"eaa197f5-e446-49ad-9b27-27ee170daa61\") " pod="openshift-route-controller-manager/route-controller-manager-77fd659786-z9p9f" Dec 02 09:27:36 crc kubenswrapper[4781]: I1202 09:27:36.293053 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eaa197f5-e446-49ad-9b27-27ee170daa61-serving-cert\") pod \"route-controller-manager-77fd659786-z9p9f\" (UID: \"eaa197f5-e446-49ad-9b27-27ee170daa61\") " pod="openshift-route-controller-manager/route-controller-manager-77fd659786-z9p9f" Dec 02 09:27:36 crc kubenswrapper[4781]: I1202 09:27:36.293085 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eaa197f5-e446-49ad-9b27-27ee170daa61-client-ca\") pod \"route-controller-manager-77fd659786-z9p9f\" (UID: \"eaa197f5-e446-49ad-9b27-27ee170daa61\") " pod="openshift-route-controller-manager/route-controller-manager-77fd659786-z9p9f" Dec 02 09:27:36 crc kubenswrapper[4781]: I1202 09:27:36.293113 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eaa197f5-e446-49ad-9b27-27ee170daa61-config\") pod \"route-controller-manager-77fd659786-z9p9f\" (UID: \"eaa197f5-e446-49ad-9b27-27ee170daa61\") " pod="openshift-route-controller-manager/route-controller-manager-77fd659786-z9p9f" Dec 02 09:27:36 crc kubenswrapper[4781]: I1202 09:27:36.293173 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0637fa1d-347d-4772-b199-4542ce66909c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 09:27:36 crc kubenswrapper[4781]: I1202 09:27:36.293184 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wt2qt\" (UniqueName: \"kubernetes.io/projected/0637fa1d-347d-4772-b199-4542ce66909c-kube-api-access-wt2qt\") on node \"crc\" DevicePath \"\"" Dec 02 09:27:36 crc kubenswrapper[4781]: I1202 09:27:36.293196 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0637fa1d-347d-4772-b199-4542ce66909c-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:27:36 crc kubenswrapper[4781]: I1202 09:27:36.293204 4781 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0637fa1d-347d-4772-b199-4542ce66909c-client-ca\") on node \"crc\" DevicePath \"\"" Dec 02 09:27:36 crc kubenswrapper[4781]: I1202 09:27:36.294297 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eaa197f5-e446-49ad-9b27-27ee170daa61-client-ca\") pod \"route-controller-manager-77fd659786-z9p9f\" (UID: \"eaa197f5-e446-49ad-9b27-27ee170daa61\") " pod="openshift-route-controller-manager/route-controller-manager-77fd659786-z9p9f" Dec 02 09:27:36 crc kubenswrapper[4781]: I1202 09:27:36.294361 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eaa197f5-e446-49ad-9b27-27ee170daa61-config\") pod \"route-controller-manager-77fd659786-z9p9f\" 
(UID: \"eaa197f5-e446-49ad-9b27-27ee170daa61\") " pod="openshift-route-controller-manager/route-controller-manager-77fd659786-z9p9f" Dec 02 09:27:36 crc kubenswrapper[4781]: I1202 09:27:36.298417 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eaa197f5-e446-49ad-9b27-27ee170daa61-serving-cert\") pod \"route-controller-manager-77fd659786-z9p9f\" (UID: \"eaa197f5-e446-49ad-9b27-27ee170daa61\") " pod="openshift-route-controller-manager/route-controller-manager-77fd659786-z9p9f" Dec 02 09:27:36 crc kubenswrapper[4781]: I1202 09:27:36.311384 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqz6c\" (UniqueName: \"kubernetes.io/projected/eaa197f5-e446-49ad-9b27-27ee170daa61-kube-api-access-nqz6c\") pod \"route-controller-manager-77fd659786-z9p9f\" (UID: \"eaa197f5-e446-49ad-9b27-27ee170daa61\") " pod="openshift-route-controller-manager/route-controller-manager-77fd659786-z9p9f" Dec 02 09:27:36 crc kubenswrapper[4781]: I1202 09:27:36.388371 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77fd659786-z9p9f" Dec 02 09:27:36 crc kubenswrapper[4781]: I1202 09:27:36.788171 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77fd659786-z9p9f"] Dec 02 09:27:36 crc kubenswrapper[4781]: W1202 09:27:36.792890 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeaa197f5_e446_49ad_9b27_27ee170daa61.slice/crio-5f9390a39d98080d841a50556e1ac07deb94da0640a5942d51f772d5c579a316 WatchSource:0}: Error finding container 5f9390a39d98080d841a50556e1ac07deb94da0640a5942d51f772d5c579a316: Status 404 returned error can't find the container with id 5f9390a39d98080d841a50556e1ac07deb94da0640a5942d51f772d5c579a316 Dec 02 09:27:36 crc kubenswrapper[4781]: I1202 09:27:36.867069 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-785fdfb9bc-x98w2" event={"ID":"0637fa1d-347d-4772-b199-4542ce66909c","Type":"ContainerDied","Data":"ee02049efbcd0ff004b7ba56cc6c7de1b63e774f5eb931c3e38f883b59df1aa2"} Dec 02 09:27:36 crc kubenswrapper[4781]: I1202 09:27:36.867147 4781 scope.go:117] "RemoveContainer" containerID="73da9b5c5d0f3dea994abb2c4b0ece6f304d51b1816084ebdcee1a472e5aae6a" Dec 02 09:27:36 crc kubenswrapper[4781]: I1202 09:27:36.868015 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-785fdfb9bc-x98w2" Dec 02 09:27:36 crc kubenswrapper[4781]: I1202 09:27:36.868972 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-77fd659786-z9p9f" event={"ID":"eaa197f5-e446-49ad-9b27-27ee170daa61","Type":"ContainerStarted","Data":"5f9390a39d98080d841a50556e1ac07deb94da0640a5942d51f772d5c579a316"} Dec 02 09:27:36 crc kubenswrapper[4781]: I1202 09:27:36.895608 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-785fdfb9bc-x98w2"] Dec 02 09:27:36 crc kubenswrapper[4781]: I1202 09:27:36.902120 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-785fdfb9bc-x98w2"] Dec 02 09:27:37 crc kubenswrapper[4781]: I1202 09:27:37.507269 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0637fa1d-347d-4772-b199-4542ce66909c" path="/var/lib/kubelet/pods/0637fa1d-347d-4772-b199-4542ce66909c/volumes" Dec 02 09:27:37 crc kubenswrapper[4781]: I1202 09:27:37.881482 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-77fd659786-z9p9f" event={"ID":"eaa197f5-e446-49ad-9b27-27ee170daa61","Type":"ContainerStarted","Data":"61c886317cf7e8d6d01a2e346da5f7f787933447753a35817162d9d084169704"} Dec 02 09:27:37 crc kubenswrapper[4781]: I1202 09:27:37.882006 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-77fd659786-z9p9f" Dec 02 09:27:37 crc kubenswrapper[4781]: I1202 09:27:37.887596 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-77fd659786-z9p9f" Dec 02 09:27:37 crc kubenswrapper[4781]: I1202 09:27:37.903169 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-77fd659786-z9p9f" podStartSLOduration=3.903148354 podStartE2EDuration="3.903148354s" podCreationTimestamp="2025-12-02 09:27:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:27:37.898653523 +0000 UTC m=+420.722527412" watchObservedRunningTime="2025-12-02 09:27:37.903148354 +0000 UTC m=+420.727022233" Dec 02 09:27:44 crc kubenswrapper[4781]: I1202 09:27:44.113868 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-g6kvc" Dec 02 09:27:44 crc kubenswrapper[4781]: I1202 09:27:44.191649 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-s95hx"] Dec 02 09:27:54 crc kubenswrapper[4781]: I1202 09:27:54.566048 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5c8985447-lrspd"] Dec 02 09:27:54 crc kubenswrapper[4781]: I1202 09:27:54.566769 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5c8985447-lrspd" podUID="8b6358d4-1fac-471b-b465-74f1531f87a1" containerName="controller-manager" containerID="cri-o://ee8e5a5fff3ebebc79f516f255f6a0b3e5a677f0c98b916c1a5325c19f9ec9a6" gracePeriod=30 Dec 02 09:27:57 crc kubenswrapper[4781]: I1202 09:27:57.001072 4781 generic.go:334] 
"Generic (PLEG): container finished" podID="8b6358d4-1fac-471b-b465-74f1531f87a1" containerID="ee8e5a5fff3ebebc79f516f255f6a0b3e5a677f0c98b916c1a5325c19f9ec9a6" exitCode=0 Dec 02 09:27:57 crc kubenswrapper[4781]: I1202 09:27:57.001169 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c8985447-lrspd" event={"ID":"8b6358d4-1fac-471b-b465-74f1531f87a1","Type":"ContainerDied","Data":"ee8e5a5fff3ebebc79f516f255f6a0b3e5a677f0c98b916c1a5325c19f9ec9a6"} Dec 02 09:27:57 crc kubenswrapper[4781]: I1202 09:27:57.877180 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5c8985447-lrspd" Dec 02 09:27:57 crc kubenswrapper[4781]: I1202 09:27:57.909991 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7bc5fc4fbc-rjlj9"] Dec 02 09:27:57 crc kubenswrapper[4781]: E1202 09:27:57.910223 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b6358d4-1fac-471b-b465-74f1531f87a1" containerName="controller-manager" Dec 02 09:27:57 crc kubenswrapper[4781]: I1202 09:27:57.910238 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b6358d4-1fac-471b-b465-74f1531f87a1" containerName="controller-manager" Dec 02 09:27:57 crc kubenswrapper[4781]: I1202 09:27:57.910372 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b6358d4-1fac-471b-b465-74f1531f87a1" containerName="controller-manager" Dec 02 09:27:57 crc kubenswrapper[4781]: I1202 09:27:57.910761 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7bc5fc4fbc-rjlj9" Dec 02 09:27:57 crc kubenswrapper[4781]: I1202 09:27:57.930136 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7bc5fc4fbc-rjlj9"] Dec 02 09:27:58 crc kubenswrapper[4781]: I1202 09:27:58.004779 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9wxp\" (UniqueName: \"kubernetes.io/projected/8b6358d4-1fac-471b-b465-74f1531f87a1-kube-api-access-w9wxp\") pod \"8b6358d4-1fac-471b-b465-74f1531f87a1\" (UID: \"8b6358d4-1fac-471b-b465-74f1531f87a1\") " Dec 02 09:27:58 crc kubenswrapper[4781]: I1202 09:27:58.004820 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8b6358d4-1fac-471b-b465-74f1531f87a1-client-ca\") pod \"8b6358d4-1fac-471b-b465-74f1531f87a1\" (UID: \"8b6358d4-1fac-471b-b465-74f1531f87a1\") " Dec 02 09:27:58 crc kubenswrapper[4781]: I1202 09:27:58.004853 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8b6358d4-1fac-471b-b465-74f1531f87a1-proxy-ca-bundles\") pod \"8b6358d4-1fac-471b-b465-74f1531f87a1\" (UID: \"8b6358d4-1fac-471b-b465-74f1531f87a1\") " Dec 02 09:27:58 crc kubenswrapper[4781]: I1202 09:27:58.004886 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b6358d4-1fac-471b-b465-74f1531f87a1-serving-cert\") pod \"8b6358d4-1fac-471b-b465-74f1531f87a1\" (UID: \"8b6358d4-1fac-471b-b465-74f1531f87a1\") " Dec 02 09:27:58 crc kubenswrapper[4781]: I1202 09:27:58.004941 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8b6358d4-1fac-471b-b465-74f1531f87a1-config\") pod \"8b6358d4-1fac-471b-b465-74f1531f87a1\" (UID: \"8b6358d4-1fac-471b-b465-74f1531f87a1\") " Dec 02 09:27:58 crc kubenswrapper[4781]: I1202 09:27:58.005136 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/10e00d86-1ab4-4fb3-b368-52b133e6b553-proxy-ca-bundles\") pod \"controller-manager-7bc5fc4fbc-rjlj9\" (UID: \"10e00d86-1ab4-4fb3-b368-52b133e6b553\") " pod="openshift-controller-manager/controller-manager-7bc5fc4fbc-rjlj9" Dec 02 09:27:58 crc kubenswrapper[4781]: I1202 09:27:58.005169 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10e00d86-1ab4-4fb3-b368-52b133e6b553-serving-cert\") pod \"controller-manager-7bc5fc4fbc-rjlj9\" (UID: \"10e00d86-1ab4-4fb3-b368-52b133e6b553\") " pod="openshift-controller-manager/controller-manager-7bc5fc4fbc-rjlj9" Dec 02 09:27:58 crc kubenswrapper[4781]: I1202 09:27:58.005212 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbvjl\" (UniqueName: \"kubernetes.io/projected/10e00d86-1ab4-4fb3-b368-52b133e6b553-kube-api-access-xbvjl\") pod \"controller-manager-7bc5fc4fbc-rjlj9\" (UID: \"10e00d86-1ab4-4fb3-b368-52b133e6b553\") " pod="openshift-controller-manager/controller-manager-7bc5fc4fbc-rjlj9" Dec 02 09:27:58 crc kubenswrapper[4781]: I1202 09:27:58.005276 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/10e00d86-1ab4-4fb3-b368-52b133e6b553-client-ca\") pod \"controller-manager-7bc5fc4fbc-rjlj9\" (UID: \"10e00d86-1ab4-4fb3-b368-52b133e6b553\") " pod="openshift-controller-manager/controller-manager-7bc5fc4fbc-rjlj9" Dec 02 09:27:58 crc kubenswrapper[4781]: I1202 09:27:58.005306 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10e00d86-1ab4-4fb3-b368-52b133e6b553-config\") pod \"controller-manager-7bc5fc4fbc-rjlj9\" (UID: \"10e00d86-1ab4-4fb3-b368-52b133e6b553\") " pod="openshift-controller-manager/controller-manager-7bc5fc4fbc-rjlj9" Dec 02 09:27:58 crc kubenswrapper[4781]: I1202 09:27:58.006245 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b6358d4-1fac-471b-b465-74f1531f87a1-client-ca" (OuterVolumeSpecName: "client-ca") pod "8b6358d4-1fac-471b-b465-74f1531f87a1" (UID: "8b6358d4-1fac-471b-b465-74f1531f87a1"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:27:58 crc kubenswrapper[4781]: I1202 09:27:58.006259 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b6358d4-1fac-471b-b465-74f1531f87a1-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "8b6358d4-1fac-471b-b465-74f1531f87a1" (UID: "8b6358d4-1fac-471b-b465-74f1531f87a1"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:27:58 crc kubenswrapper[4781]: I1202 09:27:58.006422 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b6358d4-1fac-471b-b465-74f1531f87a1-config" (OuterVolumeSpecName: "config") pod "8b6358d4-1fac-471b-b465-74f1531f87a1" (UID: "8b6358d4-1fac-471b-b465-74f1531f87a1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:27:58 crc kubenswrapper[4781]: I1202 09:27:58.008943 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c8985447-lrspd" event={"ID":"8b6358d4-1fac-471b-b465-74f1531f87a1","Type":"ContainerDied","Data":"3a352510e8e4e7a4a845d3dfdd70be5eb4f5649b29fc61b0ab78166f018e7ef5"} Dec 02 09:27:58 crc kubenswrapper[4781]: I1202 09:27:58.008991 4781 scope.go:117] "RemoveContainer" containerID="ee8e5a5fff3ebebc79f516f255f6a0b3e5a677f0c98b916c1a5325c19f9ec9a6" Dec 02 09:27:58 crc kubenswrapper[4781]: I1202 09:27:58.008987 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5c8985447-lrspd" Dec 02 09:27:58 crc kubenswrapper[4781]: I1202 09:27:58.011353 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b6358d4-1fac-471b-b465-74f1531f87a1-kube-api-access-w9wxp" (OuterVolumeSpecName: "kube-api-access-w9wxp") pod "8b6358d4-1fac-471b-b465-74f1531f87a1" (UID: "8b6358d4-1fac-471b-b465-74f1531f87a1"). InnerVolumeSpecName "kube-api-access-w9wxp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:27:58 crc kubenswrapper[4781]: I1202 09:27:58.014028 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b6358d4-1fac-471b-b465-74f1531f87a1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8b6358d4-1fac-471b-b465-74f1531f87a1" (UID: "8b6358d4-1fac-471b-b465-74f1531f87a1"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:27:58 crc kubenswrapper[4781]: I1202 09:27:58.106030 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10e00d86-1ab4-4fb3-b368-52b133e6b553-config\") pod \"controller-manager-7bc5fc4fbc-rjlj9\" (UID: \"10e00d86-1ab4-4fb3-b368-52b133e6b553\") " pod="openshift-controller-manager/controller-manager-7bc5fc4fbc-rjlj9" Dec 02 09:27:58 crc kubenswrapper[4781]: I1202 09:27:58.106076 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/10e00d86-1ab4-4fb3-b368-52b133e6b553-proxy-ca-bundles\") pod \"controller-manager-7bc5fc4fbc-rjlj9\" (UID: \"10e00d86-1ab4-4fb3-b368-52b133e6b553\") " pod="openshift-controller-manager/controller-manager-7bc5fc4fbc-rjlj9" Dec 02 09:27:58 crc kubenswrapper[4781]: I1202 09:27:58.106099 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10e00d86-1ab4-4fb3-b368-52b133e6b553-serving-cert\") pod \"controller-manager-7bc5fc4fbc-rjlj9\" (UID: \"10e00d86-1ab4-4fb3-b368-52b133e6b553\") " pod="openshift-controller-manager/controller-manager-7bc5fc4fbc-rjlj9" Dec 02 09:27:58 crc kubenswrapper[4781]: I1202 09:27:58.106134 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbvjl\" (UniqueName: \"kubernetes.io/projected/10e00d86-1ab4-4fb3-b368-52b133e6b553-kube-api-access-xbvjl\") pod \"controller-manager-7bc5fc4fbc-rjlj9\" (UID: \"10e00d86-1ab4-4fb3-b368-52b133e6b553\") " pod="openshift-controller-manager/controller-manager-7bc5fc4fbc-rjlj9" Dec 02 09:27:58 crc kubenswrapper[4781]: I1202 09:27:58.106178 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/10e00d86-1ab4-4fb3-b368-52b133e6b553-client-ca\") pod \"controller-manager-7bc5fc4fbc-rjlj9\" (UID: \"10e00d86-1ab4-4fb3-b368-52b133e6b553\") " pod="openshift-controller-manager/controller-manager-7bc5fc4fbc-rjlj9" Dec 02 09:27:58 crc kubenswrapper[4781]: I1202 09:27:58.106841 4781 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8b6358d4-1fac-471b-b465-74f1531f87a1-client-ca\") on node \"crc\" DevicePath \"\"" Dec 02 09:27:58 crc kubenswrapper[4781]: I1202 09:27:58.107021 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9wxp\" (UniqueName: \"kubernetes.io/projected/8b6358d4-1fac-471b-b465-74f1531f87a1-kube-api-access-w9wxp\") on node \"crc\" DevicePath \"\"" Dec 02 09:27:58 crc kubenswrapper[4781]: I1202 09:27:58.107042 4781 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8b6358d4-1fac-471b-b465-74f1531f87a1-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 02 09:27:58 crc kubenswrapper[4781]: I1202 09:27:58.107057 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b6358d4-1fac-471b-b465-74f1531f87a1-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 09:27:58 crc kubenswrapper[4781]: I1202 09:27:58.107069 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b6358d4-1fac-471b-b465-74f1531f87a1-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:27:58 crc kubenswrapper[4781]: I1202 09:27:58.108085 4781 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/10e00d86-1ab4-4fb3-b368-52b133e6b553-client-ca\") pod \"controller-manager-7bc5fc4fbc-rjlj9\" (UID: \"10e00d86-1ab4-4fb3-b368-52b133e6b553\") " pod="openshift-controller-manager/controller-manager-7bc5fc4fbc-rjlj9" Dec 02 09:27:58 crc kubenswrapper[4781]: I1202 09:27:58.108381 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10e00d86-1ab4-4fb3-b368-52b133e6b553-config\") pod \"controller-manager-7bc5fc4fbc-rjlj9\" (UID: \"10e00d86-1ab4-4fb3-b368-52b133e6b553\") " pod="openshift-controller-manager/controller-manager-7bc5fc4fbc-rjlj9" Dec 02 09:27:58 crc kubenswrapper[4781]: I1202 09:27:58.109435 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/10e00d86-1ab4-4fb3-b368-52b133e6b553-proxy-ca-bundles\") pod \"controller-manager-7bc5fc4fbc-rjlj9\" (UID: \"10e00d86-1ab4-4fb3-b368-52b133e6b553\") " pod="openshift-controller-manager/controller-manager-7bc5fc4fbc-rjlj9" Dec 02 09:27:58 crc kubenswrapper[4781]: I1202 09:27:58.110346 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10e00d86-1ab4-4fb3-b368-52b133e6b553-serving-cert\") pod \"controller-manager-7bc5fc4fbc-rjlj9\" (UID: \"10e00d86-1ab4-4fb3-b368-52b133e6b553\") " pod="openshift-controller-manager/controller-manager-7bc5fc4fbc-rjlj9" Dec 02 09:27:58 crc kubenswrapper[4781]: I1202 09:27:58.125501 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbvjl\" (UniqueName: \"kubernetes.io/projected/10e00d86-1ab4-4fb3-b368-52b133e6b553-kube-api-access-xbvjl\") pod \"controller-manager-7bc5fc4fbc-rjlj9\" (UID: \"10e00d86-1ab4-4fb3-b368-52b133e6b553\") " pod="openshift-controller-manager/controller-manager-7bc5fc4fbc-rjlj9" Dec 02 09:27:58 crc kubenswrapper[4781]: I1202 09:27:58.230948 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7bc5fc4fbc-rjlj9" Dec 02 09:27:58 crc kubenswrapper[4781]: I1202 09:27:58.347093 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5c8985447-lrspd"] Dec 02 09:27:58 crc kubenswrapper[4781]: I1202 09:27:58.350665 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5c8985447-lrspd"] Dec 02 09:27:58 crc kubenswrapper[4781]: W1202 09:27:58.630989 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10e00d86_1ab4_4fb3_b368_52b133e6b553.slice/crio-06e781efe04da0a7f79a2debf5ed703db4ca461a0d903409d4b909fd0a35bf1f WatchSource:0}: Error finding container 06e781efe04da0a7f79a2debf5ed703db4ca461a0d903409d4b909fd0a35bf1f: Status 404 returned error can't find the container with id 06e781efe04da0a7f79a2debf5ed703db4ca461a0d903409d4b909fd0a35bf1f Dec 02 09:27:58 crc kubenswrapper[4781]: I1202 09:27:58.636985 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7bc5fc4fbc-rjlj9"] Dec 02 09:27:59 crc kubenswrapper[4781]: I1202 09:27:59.022866 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bc5fc4fbc-rjlj9" event={"ID":"10e00d86-1ab4-4fb3-b368-52b133e6b553","Type":"ContainerStarted","Data":"0976731de2c965962909268b7a27d1cded0c75f7085a8c8320ac790089b095a8"} Dec 02 09:27:59 crc kubenswrapper[4781]: I1202 09:27:59.022913 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bc5fc4fbc-rjlj9" event={"ID":"10e00d86-1ab4-4fb3-b368-52b133e6b553","Type":"ContainerStarted","Data":"06e781efe04da0a7f79a2debf5ed703db4ca461a0d903409d4b909fd0a35bf1f"} Dec 02 09:27:59 crc kubenswrapper[4781]: I1202 09:27:59.024103 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7bc5fc4fbc-rjlj9" Dec 02 09:27:59 crc kubenswrapper[4781]: I1202 09:27:59.029565 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7bc5fc4fbc-rjlj9" Dec 02 09:27:59 crc kubenswrapper[4781]: I1202 09:27:59.054030 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7bc5fc4fbc-rjlj9" podStartSLOduration=5.054014327 podStartE2EDuration="5.054014327s" podCreationTimestamp="2025-12-02 09:27:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:27:59.052587109 +0000 UTC m=+441.876460988" watchObservedRunningTime="2025-12-02 09:27:59.054014327 +0000 UTC m=+441.877888206" Dec 02 09:27:59 crc kubenswrapper[4781]: I1202 09:27:59.506176 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b6358d4-1fac-471b-b465-74f1531f87a1" path="/var/lib/kubelet/pods/8b6358d4-1fac-471b-b465-74f1531f87a1/volumes" Dec 02 09:28:00 crc kubenswrapper[4781]: I1202 09:28:00.412596 4781 patch_prober.go:28] interesting pod/machine-config-daemon-pzntm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 09:28:00 crc kubenswrapper[4781]: I1202 
09:28:00.412664 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 09:28:00 crc kubenswrapper[4781]: I1202 09:28:00.412714 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" Dec 02 09:28:00 crc kubenswrapper[4781]: I1202 09:28:00.413269 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ebcdebd6c3481dcb7b7c492991920a861406e047665bbf99c0df4f3ff4534e5b"} pod="openshift-machine-config-operator/machine-config-daemon-pzntm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 09:28:00 crc kubenswrapper[4781]: I1202 09:28:00.413335 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" containerID="cri-o://ebcdebd6c3481dcb7b7c492991920a861406e047665bbf99c0df4f3ff4534e5b" gracePeriod=600 Dec 02 09:28:01 crc kubenswrapper[4781]: I1202 09:28:01.034376 4781 generic.go:334] "Generic (PLEG): container finished" podID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerID="ebcdebd6c3481dcb7b7c492991920a861406e047665bbf99c0df4f3ff4534e5b" exitCode=0 Dec 02 09:28:01 crc kubenswrapper[4781]: I1202 09:28:01.034616 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" event={"ID":"e10258da-dad3-4df8-82c2-9d9438493a3d","Type":"ContainerDied","Data":"ebcdebd6c3481dcb7b7c492991920a861406e047665bbf99c0df4f3ff4534e5b"} Dec 02 09:28:01 crc kubenswrapper[4781]: I1202 09:28:01.035427 4781 scope.go:117] "RemoveContainer" containerID="5c19e02f947d0d75971d14d988b7c3119365b6ba5348290ee0e23e115e14780f" Dec 02 09:28:01 crc kubenswrapper[4781]: I1202 09:28:01.894560 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-844r4"] Dec 02 09:28:01 crc kubenswrapper[4781]: I1202 09:28:01.895339 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-844r4" podUID="b28a19d5-e49e-46f0-942d-dc9f96777c2d" containerName="registry-server" containerID="cri-o://88a12b393ba226860379c973a8763acbf05e3bb790f5d35eec7ad9ddd42b07d9" gracePeriod=30 Dec 02 09:28:01 crc kubenswrapper[4781]: I1202 09:28:01.912509 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ntjsd"] Dec 02 09:28:01 crc kubenswrapper[4781]: I1202 09:28:01.912726 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ntjsd" podUID="9170fc18-624b-4358-b931-7e889eee7317" containerName="registry-server" containerID="cri-o://76e2b55fa1a2f93fe2fd7baff904c209ec95260b27aadfa4157e969ff570bd0c" gracePeriod=30 Dec 02 09:28:01 crc kubenswrapper[4781]: I1202 09:28:01.917423 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-x4lq4"] Dec 02 09:28:01 crc kubenswrapper[4781]: I1202 09:28:01.917642 4781 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-marketplace/marketplace-operator-79b997595-x4lq4" podUID="cef8c4dd-6e0f-44c9-8926-72b0d821f823" containerName="marketplace-operator" containerID="cri-o://4dfee10dbde5c137a7daf25b50dba3c4683df2fbb5072195fc1215648984567e" gracePeriod=30 Dec 02 09:28:01 crc kubenswrapper[4781]: I1202 09:28:01.920175 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8pgk8"] Dec 02 09:28:01 crc kubenswrapper[4781]: I1202 09:28:01.920568 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8pgk8" podUID="c4734fd7-42d6-4b87-9160-5a3471f91d03" containerName="registry-server" containerID="cri-o://1822aaa0b48e249d15a786a4f2d85c035ab1f28c209a58ddf4da57e8438e4e56" gracePeriod=30 Dec 02 09:28:01 crc kubenswrapper[4781]: I1202 09:28:01.927001 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lx2rp"] Dec 02 09:28:01 crc kubenswrapper[4781]: I1202 09:28:01.935386 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-lx2rp" Dec 02 09:28:01 crc kubenswrapper[4781]: I1202 09:28:01.950973 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9bkjh"] Dec 02 09:28:01 crc kubenswrapper[4781]: I1202 09:28:01.951284 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9bkjh" podUID="2bbd8e20-1afd-4e4a-a689-e77ae2042bfb" containerName="registry-server" containerID="cri-o://03cc7cb7bff43f5e8d2a752a3be22d227008d5b26e1f58a433b6922572fcfe30" gracePeriod=30 Dec 02 09:28:01 crc kubenswrapper[4781]: I1202 09:28:01.955753 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lx2rp"] Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.043099 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" event={"ID":"e10258da-dad3-4df8-82c2-9d9438493a3d","Type":"ContainerStarted","Data":"b4ee060e26e3e3e4831e7dd0ea5a0ab99f2012d5edf1e6348cbe970647009c50"} Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.046599 4781 generic.go:334] "Generic (PLEG): container finished" podID="b28a19d5-e49e-46f0-942d-dc9f96777c2d" containerID="88a12b393ba226860379c973a8763acbf05e3bb790f5d35eec7ad9ddd42b07d9" exitCode=0 Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.046646 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-844r4" event={"ID":"b28a19d5-e49e-46f0-942d-dc9f96777c2d","Type":"ContainerDied","Data":"88a12b393ba226860379c973a8763acbf05e3bb790f5d35eec7ad9ddd42b07d9"} Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.062582 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/060db2b9-0086-4429-8ade-2156f94455f4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-lx2rp\" (UID: \"060db2b9-0086-4429-8ade-2156f94455f4\") " pod="openshift-marketplace/marketplace-operator-79b997595-lx2rp" Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.062650 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkzjk\" (UniqueName: 
\"kubernetes.io/projected/060db2b9-0086-4429-8ade-2156f94455f4-kube-api-access-fkzjk\") pod \"marketplace-operator-79b997595-lx2rp\" (UID: \"060db2b9-0086-4429-8ade-2156f94455f4\") " pod="openshift-marketplace/marketplace-operator-79b997595-lx2rp" Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.062672 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/060db2b9-0086-4429-8ade-2156f94455f4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-lx2rp\" (UID: \"060db2b9-0086-4429-8ade-2156f94455f4\") " pod="openshift-marketplace/marketplace-operator-79b997595-lx2rp" Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.163611 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/060db2b9-0086-4429-8ade-2156f94455f4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-lx2rp\" (UID: \"060db2b9-0086-4429-8ade-2156f94455f4\") " pod="openshift-marketplace/marketplace-operator-79b997595-lx2rp" Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.163704 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkzjk\" (UniqueName: \"kubernetes.io/projected/060db2b9-0086-4429-8ade-2156f94455f4-kube-api-access-fkzjk\") pod \"marketplace-operator-79b997595-lx2rp\" (UID: \"060db2b9-0086-4429-8ade-2156f94455f4\") " pod="openshift-marketplace/marketplace-operator-79b997595-lx2rp" Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.163738 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/060db2b9-0086-4429-8ade-2156f94455f4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-lx2rp\" (UID: \"060db2b9-0086-4429-8ade-2156f94455f4\") " pod="openshift-marketplace/marketplace-operator-79b997595-lx2rp" Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.165866 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/060db2b9-0086-4429-8ade-2156f94455f4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-lx2rp\" (UID: \"060db2b9-0086-4429-8ade-2156f94455f4\") " pod="openshift-marketplace/marketplace-operator-79b997595-lx2rp" Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.173669 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/060db2b9-0086-4429-8ade-2156f94455f4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-lx2rp\" (UID: \"060db2b9-0086-4429-8ade-2156f94455f4\") " pod="openshift-marketplace/marketplace-operator-79b997595-lx2rp" Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.185357 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkzjk\" (UniqueName: \"kubernetes.io/projected/060db2b9-0086-4429-8ade-2156f94455f4-kube-api-access-fkzjk\") pod \"marketplace-operator-79b997595-lx2rp\" (UID: \"060db2b9-0086-4429-8ade-2156f94455f4\") " pod="openshift-marketplace/marketplace-operator-79b997595-lx2rp" Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.274900 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-lx2rp" Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.433022 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-x4lq4" Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.524975 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9bkjh" Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.531151 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8pgk8" Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.565061 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ntjsd" Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.569449 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cef8c4dd-6e0f-44c9-8926-72b0d821f823-marketplace-operator-metrics\") pod \"cef8c4dd-6e0f-44c9-8926-72b0d821f823\" (UID: \"cef8c4dd-6e0f-44c9-8926-72b0d821f823\") " Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.569533 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cef8c4dd-6e0f-44c9-8926-72b0d821f823-marketplace-trusted-ca\") pod \"cef8c4dd-6e0f-44c9-8926-72b0d821f823\" (UID: \"cef8c4dd-6e0f-44c9-8926-72b0d821f823\") " Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.569583 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22pgh\" (UniqueName: \"kubernetes.io/projected/cef8c4dd-6e0f-44c9-8926-72b0d821f823-kube-api-access-22pgh\") pod \"cef8c4dd-6e0f-44c9-8926-72b0d821f823\" (UID: \"cef8c4dd-6e0f-44c9-8926-72b0d821f823\") " Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.572212 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cef8c4dd-6e0f-44c9-8926-72b0d821f823-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "cef8c4dd-6e0f-44c9-8926-72b0d821f823" (UID: "cef8c4dd-6e0f-44c9-8926-72b0d821f823"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.574398 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cef8c4dd-6e0f-44c9-8926-72b0d821f823-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "cef8c4dd-6e0f-44c9-8926-72b0d821f823" (UID: "cef8c4dd-6e0f-44c9-8926-72b0d821f823"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.575453 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cef8c4dd-6e0f-44c9-8926-72b0d821f823-kube-api-access-22pgh" (OuterVolumeSpecName: "kube-api-access-22pgh") pod "cef8c4dd-6e0f-44c9-8926-72b0d821f823" (UID: "cef8c4dd-6e0f-44c9-8926-72b0d821f823"). InnerVolumeSpecName "kube-api-access-22pgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.670858 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bbd8e20-1afd-4e4a-a689-e77ae2042bfb-catalog-content\") pod \"2bbd8e20-1afd-4e4a-a689-e77ae2042bfb\" (UID: \"2bbd8e20-1afd-4e4a-a689-e77ae2042bfb\") " Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.670973 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4734fd7-42d6-4b87-9160-5a3471f91d03-catalog-content\") pod \"c4734fd7-42d6-4b87-9160-5a3471f91d03\" (UID: \"c4734fd7-42d6-4b87-9160-5a3471f91d03\") " Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.671019 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpxf4\" (UniqueName: \"kubernetes.io/projected/c4734fd7-42d6-4b87-9160-5a3471f91d03-kube-api-access-cpxf4\") pod \"c4734fd7-42d6-4b87-9160-5a3471f91d03\" (UID: \"c4734fd7-42d6-4b87-9160-5a3471f91d03\") " Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.671042 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nz7qg\" (UniqueName: \"kubernetes.io/projected/9170fc18-624b-4358-b931-7e889eee7317-kube-api-access-nz7qg\") pod \"9170fc18-624b-4358-b931-7e889eee7317\" (UID: \"9170fc18-624b-4358-b931-7e889eee7317\") " Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.671121 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7wbd\" (UniqueName: \"kubernetes.io/projected/2bbd8e20-1afd-4e4a-a689-e77ae2042bfb-kube-api-access-n7wbd\") pod \"2bbd8e20-1afd-4e4a-a689-e77ae2042bfb\" (UID: \"2bbd8e20-1afd-4e4a-a689-e77ae2042bfb\") " Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.671150 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9170fc18-624b-4358-b931-7e889eee7317-utilities\") pod \"9170fc18-624b-4358-b931-7e889eee7317\" (UID: \"9170fc18-624b-4358-b931-7e889eee7317\") " Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.671185 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bbd8e20-1afd-4e4a-a689-e77ae2042bfb-utilities\") pod \"2bbd8e20-1afd-4e4a-a689-e77ae2042bfb\" (UID: \"2bbd8e20-1afd-4e4a-a689-e77ae2042bfb\") " Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.671218 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9170fc18-624b-4358-b931-7e889eee7317-catalog-content\") pod \"9170fc18-624b-4358-b931-7e889eee7317\" (UID: \"9170fc18-624b-4358-b931-7e889eee7317\") " Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.671234 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4734fd7-42d6-4b87-9160-5a3471f91d03-utilities\") pod \"c4734fd7-42d6-4b87-9160-5a3471f91d03\" (UID: \"c4734fd7-42d6-4b87-9160-5a3471f91d03\") " Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.671519 4781 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cef8c4dd-6e0f-44c9-8926-72b0d821f823-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" 
Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.671540 4781 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cef8c4dd-6e0f-44c9-8926-72b0d821f823-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.671549 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22pgh\" (UniqueName: \"kubernetes.io/projected/cef8c4dd-6e0f-44c9-8926-72b0d821f823-kube-api-access-22pgh\") on node \"crc\" DevicePath \"\"" Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.672168 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4734fd7-42d6-4b87-9160-5a3471f91d03-utilities" (OuterVolumeSpecName: "utilities") pod "c4734fd7-42d6-4b87-9160-5a3471f91d03" (UID: "c4734fd7-42d6-4b87-9160-5a3471f91d03"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.675853 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9170fc18-624b-4358-b931-7e889eee7317-utilities" (OuterVolumeSpecName: "utilities") pod "9170fc18-624b-4358-b931-7e889eee7317" (UID: "9170fc18-624b-4358-b931-7e889eee7317"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.676171 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bbd8e20-1afd-4e4a-a689-e77ae2042bfb-utilities" (OuterVolumeSpecName: "utilities") pod "2bbd8e20-1afd-4e4a-a689-e77ae2042bfb" (UID: "2bbd8e20-1afd-4e4a-a689-e77ae2042bfb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.678240 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4734fd7-42d6-4b87-9160-5a3471f91d03-kube-api-access-cpxf4" (OuterVolumeSpecName: "kube-api-access-cpxf4") pod "c4734fd7-42d6-4b87-9160-5a3471f91d03" (UID: "c4734fd7-42d6-4b87-9160-5a3471f91d03"). InnerVolumeSpecName "kube-api-access-cpxf4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.687536 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9170fc18-624b-4358-b931-7e889eee7317-kube-api-access-nz7qg" (OuterVolumeSpecName: "kube-api-access-nz7qg") pod "9170fc18-624b-4358-b931-7e889eee7317" (UID: "9170fc18-624b-4358-b931-7e889eee7317"). InnerVolumeSpecName "kube-api-access-nz7qg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.694822 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4734fd7-42d6-4b87-9160-5a3471f91d03-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c4734fd7-42d6-4b87-9160-5a3471f91d03" (UID: "c4734fd7-42d6-4b87-9160-5a3471f91d03"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.695465 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bbd8e20-1afd-4e4a-a689-e77ae2042bfb-kube-api-access-n7wbd" (OuterVolumeSpecName: "kube-api-access-n7wbd") pod "2bbd8e20-1afd-4e4a-a689-e77ae2042bfb" (UID: "2bbd8e20-1afd-4e4a-a689-e77ae2042bfb"). InnerVolumeSpecName "kube-api-access-n7wbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.734292 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lx2rp"] Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.738532 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9170fc18-624b-4358-b931-7e889eee7317-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9170fc18-624b-4358-b931-7e889eee7317" (UID: "9170fc18-624b-4358-b931-7e889eee7317"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.773018 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7wbd\" (UniqueName: \"kubernetes.io/projected/2bbd8e20-1afd-4e4a-a689-e77ae2042bfb-kube-api-access-n7wbd\") on node \"crc\" DevicePath \"\"" Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.773054 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9170fc18-624b-4358-b931-7e889eee7317-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.773096 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bbd8e20-1afd-4e4a-a689-e77ae2042bfb-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.773107 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9170fc18-624b-4358-b931-7e889eee7317-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.773118 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4734fd7-42d6-4b87-9160-5a3471f91d03-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.773131 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4734fd7-42d6-4b87-9160-5a3471f91d03-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.773141 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpxf4\" (UniqueName: \"kubernetes.io/projected/c4734fd7-42d6-4b87-9160-5a3471f91d03-kube-api-access-cpxf4\") on node \"crc\" DevicePath \"\"" Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.773154 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nz7qg\" (UniqueName: \"kubernetes.io/projected/9170fc18-624b-4358-b931-7e889eee7317-kube-api-access-nz7qg\") on node \"crc\" DevicePath \"\"" Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.778271 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bbd8e20-1afd-4e4a-a689-e77ae2042bfb-catalog-content" (OuterVolumeSpecName: 
"catalog-content") pod "2bbd8e20-1afd-4e4a-a689-e77ae2042bfb" (UID: "2bbd8e20-1afd-4e4a-a689-e77ae2042bfb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:28:02 crc kubenswrapper[4781]: I1202 09:28:02.874267 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bbd8e20-1afd-4e4a-a689-e77ae2042bfb-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.053238 4781 generic.go:334] "Generic (PLEG): container finished" podID="9170fc18-624b-4358-b931-7e889eee7317" containerID="76e2b55fa1a2f93fe2fd7baff904c209ec95260b27aadfa4157e969ff570bd0c" exitCode=0 Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.053296 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ntjsd" Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.053286 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ntjsd" event={"ID":"9170fc18-624b-4358-b931-7e889eee7317","Type":"ContainerDied","Data":"76e2b55fa1a2f93fe2fd7baff904c209ec95260b27aadfa4157e969ff570bd0c"} Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.053774 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ntjsd" event={"ID":"9170fc18-624b-4358-b931-7e889eee7317","Type":"ContainerDied","Data":"bf04ace9f66f9598a823a6b8ba25e7d4fe21d2190232688a005a8eafbe4613b7"} Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.053810 4781 scope.go:117] "RemoveContainer" containerID="76e2b55fa1a2f93fe2fd7baff904c209ec95260b27aadfa4157e969ff570bd0c" Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.055504 4781 generic.go:334] "Generic (PLEG): container finished" podID="cef8c4dd-6e0f-44c9-8926-72b0d821f823" containerID="4dfee10dbde5c137a7daf25b50dba3c4683df2fbb5072195fc1215648984567e" exitCode=0 Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.055620 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-x4lq4" Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.055628 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-x4lq4" event={"ID":"cef8c4dd-6e0f-44c9-8926-72b0d821f823","Type":"ContainerDied","Data":"4dfee10dbde5c137a7daf25b50dba3c4683df2fbb5072195fc1215648984567e"} Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.055854 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-x4lq4" event={"ID":"cef8c4dd-6e0f-44c9-8926-72b0d821f823","Type":"ContainerDied","Data":"6f1dc40bcf8b212a603ef52c7b16bbcd516a6d70c3c596eda85949d8d4401539"} Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.059256 4781 generic.go:334] "Generic (PLEG): container finished" podID="c4734fd7-42d6-4b87-9160-5a3471f91d03" containerID="1822aaa0b48e249d15a786a4f2d85c035ab1f28c209a58ddf4da57e8438e4e56" exitCode=0 Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.059332 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8pgk8" Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.059333 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8pgk8" event={"ID":"c4734fd7-42d6-4b87-9160-5a3471f91d03","Type":"ContainerDied","Data":"1822aaa0b48e249d15a786a4f2d85c035ab1f28c209a58ddf4da57e8438e4e56"} Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.059482 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8pgk8" event={"ID":"c4734fd7-42d6-4b87-9160-5a3471f91d03","Type":"ContainerDied","Data":"ce89cef24bcfcae7d6cfdd8e92e1aaff06cda7b76730583203906365f0fff65f"} Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.060960 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-lx2rp" event={"ID":"060db2b9-0086-4429-8ade-2156f94455f4","Type":"ContainerStarted","Data":"7c4211d5e9373ffe8fc861dd8a7547190bab9bcf35918b5d16acbc0c46793de6"} Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.064039 4781 generic.go:334] "Generic (PLEG): container finished" podID="2bbd8e20-1afd-4e4a-a689-e77ae2042bfb" containerID="03cc7cb7bff43f5e8d2a752a3be22d227008d5b26e1f58a433b6922572fcfe30" exitCode=0 Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.064376 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9bkjh" event={"ID":"2bbd8e20-1afd-4e4a-a689-e77ae2042bfb","Type":"ContainerDied","Data":"03cc7cb7bff43f5e8d2a752a3be22d227008d5b26e1f58a433b6922572fcfe30"} Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.064418 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9bkjh" event={"ID":"2bbd8e20-1afd-4e4a-a689-e77ae2042bfb","Type":"ContainerDied","Data":"1f9d5157e08eb9b77ba0efda8e2238ec5fa609460eca9cb4d943ec1fcfebad58"} Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.064564 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9bkjh" Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.079533 4781 scope.go:117] "RemoveContainer" containerID="52456d0680b8c19b41afd79907b2e6286093b7509fa7f000ee9a6c3cd09b8480" Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.094692 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ntjsd"] Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.100848 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ntjsd"] Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.107063 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9bkjh"] Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.110815 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9bkjh"] Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.121359 4781 scope.go:117] "RemoveContainer" containerID="4a80fef1afee3a1e76fb280ef9fc157055dec329cd6c0e90013685465a99f0e2" Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.127545 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-x4lq4"] Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.132576 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-x4lq4"] Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.137484 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8pgk8"] Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.139049 4781 scope.go:117] "RemoveContainer" containerID="76e2b55fa1a2f93fe2fd7baff904c209ec95260b27aadfa4157e969ff570bd0c" Dec 02 09:28:03 crc kubenswrapper[4781]: E1202 09:28:03.139526 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76e2b55fa1a2f93fe2fd7baff904c209ec95260b27aadfa4157e969ff570bd0c\": container with ID starting with 76e2b55fa1a2f93fe2fd7baff904c209ec95260b27aadfa4157e969ff570bd0c not found: ID does not exist" containerID="76e2b55fa1a2f93fe2fd7baff904c209ec95260b27aadfa4157e969ff570bd0c" Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.139554 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76e2b55fa1a2f93fe2fd7baff904c209ec95260b27aadfa4157e969ff570bd0c"} err="failed to get container status \"76e2b55fa1a2f93fe2fd7baff904c209ec95260b27aadfa4157e969ff570bd0c\": rpc error: code = NotFound desc = could not find container \"76e2b55fa1a2f93fe2fd7baff904c209ec95260b27aadfa4157e969ff570bd0c\": container with ID starting with 76e2b55fa1a2f93fe2fd7baff904c209ec95260b27aadfa4157e969ff570bd0c not found: ID does not exist" Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.139574 4781 scope.go:117] "RemoveContainer" containerID="52456d0680b8c19b41afd79907b2e6286093b7509fa7f000ee9a6c3cd09b8480" Dec 02 09:28:03 crc kubenswrapper[4781]: E1202 09:28:03.140002 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52456d0680b8c19b41afd79907b2e6286093b7509fa7f000ee9a6c3cd09b8480\": container with ID starting with 52456d0680b8c19b41afd79907b2e6286093b7509fa7f000ee9a6c3cd09b8480 not found: ID does not exist" containerID="52456d0680b8c19b41afd79907b2e6286093b7509fa7f000ee9a6c3cd09b8480" Dec 02 
09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.140026 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52456d0680b8c19b41afd79907b2e6286093b7509fa7f000ee9a6c3cd09b8480"} err="failed to get container status \"52456d0680b8c19b41afd79907b2e6286093b7509fa7f000ee9a6c3cd09b8480\": rpc error: code = NotFound desc = could not find container \"52456d0680b8c19b41afd79907b2e6286093b7509fa7f000ee9a6c3cd09b8480\": container with ID starting with 52456d0680b8c19b41afd79907b2e6286093b7509fa7f000ee9a6c3cd09b8480 not found: ID does not exist" Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.140038 4781 scope.go:117] "RemoveContainer" containerID="4a80fef1afee3a1e76fb280ef9fc157055dec329cd6c0e90013685465a99f0e2" Dec 02 09:28:03 crc kubenswrapper[4781]: E1202 09:28:03.140484 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a80fef1afee3a1e76fb280ef9fc157055dec329cd6c0e90013685465a99f0e2\": container with ID starting with 4a80fef1afee3a1e76fb280ef9fc157055dec329cd6c0e90013685465a99f0e2 not found: ID does not exist" containerID="4a80fef1afee3a1e76fb280ef9fc157055dec329cd6c0e90013685465a99f0e2" Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.140522 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a80fef1afee3a1e76fb280ef9fc157055dec329cd6c0e90013685465a99f0e2"} err="failed to get container status \"4a80fef1afee3a1e76fb280ef9fc157055dec329cd6c0e90013685465a99f0e2\": rpc error: code = NotFound desc = could not find container \"4a80fef1afee3a1e76fb280ef9fc157055dec329cd6c0e90013685465a99f0e2\": container with ID starting with 4a80fef1afee3a1e76fb280ef9fc157055dec329cd6c0e90013685465a99f0e2 not found: ID does not exist" Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.140557 4781 scope.go:117] "RemoveContainer" containerID="4dfee10dbde5c137a7daf25b50dba3c4683df2fbb5072195fc1215648984567e" Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.141817 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8pgk8"] Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.163879 4781 scope.go:117] "RemoveContainer" containerID="4dfee10dbde5c137a7daf25b50dba3c4683df2fbb5072195fc1215648984567e" Dec 02 09:28:03 crc kubenswrapper[4781]: E1202 09:28:03.164429 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dfee10dbde5c137a7daf25b50dba3c4683df2fbb5072195fc1215648984567e\": container with ID starting with 4dfee10dbde5c137a7daf25b50dba3c4683df2fbb5072195fc1215648984567e not found: ID does not exist" containerID="4dfee10dbde5c137a7daf25b50dba3c4683df2fbb5072195fc1215648984567e" Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.164468 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dfee10dbde5c137a7daf25b50dba3c4683df2fbb5072195fc1215648984567e"} err="failed to get container status \"4dfee10dbde5c137a7daf25b50dba3c4683df2fbb5072195fc1215648984567e\": rpc error: code = NotFound desc = could not find container \"4dfee10dbde5c137a7daf25b50dba3c4683df2fbb5072195fc1215648984567e\": container with ID starting with 4dfee10dbde5c137a7daf25b50dba3c4683df2fbb5072195fc1215648984567e not found: ID does not exist" Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.164495 4781 scope.go:117] "RemoveContainer" 
containerID="1822aaa0b48e249d15a786a4f2d85c035ab1f28c209a58ddf4da57e8438e4e56" Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.177423 4781 scope.go:117] "RemoveContainer" containerID="ec9535118e1e08b778718f3770ed8a9a99709a23f3d4ed856e36a491596ff221" Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.202159 4781 scope.go:117] "RemoveContainer" containerID="f6f9a042e9592cdf08cd9d8005e2ff9e5ebfad0b53ecab4fb805644810bde95b" Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.226966 4781 scope.go:117] "RemoveContainer" containerID="1822aaa0b48e249d15a786a4f2d85c035ab1f28c209a58ddf4da57e8438e4e56" Dec 02 09:28:03 crc kubenswrapper[4781]: E1202 09:28:03.229325 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1822aaa0b48e249d15a786a4f2d85c035ab1f28c209a58ddf4da57e8438e4e56\": container with ID starting with 1822aaa0b48e249d15a786a4f2d85c035ab1f28c209a58ddf4da57e8438e4e56 not found: ID does not exist" containerID="1822aaa0b48e249d15a786a4f2d85c035ab1f28c209a58ddf4da57e8438e4e56" Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.229372 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1822aaa0b48e249d15a786a4f2d85c035ab1f28c209a58ddf4da57e8438e4e56"} err="failed to get container status \"1822aaa0b48e249d15a786a4f2d85c035ab1f28c209a58ddf4da57e8438e4e56\": rpc error: code = NotFound desc = could not find container \"1822aaa0b48e249d15a786a4f2d85c035ab1f28c209a58ddf4da57e8438e4e56\": container with ID starting with 1822aaa0b48e249d15a786a4f2d85c035ab1f28c209a58ddf4da57e8438e4e56 not found: ID does not exist" Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.229404 4781 scope.go:117] "RemoveContainer" containerID="ec9535118e1e08b778718f3770ed8a9a99709a23f3d4ed856e36a491596ff221" Dec 02 09:28:03 crc kubenswrapper[4781]: E1202 09:28:03.229781 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec9535118e1e08b778718f3770ed8a9a99709a23f3d4ed856e36a491596ff221\": container with ID starting with ec9535118e1e08b778718f3770ed8a9a99709a23f3d4ed856e36a491596ff221 not found: ID does not exist" containerID="ec9535118e1e08b778718f3770ed8a9a99709a23f3d4ed856e36a491596ff221" Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.229810 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec9535118e1e08b778718f3770ed8a9a99709a23f3d4ed856e36a491596ff221"} err="failed to get container status \"ec9535118e1e08b778718f3770ed8a9a99709a23f3d4ed856e36a491596ff221\": rpc error: code = NotFound desc = could not find container \"ec9535118e1e08b778718f3770ed8a9a99709a23f3d4ed856e36a491596ff221\": container with ID starting with ec9535118e1e08b778718f3770ed8a9a99709a23f3d4ed856e36a491596ff221 not found: ID does not exist" Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.229829 4781 scope.go:117] "RemoveContainer" containerID="f6f9a042e9592cdf08cd9d8005e2ff9e5ebfad0b53ecab4fb805644810bde95b" Dec 02 09:28:03 crc kubenswrapper[4781]: E1202 09:28:03.230070 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6f9a042e9592cdf08cd9d8005e2ff9e5ebfad0b53ecab4fb805644810bde95b\": container with ID starting with f6f9a042e9592cdf08cd9d8005e2ff9e5ebfad0b53ecab4fb805644810bde95b not found: ID does not exist" containerID="f6f9a042e9592cdf08cd9d8005e2ff9e5ebfad0b53ecab4fb805644810bde95b" 
Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.230100 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6f9a042e9592cdf08cd9d8005e2ff9e5ebfad0b53ecab4fb805644810bde95b"} err="failed to get container status \"f6f9a042e9592cdf08cd9d8005e2ff9e5ebfad0b53ecab4fb805644810bde95b\": rpc error: code = NotFound desc = could not find container \"f6f9a042e9592cdf08cd9d8005e2ff9e5ebfad0b53ecab4fb805644810bde95b\": container with ID starting with f6f9a042e9592cdf08cd9d8005e2ff9e5ebfad0b53ecab4fb805644810bde95b not found: ID does not exist" Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.230117 4781 scope.go:117] "RemoveContainer" containerID="03cc7cb7bff43f5e8d2a752a3be22d227008d5b26e1f58a433b6922572fcfe30" Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.241671 4781 scope.go:117] "RemoveContainer" containerID="33003d9ac151d48b316de8bbaaf0bafeca1564b76155bb50d3cf06adb72ccdcf" Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.257560 4781 scope.go:117] "RemoveContainer" containerID="804feecb036d22d9ae44fac571d5a85343565569ffd73c8a73effa532bf707f6" Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.268005 4781 scope.go:117] "RemoveContainer" containerID="03cc7cb7bff43f5e8d2a752a3be22d227008d5b26e1f58a433b6922572fcfe30" Dec 02 09:28:03 crc kubenswrapper[4781]: E1202 09:28:03.268441 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03cc7cb7bff43f5e8d2a752a3be22d227008d5b26e1f58a433b6922572fcfe30\": container with ID starting with 03cc7cb7bff43f5e8d2a752a3be22d227008d5b26e1f58a433b6922572fcfe30 not found: ID does not exist" containerID="03cc7cb7bff43f5e8d2a752a3be22d227008d5b26e1f58a433b6922572fcfe30" Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.268553 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03cc7cb7bff43f5e8d2a752a3be22d227008d5b26e1f58a433b6922572fcfe30"} err="failed to get container status \"03cc7cb7bff43f5e8d2a752a3be22d227008d5b26e1f58a433b6922572fcfe30\": rpc error: code = NotFound desc = could not find container \"03cc7cb7bff43f5e8d2a752a3be22d227008d5b26e1f58a433b6922572fcfe30\": container with ID starting with 03cc7cb7bff43f5e8d2a752a3be22d227008d5b26e1f58a433b6922572fcfe30 not found: ID does not exist" Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.268636 4781 scope.go:117] "RemoveContainer" containerID="33003d9ac151d48b316de8bbaaf0bafeca1564b76155bb50d3cf06adb72ccdcf" Dec 02 09:28:03 crc kubenswrapper[4781]: E1202 09:28:03.269000 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33003d9ac151d48b316de8bbaaf0bafeca1564b76155bb50d3cf06adb72ccdcf\": container with ID starting with 33003d9ac151d48b316de8bbaaf0bafeca1564b76155bb50d3cf06adb72ccdcf not found: ID does not exist" containerID="33003d9ac151d48b316de8bbaaf0bafeca1564b76155bb50d3cf06adb72ccdcf" Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.269029 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33003d9ac151d48b316de8bbaaf0bafeca1564b76155bb50d3cf06adb72ccdcf"} err="failed to get container status \"33003d9ac151d48b316de8bbaaf0bafeca1564b76155bb50d3cf06adb72ccdcf\": rpc error: code = NotFound desc = could not find container \"33003d9ac151d48b316de8bbaaf0bafeca1564b76155bb50d3cf06adb72ccdcf\": container with ID starting with 
33003d9ac151d48b316de8bbaaf0bafeca1564b76155bb50d3cf06adb72ccdcf not found: ID does not exist" Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.269048 4781 scope.go:117] "RemoveContainer" containerID="804feecb036d22d9ae44fac571d5a85343565569ffd73c8a73effa532bf707f6" Dec 02 09:28:03 crc kubenswrapper[4781]: E1202 09:28:03.269716 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"804feecb036d22d9ae44fac571d5a85343565569ffd73c8a73effa532bf707f6\": container with ID starting with 804feecb036d22d9ae44fac571d5a85343565569ffd73c8a73effa532bf707f6 not found: ID does not exist" containerID="804feecb036d22d9ae44fac571d5a85343565569ffd73c8a73effa532bf707f6" Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.269765 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"804feecb036d22d9ae44fac571d5a85343565569ffd73c8a73effa532bf707f6"} err="failed to get container status \"804feecb036d22d9ae44fac571d5a85343565569ffd73c8a73effa532bf707f6\": rpc error: code = NotFound desc = could not find container \"804feecb036d22d9ae44fac571d5a85343565569ffd73c8a73effa532bf707f6\": container with ID starting with 804feecb036d22d9ae44fac571d5a85343565569ffd73c8a73effa532bf707f6 not found: ID does not exist" Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.506420 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bbd8e20-1afd-4e4a-a689-e77ae2042bfb" path="/var/lib/kubelet/pods/2bbd8e20-1afd-4e4a-a689-e77ae2042bfb/volumes" Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.507500 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9170fc18-624b-4358-b931-7e889eee7317" path="/var/lib/kubelet/pods/9170fc18-624b-4358-b931-7e889eee7317/volumes" Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.508260 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4734fd7-42d6-4b87-9160-5a3471f91d03" path="/var/lib/kubelet/pods/c4734fd7-42d6-4b87-9160-5a3471f91d03/volumes" Dec 02 09:28:03 crc kubenswrapper[4781]: I1202 09:28:03.509594 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cef8c4dd-6e0f-44c9-8926-72b0d821f823" path="/var/lib/kubelet/pods/cef8c4dd-6e0f-44c9-8926-72b0d821f823/volumes" Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.114909 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6s4mn"] Dec 02 09:28:04 crc kubenswrapper[4781]: E1202 09:28:04.115188 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bbd8e20-1afd-4e4a-a689-e77ae2042bfb" containerName="extract-utilities" Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.115205 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bbd8e20-1afd-4e4a-a689-e77ae2042bfb" containerName="extract-utilities" Dec 02 09:28:04 crc kubenswrapper[4781]: E1202 09:28:04.115216 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9170fc18-624b-4358-b931-7e889eee7317" containerName="registry-server" Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.115224 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="9170fc18-624b-4358-b931-7e889eee7317" containerName="registry-server" Dec 02 09:28:04 crc kubenswrapper[4781]: E1202 09:28:04.115235 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bbd8e20-1afd-4e4a-a689-e77ae2042bfb" containerName="registry-server" Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 
09:28:04.115243 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bbd8e20-1afd-4e4a-a689-e77ae2042bfb" containerName="registry-server" Dec 02 09:28:04 crc kubenswrapper[4781]: E1202 09:28:04.115252 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9170fc18-624b-4358-b931-7e889eee7317" containerName="extract-utilities" Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.115259 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="9170fc18-624b-4358-b931-7e889eee7317" containerName="extract-utilities" Dec 02 09:28:04 crc kubenswrapper[4781]: E1202 09:28:04.115268 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cef8c4dd-6e0f-44c9-8926-72b0d821f823" containerName="marketplace-operator" Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.115275 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="cef8c4dd-6e0f-44c9-8926-72b0d821f823" containerName="marketplace-operator" Dec 02 09:28:04 crc kubenswrapper[4781]: E1202 09:28:04.115284 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4734fd7-42d6-4b87-9160-5a3471f91d03" containerName="extract-utilities" Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.115291 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4734fd7-42d6-4b87-9160-5a3471f91d03" containerName="extract-utilities" Dec 02 09:28:04 crc kubenswrapper[4781]: E1202 09:28:04.115299 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bbd8e20-1afd-4e4a-a689-e77ae2042bfb" containerName="extract-content" Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.115307 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bbd8e20-1afd-4e4a-a689-e77ae2042bfb" containerName="extract-content" Dec 02 09:28:04 crc kubenswrapper[4781]: E1202 09:28:04.115319 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9170fc18-624b-4358-b931-7e889eee7317" containerName="extract-content" Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.115327 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="9170fc18-624b-4358-b931-7e889eee7317" containerName="extract-content" Dec 02 09:28:04 crc kubenswrapper[4781]: E1202 09:28:04.115339 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4734fd7-42d6-4b87-9160-5a3471f91d03" containerName="extract-content" Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.115347 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4734fd7-42d6-4b87-9160-5a3471f91d03" containerName="extract-content" Dec 02 09:28:04 crc kubenswrapper[4781]: E1202 09:28:04.115357 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4734fd7-42d6-4b87-9160-5a3471f91d03" containerName="registry-server" Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.115366 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4734fd7-42d6-4b87-9160-5a3471f91d03" containerName="registry-server" Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.115476 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="cef8c4dd-6e0f-44c9-8926-72b0d821f823" containerName="marketplace-operator" Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.115496 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4734fd7-42d6-4b87-9160-5a3471f91d03" containerName="registry-server" Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.115513 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bbd8e20-1afd-4e4a-a689-e77ae2042bfb" 
containerName="registry-server" Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.115520 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="9170fc18-624b-4358-b931-7e889eee7317" containerName="registry-server" Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.116451 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6s4mn" Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.119070 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.130105 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6s4mn"] Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.189680 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1996ded9-7aca-4ca6-b787-1c688c678893-catalog-content\") pod \"redhat-marketplace-6s4mn\" (UID: \"1996ded9-7aca-4ca6-b787-1c688c678893\") " pod="openshift-marketplace/redhat-marketplace-6s4mn" Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.189765 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1996ded9-7aca-4ca6-b787-1c688c678893-utilities\") pod \"redhat-marketplace-6s4mn\" (UID: \"1996ded9-7aca-4ca6-b787-1c688c678893\") " pod="openshift-marketplace/redhat-marketplace-6s4mn" Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.189837 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96b97\" (UniqueName: \"kubernetes.io/projected/1996ded9-7aca-4ca6-b787-1c688c678893-kube-api-access-96b97\") pod \"redhat-marketplace-6s4mn\" (UID: \"1996ded9-7aca-4ca6-b787-1c688c678893\") " pod="openshift-marketplace/redhat-marketplace-6s4mn" Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.296739 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1996ded9-7aca-4ca6-b787-1c688c678893-utilities\") pod \"redhat-marketplace-6s4mn\" (UID: \"1996ded9-7aca-4ca6-b787-1c688c678893\") " pod="openshift-marketplace/redhat-marketplace-6s4mn" Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.297433 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1996ded9-7aca-4ca6-b787-1c688c678893-utilities\") pod \"redhat-marketplace-6s4mn\" (UID: \"1996ded9-7aca-4ca6-b787-1c688c678893\") " pod="openshift-marketplace/redhat-marketplace-6s4mn" Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.300599 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96b97\" (UniqueName: \"kubernetes.io/projected/1996ded9-7aca-4ca6-b787-1c688c678893-kube-api-access-96b97\") pod \"redhat-marketplace-6s4mn\" (UID: \"1996ded9-7aca-4ca6-b787-1c688c678893\") " pod="openshift-marketplace/redhat-marketplace-6s4mn" Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.300680 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1996ded9-7aca-4ca6-b787-1c688c678893-catalog-content\") pod \"redhat-marketplace-6s4mn\" (UID: \"1996ded9-7aca-4ca6-b787-1c688c678893\") " 
pod="openshift-marketplace/redhat-marketplace-6s4mn" Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.301109 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1996ded9-7aca-4ca6-b787-1c688c678893-catalog-content\") pod \"redhat-marketplace-6s4mn\" (UID: \"1996ded9-7aca-4ca6-b787-1c688c678893\") " pod="openshift-marketplace/redhat-marketplace-6s4mn" Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.310026 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-w2wcs"] Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.311193 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w2wcs" Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.329790 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.330179 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96b97\" (UniqueName: \"kubernetes.io/projected/1996ded9-7aca-4ca6-b787-1c688c678893-kube-api-access-96b97\") pod \"redhat-marketplace-6s4mn\" (UID: \"1996ded9-7aca-4ca6-b787-1c688c678893\") " pod="openshift-marketplace/redhat-marketplace-6s4mn" Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.333464 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w2wcs"] Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.402462 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d8a5cfa-0f4a-4912-ad38-2ddfa06bbcf4-catalog-content\") pod \"redhat-operators-w2wcs\" (UID: \"5d8a5cfa-0f4a-4912-ad38-2ddfa06bbcf4\") " pod="openshift-marketplace/redhat-operators-w2wcs" Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.402740 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d8a5cfa-0f4a-4912-ad38-2ddfa06bbcf4-utilities\") pod \"redhat-operators-w2wcs\" (UID: \"5d8a5cfa-0f4a-4912-ad38-2ddfa06bbcf4\") " pod="openshift-marketplace/redhat-operators-w2wcs" Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.402768 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ndt9\" (UniqueName: \"kubernetes.io/projected/5d8a5cfa-0f4a-4912-ad38-2ddfa06bbcf4-kube-api-access-7ndt9\") pod \"redhat-operators-w2wcs\" (UID: \"5d8a5cfa-0f4a-4912-ad38-2ddfa06bbcf4\") " pod="openshift-marketplace/redhat-operators-w2wcs" Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.484954 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6s4mn" Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.504518 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d8a5cfa-0f4a-4912-ad38-2ddfa06bbcf4-utilities\") pod \"redhat-operators-w2wcs\" (UID: \"5d8a5cfa-0f4a-4912-ad38-2ddfa06bbcf4\") " pod="openshift-marketplace/redhat-operators-w2wcs" Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.504575 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ndt9\" (UniqueName: \"kubernetes.io/projected/5d8a5cfa-0f4a-4912-ad38-2ddfa06bbcf4-kube-api-access-7ndt9\") pod \"redhat-operators-w2wcs\" (UID: \"5d8a5cfa-0f4a-4912-ad38-2ddfa06bbcf4\") " pod="openshift-marketplace/redhat-operators-w2wcs" Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.504609 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d8a5cfa-0f4a-4912-ad38-2ddfa06bbcf4-catalog-content\") pod \"redhat-operators-w2wcs\" (UID: \"5d8a5cfa-0f4a-4912-ad38-2ddfa06bbcf4\") " pod="openshift-marketplace/redhat-operators-w2wcs" Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.505011 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d8a5cfa-0f4a-4912-ad38-2ddfa06bbcf4-utilities\") pod \"redhat-operators-w2wcs\" (UID: \"5d8a5cfa-0f4a-4912-ad38-2ddfa06bbcf4\") " pod="openshift-marketplace/redhat-operators-w2wcs" Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.507777 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d8a5cfa-0f4a-4912-ad38-2ddfa06bbcf4-catalog-content\") pod \"redhat-operators-w2wcs\" (UID: \"5d8a5cfa-0f4a-4912-ad38-2ddfa06bbcf4\") " pod="openshift-marketplace/redhat-operators-w2wcs" Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.531845 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ndt9\" (UniqueName: \"kubernetes.io/projected/5d8a5cfa-0f4a-4912-ad38-2ddfa06bbcf4-kube-api-access-7ndt9\") pod \"redhat-operators-w2wcs\" (UID: \"5d8a5cfa-0f4a-4912-ad38-2ddfa06bbcf4\") " pod="openshift-marketplace/redhat-operators-w2wcs" Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.570178 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-844r4" Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.605468 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87j5w\" (UniqueName: \"kubernetes.io/projected/b28a19d5-e49e-46f0-942d-dc9f96777c2d-kube-api-access-87j5w\") pod \"b28a19d5-e49e-46f0-942d-dc9f96777c2d\" (UID: \"b28a19d5-e49e-46f0-942d-dc9f96777c2d\") " Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.605533 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b28a19d5-e49e-46f0-942d-dc9f96777c2d-utilities\") pod \"b28a19d5-e49e-46f0-942d-dc9f96777c2d\" (UID: \"b28a19d5-e49e-46f0-942d-dc9f96777c2d\") " Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.605633 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b28a19d5-e49e-46f0-942d-dc9f96777c2d-catalog-content\") pod \"b28a19d5-e49e-46f0-942d-dc9f96777c2d\" (UID: \"b28a19d5-e49e-46f0-942d-dc9f96777c2d\") " Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.612793 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b28a19d5-e49e-46f0-942d-dc9f96777c2d-kube-api-access-87j5w" (OuterVolumeSpecName: "kube-api-access-87j5w") pod "b28a19d5-e49e-46f0-942d-dc9f96777c2d" (UID: "b28a19d5-e49e-46f0-942d-dc9f96777c2d"). InnerVolumeSpecName "kube-api-access-87j5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.613054 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b28a19d5-e49e-46f0-942d-dc9f96777c2d-utilities" (OuterVolumeSpecName: "utilities") pod "b28a19d5-e49e-46f0-942d-dc9f96777c2d" (UID: "b28a19d5-e49e-46f0-942d-dc9f96777c2d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.675068 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w2wcs" Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.690194 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b28a19d5-e49e-46f0-942d-dc9f96777c2d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b28a19d5-e49e-46f0-942d-dc9f96777c2d" (UID: "b28a19d5-e49e-46f0-942d-dc9f96777c2d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.707386 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b28a19d5-e49e-46f0-942d-dc9f96777c2d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.707425 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87j5w\" (UniqueName: \"kubernetes.io/projected/b28a19d5-e49e-46f0-942d-dc9f96777c2d-kube-api-access-87j5w\") on node \"crc\" DevicePath \"\"" Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.707437 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b28a19d5-e49e-46f0-942d-dc9f96777c2d-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 09:28:04 crc kubenswrapper[4781]: I1202 09:28:04.930043 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6s4mn"] Dec 02 09:28:04 crc kubenswrapper[4781]: W1202 09:28:04.941633 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1996ded9_7aca_4ca6_b787_1c688c678893.slice/crio-956f7f2dfba207b17810827a01f663c65123a4e3484c5d4700c67fe9a7a71ca5 WatchSource:0}: Error finding container 956f7f2dfba207b17810827a01f663c65123a4e3484c5d4700c67fe9a7a71ca5: Status 404 returned error can't find the container with id 956f7f2dfba207b17810827a01f663c65123a4e3484c5d4700c67fe9a7a71ca5 Dec 02 09:28:05 crc kubenswrapper[4781]: I1202 09:28:05.086072 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6s4mn" event={"ID":"1996ded9-7aca-4ca6-b787-1c688c678893","Type":"ContainerStarted","Data":"956f7f2dfba207b17810827a01f663c65123a4e3484c5d4700c67fe9a7a71ca5"} Dec 02 09:28:05 crc kubenswrapper[4781]: I1202 09:28:05.087681 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w2wcs"] Dec 02 09:28:05 crc kubenswrapper[4781]: I1202 09:28:05.089957 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-844r4" Dec 02 09:28:05 crc kubenswrapper[4781]: I1202 09:28:05.089950 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-844r4" event={"ID":"b28a19d5-e49e-46f0-942d-dc9f96777c2d","Type":"ContainerDied","Data":"8edb6c67ecc9174be59ae3ab68b4da1d959f1b977722c7bd0d6abbb15e91b6da"} Dec 02 09:28:05 crc kubenswrapper[4781]: I1202 09:28:05.090106 4781 scope.go:117] "RemoveContainer" containerID="88a12b393ba226860379c973a8763acbf05e3bb790f5d35eec7ad9ddd42b07d9" Dec 02 09:28:05 crc kubenswrapper[4781]: I1202 09:28:05.099257 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-lx2rp" event={"ID":"060db2b9-0086-4429-8ade-2156f94455f4","Type":"ContainerStarted","Data":"0cb9d17e3fcdf76e35d96a931c734b94b72ef9ed2fd4fed0597c182e360202d3"} Dec 02 09:28:05 crc kubenswrapper[4781]: I1202 09:28:05.099516 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-lx2rp" Dec 02 09:28:05 crc kubenswrapper[4781]: I1202 09:28:05.103612 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-lx2rp" Dec 02 09:28:05 crc kubenswrapper[4781]: I1202 09:28:05.116481 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-lx2rp" podStartSLOduration=4.116456752 podStartE2EDuration="4.116456752s" podCreationTimestamp="2025-12-02 09:28:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:28:05.11304892 +0000 UTC m=+447.936922789" watchObservedRunningTime="2025-12-02 09:28:05.116456752 +0000 UTC m=+447.940330641" Dec 02 09:28:05 crc kubenswrapper[4781]: I1202 09:28:05.125534 4781 scope.go:117] "RemoveContainer" containerID="9434a0fe428189ccb4720a2f27db8c4863ab8ce8e708e6acaf092375da20f15b" Dec 02 09:28:05 crc kubenswrapper[4781]: I1202 09:28:05.153128 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-844r4"] Dec 02 09:28:05 crc kubenswrapper[4781]: I1202 09:28:05.157004 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-844r4"] Dec 02 09:28:05 crc kubenswrapper[4781]: I1202 09:28:05.168950 4781 scope.go:117] "RemoveContainer" containerID="2a618bd42c0bddbc25e8bd8117042482cdec79b549874b67493a989aaad70345" Dec 02 09:28:05 crc kubenswrapper[4781]: I1202 09:28:05.507026 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b28a19d5-e49e-46f0-942d-dc9f96777c2d" path="/var/lib/kubelet/pods/b28a19d5-e49e-46f0-942d-dc9f96777c2d/volumes" Dec 02 09:28:06 crc kubenswrapper[4781]: I1202 09:28:06.106389 4781 generic.go:334] "Generic (PLEG): container finished" podID="5d8a5cfa-0f4a-4912-ad38-2ddfa06bbcf4" containerID="79defd31eb9026d1a086a475612a22adcfcc7c4a0a918dea49f4f4ed371fa9d2" exitCode=0 Dec 02 09:28:06 crc kubenswrapper[4781]: I1202 09:28:06.106739 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w2wcs" event={"ID":"5d8a5cfa-0f4a-4912-ad38-2ddfa06bbcf4","Type":"ContainerDied","Data":"79defd31eb9026d1a086a475612a22adcfcc7c4a0a918dea49f4f4ed371fa9d2"} Dec 02 09:28:06 crc kubenswrapper[4781]: I1202 09:28:06.106763 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-w2wcs" event={"ID":"5d8a5cfa-0f4a-4912-ad38-2ddfa06bbcf4","Type":"ContainerStarted","Data":"6e5efcd76053fe2ab85adcb70af633270a2776b1faa52612e5cb5a2ae8a03730"} Dec 02 09:28:06 crc kubenswrapper[4781]: I1202 09:28:06.108117 4781 generic.go:334] "Generic (PLEG): container finished" podID="1996ded9-7aca-4ca6-b787-1c688c678893" containerID="87cb51e133c282db24427ad8bdb069f1cd99d09c3f5d120c2b0fd96cf00dc9ae" exitCode=0 Dec 02 09:28:06 crc kubenswrapper[4781]: I1202 09:28:06.108156 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6s4mn" event={"ID":"1996ded9-7aca-4ca6-b787-1c688c678893","Type":"ContainerDied","Data":"87cb51e133c282db24427ad8bdb069f1cd99d09c3f5d120c2b0fd96cf00dc9ae"} Dec 02 09:28:06 crc kubenswrapper[4781]: I1202 09:28:06.511210 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tzs7n"] Dec 02 09:28:06 crc kubenswrapper[4781]: E1202 09:28:06.511466 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b28a19d5-e49e-46f0-942d-dc9f96777c2d" containerName="extract-utilities" Dec 02 09:28:06 crc kubenswrapper[4781]: I1202 09:28:06.511483 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="b28a19d5-e49e-46f0-942d-dc9f96777c2d" containerName="extract-utilities" Dec 02 09:28:06 crc kubenswrapper[4781]: E1202 09:28:06.511504 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b28a19d5-e49e-46f0-942d-dc9f96777c2d" containerName="registry-server" Dec 02 09:28:06 crc kubenswrapper[4781]: I1202 09:28:06.511515 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="b28a19d5-e49e-46f0-942d-dc9f96777c2d" containerName="registry-server" Dec 02 09:28:06 crc kubenswrapper[4781]: E1202 09:28:06.511544 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b28a19d5-e49e-46f0-942d-dc9f96777c2d" containerName="extract-content" Dec 02 09:28:06 crc kubenswrapper[4781]: I1202 09:28:06.511556 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="b28a19d5-e49e-46f0-942d-dc9f96777c2d" containerName="extract-content" Dec 02 09:28:06 crc kubenswrapper[4781]: I1202 09:28:06.511802 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="b28a19d5-e49e-46f0-942d-dc9f96777c2d" containerName="registry-server" Dec 02 09:28:06 crc kubenswrapper[4781]: I1202 09:28:06.520295 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tzs7n"] Dec 02 09:28:06 crc kubenswrapper[4781]: I1202 09:28:06.520407 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tzs7n" Dec 02 09:28:06 crc kubenswrapper[4781]: I1202 09:28:06.523209 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 02 09:28:06 crc kubenswrapper[4781]: I1202 09:28:06.531051 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14c3d9a9-9948-4be9-86a3-9def6cce4450-catalog-content\") pod \"certified-operators-tzs7n\" (UID: \"14c3d9a9-9948-4be9-86a3-9def6cce4450\") " pod="openshift-marketplace/certified-operators-tzs7n" Dec 02 09:28:06 crc kubenswrapper[4781]: I1202 09:28:06.531117 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14c3d9a9-9948-4be9-86a3-9def6cce4450-utilities\") pod \"certified-operators-tzs7n\" (UID: \"14c3d9a9-9948-4be9-86a3-9def6cce4450\") " pod="openshift-marketplace/certified-operators-tzs7n" Dec 02 09:28:06 crc kubenswrapper[4781]: I1202 09:28:06.531199 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wktj4\" (UniqueName: \"kubernetes.io/projected/14c3d9a9-9948-4be9-86a3-9def6cce4450-kube-api-access-wktj4\") pod \"certified-operators-tzs7n\" (UID: \"14c3d9a9-9948-4be9-86a3-9def6cce4450\") " pod="openshift-marketplace/certified-operators-tzs7n" Dec 02 09:28:06 crc kubenswrapper[4781]: I1202 09:28:06.632560 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14c3d9a9-9948-4be9-86a3-9def6cce4450-catalog-content\") pod \"certified-operators-tzs7n\" (UID: \"14c3d9a9-9948-4be9-86a3-9def6cce4450\") " pod="openshift-marketplace/certified-operators-tzs7n" Dec 02 09:28:06 crc kubenswrapper[4781]: I1202 09:28:06.632624 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14c3d9a9-9948-4be9-86a3-9def6cce4450-utilities\") pod \"certified-operators-tzs7n\" (UID: \"14c3d9a9-9948-4be9-86a3-9def6cce4450\") " pod="openshift-marketplace/certified-operators-tzs7n" Dec 02 09:28:06 crc kubenswrapper[4781]: I1202 09:28:06.632681 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wktj4\" (UniqueName: \"kubernetes.io/projected/14c3d9a9-9948-4be9-86a3-9def6cce4450-kube-api-access-wktj4\") pod \"certified-operators-tzs7n\" (UID: \"14c3d9a9-9948-4be9-86a3-9def6cce4450\") " pod="openshift-marketplace/certified-operators-tzs7n" Dec 02 09:28:06 crc kubenswrapper[4781]: I1202 09:28:06.633131 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14c3d9a9-9948-4be9-86a3-9def6cce4450-catalog-content\") pod \"certified-operators-tzs7n\" (UID: \"14c3d9a9-9948-4be9-86a3-9def6cce4450\") " pod="openshift-marketplace/certified-operators-tzs7n" Dec 02 09:28:06 crc kubenswrapper[4781]: I1202 09:28:06.633328 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14c3d9a9-9948-4be9-86a3-9def6cce4450-utilities\") pod \"certified-operators-tzs7n\" (UID: \"14c3d9a9-9948-4be9-86a3-9def6cce4450\") " pod="openshift-marketplace/certified-operators-tzs7n" Dec 02 09:28:06 crc kubenswrapper[4781]: I1202 09:28:06.658288 4781 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wktj4\" (UniqueName: \"kubernetes.io/projected/14c3d9a9-9948-4be9-86a3-9def6cce4450-kube-api-access-wktj4\") pod \"certified-operators-tzs7n\" (UID: \"14c3d9a9-9948-4be9-86a3-9def6cce4450\") " pod="openshift-marketplace/certified-operators-tzs7n" Dec 02 09:28:06 crc kubenswrapper[4781]: I1202 09:28:06.710581 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kbzzx"] Dec 02 09:28:06 crc kubenswrapper[4781]: I1202 09:28:06.711730 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kbzzx" Dec 02 09:28:06 crc kubenswrapper[4781]: I1202 09:28:06.714772 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 02 09:28:06 crc kubenswrapper[4781]: I1202 09:28:06.719465 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kbzzx"] Dec 02 09:28:06 crc kubenswrapper[4781]: I1202 09:28:06.734056 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2af2414f-0f9f-418b-b807-4362bf6ee700-utilities\") pod \"community-operators-kbzzx\" (UID: \"2af2414f-0f9f-418b-b807-4362bf6ee700\") " pod="openshift-marketplace/community-operators-kbzzx" Dec 02 09:28:06 crc kubenswrapper[4781]: I1202 09:28:06.734130 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2af2414f-0f9f-418b-b807-4362bf6ee700-catalog-content\") pod \"community-operators-kbzzx\" (UID: \"2af2414f-0f9f-418b-b807-4362bf6ee700\") " pod="openshift-marketplace/community-operators-kbzzx" Dec 02 09:28:06 crc kubenswrapper[4781]: I1202 09:28:06.734231 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtfnk\" (UniqueName: \"kubernetes.io/projected/2af2414f-0f9f-418b-b807-4362bf6ee700-kube-api-access-gtfnk\") pod \"community-operators-kbzzx\" (UID: \"2af2414f-0f9f-418b-b807-4362bf6ee700\") " pod="openshift-marketplace/community-operators-kbzzx" Dec 02 09:28:06 crc kubenswrapper[4781]: I1202 09:28:06.835011 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2af2414f-0f9f-418b-b807-4362bf6ee700-utilities\") pod \"community-operators-kbzzx\" (UID: \"2af2414f-0f9f-418b-b807-4362bf6ee700\") " pod="openshift-marketplace/community-operators-kbzzx" Dec 02 09:28:06 crc kubenswrapper[4781]: I1202 09:28:06.835095 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2af2414f-0f9f-418b-b807-4362bf6ee700-catalog-content\") pod \"community-operators-kbzzx\" (UID: \"2af2414f-0f9f-418b-b807-4362bf6ee700\") " pod="openshift-marketplace/community-operators-kbzzx" Dec 02 09:28:06 crc kubenswrapper[4781]: I1202 09:28:06.835138 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtfnk\" (UniqueName: \"kubernetes.io/projected/2af2414f-0f9f-418b-b807-4362bf6ee700-kube-api-access-gtfnk\") pod \"community-operators-kbzzx\" (UID: \"2af2414f-0f9f-418b-b807-4362bf6ee700\") " pod="openshift-marketplace/community-operators-kbzzx" Dec 02 09:28:06 crc kubenswrapper[4781]: I1202 
09:28:06.835539 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2af2414f-0f9f-418b-b807-4362bf6ee700-utilities\") pod \"community-operators-kbzzx\" (UID: \"2af2414f-0f9f-418b-b807-4362bf6ee700\") " pod="openshift-marketplace/community-operators-kbzzx" Dec 02 09:28:06 crc kubenswrapper[4781]: I1202 09:28:06.835578 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2af2414f-0f9f-418b-b807-4362bf6ee700-catalog-content\") pod \"community-operators-kbzzx\" (UID: \"2af2414f-0f9f-418b-b807-4362bf6ee700\") " pod="openshift-marketplace/community-operators-kbzzx" Dec 02 09:28:06 crc kubenswrapper[4781]: I1202 09:28:06.846938 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tzs7n" Dec 02 09:28:06 crc kubenswrapper[4781]: I1202 09:28:06.852328 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtfnk\" (UniqueName: \"kubernetes.io/projected/2af2414f-0f9f-418b-b807-4362bf6ee700-kube-api-access-gtfnk\") pod \"community-operators-kbzzx\" (UID: \"2af2414f-0f9f-418b-b807-4362bf6ee700\") " pod="openshift-marketplace/community-operators-kbzzx" Dec 02 09:28:07 crc kubenswrapper[4781]: I1202 09:28:07.040562 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kbzzx" Dec 02 09:28:07 crc kubenswrapper[4781]: I1202 09:28:07.251422 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tzs7n"] Dec 02 09:28:07 crc kubenswrapper[4781]: W1202 09:28:07.256193 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14c3d9a9_9948_4be9_86a3_9def6cce4450.slice/crio-79e40c8c56afe289cd55ddd6eb3e6c60fc704ee207d9eeda445332b48a2e137d WatchSource:0}: Error finding container 79e40c8c56afe289cd55ddd6eb3e6c60fc704ee207d9eeda445332b48a2e137d: Status 404 returned error can't find the container with id 79e40c8c56afe289cd55ddd6eb3e6c60fc704ee207d9eeda445332b48a2e137d Dec 02 09:28:07 crc kubenswrapper[4781]: I1202 09:28:07.430882 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kbzzx"] Dec 02 09:28:07 crc kubenswrapper[4781]: W1202 09:28:07.437943 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2af2414f_0f9f_418b_b807_4362bf6ee700.slice/crio-15e36dc6a647f1b089d80cc23fe6b048035af785b3d001bf7895081916dc3b15 WatchSource:0}: Error finding container 15e36dc6a647f1b089d80cc23fe6b048035af785b3d001bf7895081916dc3b15: Status 404 returned error can't find the container with id 15e36dc6a647f1b089d80cc23fe6b048035af785b3d001bf7895081916dc3b15 Dec 02 09:28:08 crc kubenswrapper[4781]: I1202 09:28:08.121239 4781 generic.go:334] "Generic (PLEG): container finished" podID="2af2414f-0f9f-418b-b807-4362bf6ee700" containerID="27d630f68eab5f05abb6a19fee812276154e64a860c5fb747d060e5b0a939587" exitCode=0 Dec 02 09:28:08 crc kubenswrapper[4781]: I1202 09:28:08.121280 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kbzzx" event={"ID":"2af2414f-0f9f-418b-b807-4362bf6ee700","Type":"ContainerDied","Data":"27d630f68eab5f05abb6a19fee812276154e64a860c5fb747d060e5b0a939587"} Dec 02 09:28:08 crc kubenswrapper[4781]: I1202 
09:28:08.121582 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kbzzx" event={"ID":"2af2414f-0f9f-418b-b807-4362bf6ee700","Type":"ContainerStarted","Data":"15e36dc6a647f1b089d80cc23fe6b048035af785b3d001bf7895081916dc3b15"} Dec 02 09:28:08 crc kubenswrapper[4781]: I1202 09:28:08.123949 4781 generic.go:334] "Generic (PLEG): container finished" podID="14c3d9a9-9948-4be9-86a3-9def6cce4450" containerID="1c1fd6c79e7935b6d5186463318a686aa33d68009044bceae44d10290d59e4e7" exitCode=0 Dec 02 09:28:08 crc kubenswrapper[4781]: I1202 09:28:08.123979 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tzs7n" event={"ID":"14c3d9a9-9948-4be9-86a3-9def6cce4450","Type":"ContainerDied","Data":"1c1fd6c79e7935b6d5186463318a686aa33d68009044bceae44d10290d59e4e7"} Dec 02 09:28:08 crc kubenswrapper[4781]: I1202 09:28:08.123998 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tzs7n" event={"ID":"14c3d9a9-9948-4be9-86a3-9def6cce4450","Type":"ContainerStarted","Data":"79e40c8c56afe289cd55ddd6eb3e6c60fc704ee207d9eeda445332b48a2e137d"} Dec 02 09:28:09 crc kubenswrapper[4781]: I1202 09:28:09.130506 4781 generic.go:334] "Generic (PLEG): container finished" podID="1996ded9-7aca-4ca6-b787-1c688c678893" containerID="c7ca78242271d3df64cf247e0083ea9c2a01e0cdd7dc84b06efead67fced5812" exitCode=0 Dec 02 09:28:09 crc kubenswrapper[4781]: I1202 09:28:09.131784 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6s4mn" event={"ID":"1996ded9-7aca-4ca6-b787-1c688c678893","Type":"ContainerDied","Data":"c7ca78242271d3df64cf247e0083ea9c2a01e0cdd7dc84b06efead67fced5812"} Dec 02 09:28:09 crc kubenswrapper[4781]: I1202 09:28:09.132848 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 09:28:09 crc kubenswrapper[4781]: I1202 09:28:09.256082 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" podUID="da84b89e-515f-4595-badf-a13b1ce0342a" containerName="registry" containerID="cri-o://d6f328b5a6c1718f5bab6bf30e2e97c2d89f117bd4bd787c424b55bb82f11bab" gracePeriod=30 Dec 02 09:28:10 crc kubenswrapper[4781]: I1202 09:28:10.142217 4781 generic.go:334] "Generic (PLEG): container finished" podID="da84b89e-515f-4595-badf-a13b1ce0342a" containerID="d6f328b5a6c1718f5bab6bf30e2e97c2d89f117bd4bd787c424b55bb82f11bab" exitCode=0 Dec 02 09:28:10 crc kubenswrapper[4781]: I1202 09:28:10.142327 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" event={"ID":"da84b89e-515f-4595-badf-a13b1ce0342a","Type":"ContainerDied","Data":"d6f328b5a6c1718f5bab6bf30e2e97c2d89f117bd4bd787c424b55bb82f11bab"} Dec 02 09:28:10 crc kubenswrapper[4781]: I1202 09:28:10.593968 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:28:10 crc kubenswrapper[4781]: I1202 09:28:10.690607 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/da84b89e-515f-4595-badf-a13b1ce0342a-bound-sa-token\") pod \"da84b89e-515f-4595-badf-a13b1ce0342a\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " Dec 02 09:28:10 crc kubenswrapper[4781]: I1202 09:28:10.691048 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/da84b89e-515f-4595-badf-a13b1ce0342a-installation-pull-secrets\") pod \"da84b89e-515f-4595-badf-a13b1ce0342a\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " Dec 02 09:28:10 crc kubenswrapper[4781]: I1202 09:28:10.691097 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzvnd\" (UniqueName: \"kubernetes.io/projected/da84b89e-515f-4595-badf-a13b1ce0342a-kube-api-access-bzvnd\") pod \"da84b89e-515f-4595-badf-a13b1ce0342a\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " Dec 02 09:28:10 crc kubenswrapper[4781]: I1202 09:28:10.691123 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/da84b89e-515f-4595-badf-a13b1ce0342a-registry-certificates\") pod \"da84b89e-515f-4595-badf-a13b1ce0342a\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " Dec 02 09:28:10 crc kubenswrapper[4781]: I1202 09:28:10.691143 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/da84b89e-515f-4595-badf-a13b1ce0342a-registry-tls\") pod \"da84b89e-515f-4595-badf-a13b1ce0342a\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " Dec 02 09:28:10 crc kubenswrapper[4781]: I1202 09:28:10.691341 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"da84b89e-515f-4595-badf-a13b1ce0342a\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " Dec 02 09:28:10 crc kubenswrapper[4781]: I1202 09:28:10.691406 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/da84b89e-515f-4595-badf-a13b1ce0342a-trusted-ca\") pod \"da84b89e-515f-4595-badf-a13b1ce0342a\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " Dec 02 09:28:10 crc kubenswrapper[4781]: I1202 09:28:10.691429 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/da84b89e-515f-4595-badf-a13b1ce0342a-ca-trust-extracted\") pod \"da84b89e-515f-4595-badf-a13b1ce0342a\" (UID: \"da84b89e-515f-4595-badf-a13b1ce0342a\") " Dec 02 09:28:10 crc kubenswrapper[4781]: I1202 09:28:10.693396 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da84b89e-515f-4595-badf-a13b1ce0342a-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "da84b89e-515f-4595-badf-a13b1ce0342a" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:28:10 crc kubenswrapper[4781]: I1202 09:28:10.694229 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da84b89e-515f-4595-badf-a13b1ce0342a-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "da84b89e-515f-4595-badf-a13b1ce0342a" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:28:10 crc kubenswrapper[4781]: I1202 09:28:10.698684 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da84b89e-515f-4595-badf-a13b1ce0342a-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "da84b89e-515f-4595-badf-a13b1ce0342a" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:28:10 crc kubenswrapper[4781]: I1202 09:28:10.699568 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da84b89e-515f-4595-badf-a13b1ce0342a-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "da84b89e-515f-4595-badf-a13b1ce0342a" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:28:10 crc kubenswrapper[4781]: I1202 09:28:10.704120 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da84b89e-515f-4595-badf-a13b1ce0342a-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "da84b89e-515f-4595-badf-a13b1ce0342a" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:28:10 crc kubenswrapper[4781]: I1202 09:28:10.704290 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da84b89e-515f-4595-badf-a13b1ce0342a-kube-api-access-bzvnd" (OuterVolumeSpecName: "kube-api-access-bzvnd") pod "da84b89e-515f-4595-badf-a13b1ce0342a" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a"). InnerVolumeSpecName "kube-api-access-bzvnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:28:10 crc kubenswrapper[4781]: I1202 09:28:10.713212 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da84b89e-515f-4595-badf-a13b1ce0342a-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "da84b89e-515f-4595-badf-a13b1ce0342a" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:28:10 crc kubenswrapper[4781]: I1202 09:28:10.741075 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "da84b89e-515f-4595-badf-a13b1ce0342a" (UID: "da84b89e-515f-4595-badf-a13b1ce0342a"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 02 09:28:10 crc kubenswrapper[4781]: I1202 09:28:10.792850 4781 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/da84b89e-515f-4595-badf-a13b1ce0342a-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 02 09:28:10 crc kubenswrapper[4781]: I1202 09:28:10.792882 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzvnd\" (UniqueName: \"kubernetes.io/projected/da84b89e-515f-4595-badf-a13b1ce0342a-kube-api-access-bzvnd\") on node \"crc\" DevicePath \"\"" Dec 02 09:28:10 crc kubenswrapper[4781]: I1202 09:28:10.792895 4781 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/da84b89e-515f-4595-badf-a13b1ce0342a-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 02 09:28:10 crc kubenswrapper[4781]: I1202 09:28:10.792906 4781 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/da84b89e-515f-4595-badf-a13b1ce0342a-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 02 09:28:10 crc kubenswrapper[4781]: I1202 09:28:10.792916 4781 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/da84b89e-515f-4595-badf-a13b1ce0342a-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 02 09:28:10 crc kubenswrapper[4781]: I1202 09:28:10.792941 4781 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/da84b89e-515f-4595-badf-a13b1ce0342a-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 02 09:28:10 crc kubenswrapper[4781]: I1202 09:28:10.792950 4781 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/da84b89e-515f-4595-badf-a13b1ce0342a-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 02 09:28:11 crc kubenswrapper[4781]: I1202 09:28:11.150914 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" event={"ID":"da84b89e-515f-4595-badf-a13b1ce0342a","Type":"ContainerDied","Data":"1804438e85599d0f35e265acaf4154791225608413d166e49e1e7fc042354ffb"} Dec 02 09:28:11 crc kubenswrapper[4781]: I1202 09:28:11.150989 4781 scope.go:117] "RemoveContainer" containerID="d6f328b5a6c1718f5bab6bf30e2e97c2d89f117bd4bd787c424b55bb82f11bab" Dec 02 09:28:11 crc kubenswrapper[4781]: I1202 09:28:11.151033 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-s95hx" Dec 02 09:28:11 crc kubenswrapper[4781]: I1202 09:28:11.181582 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-s95hx"] Dec 02 09:28:11 crc kubenswrapper[4781]: I1202 09:28:11.184831 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-s95hx"] Dec 02 09:28:11 crc kubenswrapper[4781]: I1202 09:28:11.507571 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da84b89e-515f-4595-badf-a13b1ce0342a" path="/var/lib/kubelet/pods/da84b89e-515f-4595-badf-a13b1ce0342a/volumes" Dec 02 09:28:19 crc kubenswrapper[4781]: I1202 09:28:19.190287 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kbzzx" event={"ID":"2af2414f-0f9f-418b-b807-4362bf6ee700","Type":"ContainerStarted","Data":"75c16bb8b86c860ba3a6dbda2c1f884b5dae83d5fcd264536c938155aa61b7a8"} Dec 02 09:28:19 crc kubenswrapper[4781]: I1202 09:28:19.193536 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6s4mn" event={"ID":"1996ded9-7aca-4ca6-b787-1c688c678893","Type":"ContainerStarted","Data":"3d474253783983f51a6c72cc9b5e7cf2487fab7e852b96479e2398f7dd21c686"} Dec 02 09:28:19 crc kubenswrapper[4781]: I1202 09:28:19.195201 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tzs7n" event={"ID":"14c3d9a9-9948-4be9-86a3-9def6cce4450","Type":"ContainerStarted","Data":"1313486f231018f4faf9b7694fc510af10d5ec91241410920d752ef8a2080ca8"} Dec 02 09:28:19 crc kubenswrapper[4781]: I1202 09:28:19.197156 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w2wcs" event={"ID":"5d8a5cfa-0f4a-4912-ad38-2ddfa06bbcf4","Type":"ContainerStarted","Data":"30dae20a8b61db8b8666a050695a4221aa0af06abcc57fd44e7700c078d2c866"} Dec 02 09:28:19 crc kubenswrapper[4781]: I1202 09:28:19.213179 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6s4mn" podStartSLOduration=3.446537733 podStartE2EDuration="15.21316189s" podCreationTimestamp="2025-12-02 09:28:04 +0000 UTC" firstStartedPulling="2025-12-02 09:28:06.110366635 +0000 UTC m=+448.934240514" lastFinishedPulling="2025-12-02 09:28:17.876990792 +0000 UTC m=+460.700864671" observedRunningTime="2025-12-02 09:28:19.210125517 +0000 UTC m=+462.033999396" watchObservedRunningTime="2025-12-02 09:28:19.21316189 +0000 UTC m=+462.037035769" Dec 02 09:28:20 crc kubenswrapper[4781]: I1202 09:28:20.204598 4781 generic.go:334] "Generic (PLEG): container finished" podID="5d8a5cfa-0f4a-4912-ad38-2ddfa06bbcf4" containerID="30dae20a8b61db8b8666a050695a4221aa0af06abcc57fd44e7700c078d2c866" exitCode=0 Dec 02 09:28:20 crc kubenswrapper[4781]: I1202 09:28:20.204723 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w2wcs" event={"ID":"5d8a5cfa-0f4a-4912-ad38-2ddfa06bbcf4","Type":"ContainerDied","Data":"30dae20a8b61db8b8666a050695a4221aa0af06abcc57fd44e7700c078d2c866"} Dec 02 09:28:20 crc kubenswrapper[4781]: I1202 09:28:20.207533 4781 generic.go:334] "Generic (PLEG): container finished" podID="2af2414f-0f9f-418b-b807-4362bf6ee700" containerID="75c16bb8b86c860ba3a6dbda2c1f884b5dae83d5fcd264536c938155aa61b7a8" exitCode=0 Dec 02 09:28:20 crc kubenswrapper[4781]: I1202 09:28:20.207583 4781 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-kbzzx" event={"ID":"2af2414f-0f9f-418b-b807-4362bf6ee700","Type":"ContainerDied","Data":"75c16bb8b86c860ba3a6dbda2c1f884b5dae83d5fcd264536c938155aa61b7a8"} Dec 02 09:28:20 crc kubenswrapper[4781]: I1202 09:28:20.214352 4781 generic.go:334] "Generic (PLEG): container finished" podID="14c3d9a9-9948-4be9-86a3-9def6cce4450" containerID="1313486f231018f4faf9b7694fc510af10d5ec91241410920d752ef8a2080ca8" exitCode=0 Dec 02 09:28:20 crc kubenswrapper[4781]: I1202 09:28:20.214443 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tzs7n" event={"ID":"14c3d9a9-9948-4be9-86a3-9def6cce4450","Type":"ContainerDied","Data":"1313486f231018f4faf9b7694fc510af10d5ec91241410920d752ef8a2080ca8"} Dec 02 09:28:22 crc kubenswrapper[4781]: I1202 09:28:22.227259 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w2wcs" event={"ID":"5d8a5cfa-0f4a-4912-ad38-2ddfa06bbcf4","Type":"ContainerStarted","Data":"7c1e445ae3d5282ffadf42bac546469e6721abbe29ec0d5514785242d655fea5"} Dec 02 09:28:22 crc kubenswrapper[4781]: I1202 09:28:22.229967 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kbzzx" event={"ID":"2af2414f-0f9f-418b-b807-4362bf6ee700","Type":"ContainerStarted","Data":"86249df209579fc2a9dd3fe2274a019994e5f5df37fcb3429112b3fca07bd1b9"} Dec 02 09:28:22 crc kubenswrapper[4781]: I1202 09:28:22.236079 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tzs7n" event={"ID":"14c3d9a9-9948-4be9-86a3-9def6cce4450","Type":"ContainerStarted","Data":"08d24f041d37fcd264fe63d3cb0ac6900cd974e11d0073e7bf738cb6b715ce59"} Dec 02 09:28:22 crc kubenswrapper[4781]: I1202 09:28:22.251977 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-w2wcs" podStartSLOduration=3.147870576 podStartE2EDuration="18.25195906s" podCreationTimestamp="2025-12-02 09:28:04 +0000 UTC" firstStartedPulling="2025-12-02 09:28:06.10757133 +0000 UTC m=+448.931445209" lastFinishedPulling="2025-12-02 09:28:21.211659814 +0000 UTC m=+464.035533693" observedRunningTime="2025-12-02 09:28:22.248347782 +0000 UTC m=+465.072221661" watchObservedRunningTime="2025-12-02 09:28:22.25195906 +0000 UTC m=+465.075832939" Dec 02 09:28:22 crc kubenswrapper[4781]: I1202 09:28:22.271398 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tzs7n" podStartSLOduration=2.784952313 podStartE2EDuration="16.271379154s" podCreationTimestamp="2025-12-02 09:28:06 +0000 UTC" firstStartedPulling="2025-12-02 09:28:08.126064542 +0000 UTC m=+450.949938421" lastFinishedPulling="2025-12-02 09:28:21.612491383 +0000 UTC m=+464.436365262" observedRunningTime="2025-12-02 09:28:22.269005279 +0000 UTC m=+465.092879168" watchObservedRunningTime="2025-12-02 09:28:22.271379154 +0000 UTC m=+465.095253033" Dec 02 09:28:22 crc kubenswrapper[4781]: I1202 09:28:22.286523 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kbzzx" podStartSLOduration=2.514896482 podStartE2EDuration="16.28650418s" podCreationTimestamp="2025-12-02 09:28:06 +0000 UTC" firstStartedPulling="2025-12-02 09:28:08.122472705 +0000 UTC m=+450.946346584" lastFinishedPulling="2025-12-02 09:28:21.894080403 +0000 UTC m=+464.717954282" observedRunningTime="2025-12-02 09:28:22.284360361 +0000 
UTC m=+465.108234250" watchObservedRunningTime="2025-12-02 09:28:22.28650418 +0000 UTC m=+465.110378059" Dec 02 09:28:24 crc kubenswrapper[4781]: I1202 09:28:24.485090 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6s4mn" Dec 02 09:28:24 crc kubenswrapper[4781]: I1202 09:28:24.485438 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6s4mn" Dec 02 09:28:24 crc kubenswrapper[4781]: I1202 09:28:24.522460 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6s4mn" Dec 02 09:28:24 crc kubenswrapper[4781]: I1202 09:28:24.675881 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-w2wcs" Dec 02 09:28:24 crc kubenswrapper[4781]: I1202 09:28:24.675963 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-w2wcs" Dec 02 09:28:25 crc kubenswrapper[4781]: I1202 09:28:25.286269 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6s4mn" Dec 02 09:28:25 crc kubenswrapper[4781]: I1202 09:28:25.714715 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-w2wcs" podUID="5d8a5cfa-0f4a-4912-ad38-2ddfa06bbcf4" containerName="registry-server" probeResult="failure" output=< Dec 02 09:28:25 crc kubenswrapper[4781]: timeout: failed to connect service ":50051" within 1s Dec 02 09:28:25 crc kubenswrapper[4781]: > Dec 02 09:28:26 crc kubenswrapper[4781]: I1202 09:28:26.848014 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tzs7n" Dec 02 09:28:26 crc kubenswrapper[4781]: I1202 09:28:26.848076 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tzs7n" Dec 02 09:28:26 crc kubenswrapper[4781]: I1202 09:28:26.881969 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tzs7n" Dec 02 09:28:27 crc kubenswrapper[4781]: I1202 09:28:27.041229 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kbzzx" Dec 02 09:28:27 crc kubenswrapper[4781]: I1202 09:28:27.041284 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kbzzx" Dec 02 09:28:27 crc kubenswrapper[4781]: I1202 09:28:27.082098 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kbzzx" Dec 02 09:28:27 crc kubenswrapper[4781]: I1202 09:28:27.306218 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tzs7n" Dec 02 09:28:27 crc kubenswrapper[4781]: I1202 09:28:27.323765 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kbzzx" Dec 02 09:28:34 crc kubenswrapper[4781]: I1202 09:28:34.720784 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-w2wcs" Dec 02 09:28:34 crc kubenswrapper[4781]: I1202 09:28:34.786526 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-w2wcs" Dec 
02 09:30:00 crc kubenswrapper[4781]: I1202 09:30:00.152015 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411130-s9jw5"] Dec 02 09:30:00 crc kubenswrapper[4781]: E1202 09:30:00.152887 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da84b89e-515f-4595-badf-a13b1ce0342a" containerName="registry" Dec 02 09:30:00 crc kubenswrapper[4781]: I1202 09:30:00.152905 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="da84b89e-515f-4595-badf-a13b1ce0342a" containerName="registry" Dec 02 09:30:00 crc kubenswrapper[4781]: I1202 09:30:00.153008 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="da84b89e-515f-4595-badf-a13b1ce0342a" containerName="registry" Dec 02 09:30:00 crc kubenswrapper[4781]: I1202 09:30:00.154350 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411130-s9jw5" Dec 02 09:30:00 crc kubenswrapper[4781]: I1202 09:30:00.157342 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 09:30:00 crc kubenswrapper[4781]: I1202 09:30:00.158437 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411130-s9jw5"] Dec 02 09:30:00 crc kubenswrapper[4781]: I1202 09:30:00.158846 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 09:30:00 crc kubenswrapper[4781]: I1202 09:30:00.284212 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bef575f2-653f-42b4-a40a-33610a164402-config-volume\") pod \"collect-profiles-29411130-s9jw5\" (UID: \"bef575f2-653f-42b4-a40a-33610a164402\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411130-s9jw5" Dec 02 09:30:00 crc kubenswrapper[4781]: I1202 09:30:00.284266 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bef575f2-653f-42b4-a40a-33610a164402-secret-volume\") pod \"collect-profiles-29411130-s9jw5\" (UID: \"bef575f2-653f-42b4-a40a-33610a164402\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411130-s9jw5" Dec 02 09:30:00 crc kubenswrapper[4781]: I1202 09:30:00.284328 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppnns\" (UniqueName: \"kubernetes.io/projected/bef575f2-653f-42b4-a40a-33610a164402-kube-api-access-ppnns\") pod \"collect-profiles-29411130-s9jw5\" (UID: \"bef575f2-653f-42b4-a40a-33610a164402\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411130-s9jw5" Dec 02 09:30:00 crc kubenswrapper[4781]: I1202 09:30:00.385508 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppnns\" (UniqueName: \"kubernetes.io/projected/bef575f2-653f-42b4-a40a-33610a164402-kube-api-access-ppnns\") pod \"collect-profiles-29411130-s9jw5\" (UID: \"bef575f2-653f-42b4-a40a-33610a164402\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411130-s9jw5" Dec 02 09:30:00 crc kubenswrapper[4781]: I1202 09:30:00.385576 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/bef575f2-653f-42b4-a40a-33610a164402-config-volume\") pod \"collect-profiles-29411130-s9jw5\" (UID: \"bef575f2-653f-42b4-a40a-33610a164402\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411130-s9jw5" Dec 02 09:30:00 crc kubenswrapper[4781]: I1202 09:30:00.385594 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bef575f2-653f-42b4-a40a-33610a164402-secret-volume\") pod \"collect-profiles-29411130-s9jw5\" (UID: \"bef575f2-653f-42b4-a40a-33610a164402\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411130-s9jw5" Dec 02 09:30:00 crc kubenswrapper[4781]: I1202 09:30:00.386611 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bef575f2-653f-42b4-a40a-33610a164402-config-volume\") pod \"collect-profiles-29411130-s9jw5\" (UID: \"bef575f2-653f-42b4-a40a-33610a164402\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411130-s9jw5" Dec 02 09:30:00 crc kubenswrapper[4781]: I1202 09:30:00.392163 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bef575f2-653f-42b4-a40a-33610a164402-secret-volume\") pod \"collect-profiles-29411130-s9jw5\" (UID: \"bef575f2-653f-42b4-a40a-33610a164402\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411130-s9jw5" Dec 02 09:30:00 crc kubenswrapper[4781]: I1202 09:30:00.403224 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppnns\" (UniqueName: \"kubernetes.io/projected/bef575f2-653f-42b4-a40a-33610a164402-kube-api-access-ppnns\") pod \"collect-profiles-29411130-s9jw5\" (UID: \"bef575f2-653f-42b4-a40a-33610a164402\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411130-s9jw5" Dec 02 09:30:00 crc kubenswrapper[4781]: I1202 09:30:00.488038 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411130-s9jw5" Dec 02 09:30:00 crc kubenswrapper[4781]: I1202 09:30:00.873091 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411130-s9jw5"] Dec 02 09:30:01 crc kubenswrapper[4781]: I1202 09:30:01.437818 4781 generic.go:334] "Generic (PLEG): container finished" podID="bef575f2-653f-42b4-a40a-33610a164402" containerID="ee52b9632095920fd4cb794cb36f983e8884d348294428288218946fddf7fe31" exitCode=0 Dec 02 09:30:01 crc kubenswrapper[4781]: I1202 09:30:01.437882 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411130-s9jw5" event={"ID":"bef575f2-653f-42b4-a40a-33610a164402","Type":"ContainerDied","Data":"ee52b9632095920fd4cb794cb36f983e8884d348294428288218946fddf7fe31"} Dec 02 09:30:01 crc kubenswrapper[4781]: I1202 09:30:01.438204 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411130-s9jw5" event={"ID":"bef575f2-653f-42b4-a40a-33610a164402","Type":"ContainerStarted","Data":"21f7f3d4e2f3826038200bef3df7374f8029a6a4c22d1251ae7efffa7b705c51"} Dec 02 09:30:02 crc kubenswrapper[4781]: I1202 09:30:02.659767 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411130-s9jw5" Dec 02 09:30:02 crc kubenswrapper[4781]: I1202 09:30:02.813913 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppnns\" (UniqueName: \"kubernetes.io/projected/bef575f2-653f-42b4-a40a-33610a164402-kube-api-access-ppnns\") pod \"bef575f2-653f-42b4-a40a-33610a164402\" (UID: \"bef575f2-653f-42b4-a40a-33610a164402\") " Dec 02 09:30:02 crc kubenswrapper[4781]: I1202 09:30:02.814105 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bef575f2-653f-42b4-a40a-33610a164402-secret-volume\") pod \"bef575f2-653f-42b4-a40a-33610a164402\" (UID: \"bef575f2-653f-42b4-a40a-33610a164402\") " Dec 02 09:30:02 crc kubenswrapper[4781]: I1202 09:30:02.814139 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bef575f2-653f-42b4-a40a-33610a164402-config-volume\") pod \"bef575f2-653f-42b4-a40a-33610a164402\" (UID: \"bef575f2-653f-42b4-a40a-33610a164402\") " Dec 02 09:30:02 crc kubenswrapper[4781]: I1202 09:30:02.814787 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bef575f2-653f-42b4-a40a-33610a164402-config-volume" (OuterVolumeSpecName: "config-volume") pod "bef575f2-653f-42b4-a40a-33610a164402" (UID: "bef575f2-653f-42b4-a40a-33610a164402"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:30:02 crc kubenswrapper[4781]: I1202 09:30:02.819045 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bef575f2-653f-42b4-a40a-33610a164402-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "bef575f2-653f-42b4-a40a-33610a164402" (UID: "bef575f2-653f-42b4-a40a-33610a164402"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:30:02 crc kubenswrapper[4781]: I1202 09:30:02.819088 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bef575f2-653f-42b4-a40a-33610a164402-kube-api-access-ppnns" (OuterVolumeSpecName: "kube-api-access-ppnns") pod "bef575f2-653f-42b4-a40a-33610a164402" (UID: "bef575f2-653f-42b4-a40a-33610a164402"). InnerVolumeSpecName "kube-api-access-ppnns". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:30:02 crc kubenswrapper[4781]: I1202 09:30:02.915633 4781 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bef575f2-653f-42b4-a40a-33610a164402-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 02 09:30:02 crc kubenswrapper[4781]: I1202 09:30:02.915671 4781 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bef575f2-653f-42b4-a40a-33610a164402-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 09:30:02 crc kubenswrapper[4781]: I1202 09:30:02.915681 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppnns\" (UniqueName: \"kubernetes.io/projected/bef575f2-653f-42b4-a40a-33610a164402-kube-api-access-ppnns\") on node \"crc\" DevicePath \"\"" Dec 02 09:30:03 crc kubenswrapper[4781]: I1202 09:30:03.451150 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411130-s9jw5" event={"ID":"bef575f2-653f-42b4-a40a-33610a164402","Type":"ContainerDied","Data":"21f7f3d4e2f3826038200bef3df7374f8029a6a4c22d1251ae7efffa7b705c51"} Dec 02 09:30:03 crc kubenswrapper[4781]: I1202 09:30:03.451191 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21f7f3d4e2f3826038200bef3df7374f8029a6a4c22d1251ae7efffa7b705c51" Dec 02 09:30:03 crc kubenswrapper[4781]: I1202 09:30:03.451226 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411130-s9jw5" Dec 02 09:30:30 crc kubenswrapper[4781]: I1202 09:30:30.412824 4781 patch_prober.go:28] interesting pod/machine-config-daemon-pzntm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 09:30:30 crc kubenswrapper[4781]: I1202 09:30:30.413554 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 09:31:00 crc kubenswrapper[4781]: I1202 09:31:00.412356 4781 patch_prober.go:28] interesting pod/machine-config-daemon-pzntm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 09:31:00 crc kubenswrapper[4781]: I1202 09:31:00.413174 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 09:31:30 crc kubenswrapper[4781]: I1202 09:31:30.412408 4781 patch_prober.go:28] interesting pod/machine-config-daemon-pzntm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 09:31:30 crc kubenswrapper[4781]: I1202 
09:31:30.413037 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 09:31:30 crc kubenswrapper[4781]: I1202 09:31:30.413123 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" Dec 02 09:31:30 crc kubenswrapper[4781]: I1202 09:31:30.413709 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b4ee060e26e3e3e4831e7dd0ea5a0ab99f2012d5edf1e6348cbe970647009c50"} pod="openshift-machine-config-operator/machine-config-daemon-pzntm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 09:31:30 crc kubenswrapper[4781]: I1202 09:31:30.413763 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" containerID="cri-o://b4ee060e26e3e3e4831e7dd0ea5a0ab99f2012d5edf1e6348cbe970647009c50" gracePeriod=600 Dec 02 09:31:30 crc kubenswrapper[4781]: I1202 09:31:30.911025 4781 generic.go:334] "Generic (PLEG): container finished" podID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerID="b4ee060e26e3e3e4831e7dd0ea5a0ab99f2012d5edf1e6348cbe970647009c50" exitCode=0 Dec 02 09:31:30 crc kubenswrapper[4781]: I1202 09:31:30.911107 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" event={"ID":"e10258da-dad3-4df8-82c2-9d9438493a3d","Type":"ContainerDied","Data":"b4ee060e26e3e3e4831e7dd0ea5a0ab99f2012d5edf1e6348cbe970647009c50"} Dec 02 09:31:30 crc kubenswrapper[4781]: I1202 09:31:30.911373 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" event={"ID":"e10258da-dad3-4df8-82c2-9d9438493a3d","Type":"ContainerStarted","Data":"a295986289f2c69b34cac9164fd7741bae073fa0ffe0145fa34c6c835c2b0b11"} Dec 02 09:31:30 crc kubenswrapper[4781]: I1202 09:31:30.911399 4781 scope.go:117] "RemoveContainer" containerID="ebcdebd6c3481dcb7b7c492991920a861406e047665bbf99c0df4f3ff4534e5b" Dec 02 09:33:11 crc kubenswrapper[4781]: I1202 09:33:11.026981 4781 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 02 09:33:30 crc kubenswrapper[4781]: I1202 09:33:30.412348 4781 patch_prober.go:28] interesting pod/machine-config-daemon-pzntm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 09:33:30 crc kubenswrapper[4781]: I1202 09:33:30.412886 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 09:34:00 crc kubenswrapper[4781]: I1202 09:34:00.411765 4781 patch_prober.go:28] interesting 
pod/machine-config-daemon-pzntm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 09:34:00 crc kubenswrapper[4781]: I1202 09:34:00.412282 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 09:34:12 crc kubenswrapper[4781]: I1202 09:34:12.948978 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-m2dcv"] Dec 02 09:34:12 crc kubenswrapper[4781]: E1202 09:34:12.949981 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bef575f2-653f-42b4-a40a-33610a164402" containerName="collect-profiles" Dec 02 09:34:12 crc kubenswrapper[4781]: I1202 09:34:12.949998 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="bef575f2-653f-42b4-a40a-33610a164402" containerName="collect-profiles" Dec 02 09:34:12 crc kubenswrapper[4781]: I1202 09:34:12.950239 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="bef575f2-653f-42b4-a40a-33610a164402" containerName="collect-profiles" Dec 02 09:34:12 crc kubenswrapper[4781]: I1202 09:34:12.951357 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m2dcv" Dec 02 09:34:12 crc kubenswrapper[4781]: I1202 09:34:12.960647 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m2dcv"] Dec 02 09:34:13 crc kubenswrapper[4781]: I1202 09:34:13.053710 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfb87bc5-e50a-4796-b230-d44366b0db4c-catalog-content\") pod \"community-operators-m2dcv\" (UID: \"cfb87bc5-e50a-4796-b230-d44366b0db4c\") " pod="openshift-marketplace/community-operators-m2dcv" Dec 02 09:34:13 crc kubenswrapper[4781]: I1202 09:34:13.053796 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzzhv\" (UniqueName: \"kubernetes.io/projected/cfb87bc5-e50a-4796-b230-d44366b0db4c-kube-api-access-wzzhv\") pod \"community-operators-m2dcv\" (UID: \"cfb87bc5-e50a-4796-b230-d44366b0db4c\") " pod="openshift-marketplace/community-operators-m2dcv" Dec 02 09:34:13 crc kubenswrapper[4781]: I1202 09:34:13.053900 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfb87bc5-e50a-4796-b230-d44366b0db4c-utilities\") pod \"community-operators-m2dcv\" (UID: \"cfb87bc5-e50a-4796-b230-d44366b0db4c\") " pod="openshift-marketplace/community-operators-m2dcv" Dec 02 09:34:13 crc kubenswrapper[4781]: I1202 09:34:13.155505 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfb87bc5-e50a-4796-b230-d44366b0db4c-catalog-content\") pod \"community-operators-m2dcv\" (UID: \"cfb87bc5-e50a-4796-b230-d44366b0db4c\") " pod="openshift-marketplace/community-operators-m2dcv" Dec 02 09:34:13 crc kubenswrapper[4781]: I1202 09:34:13.155904 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wzzhv\" (UniqueName: \"kubernetes.io/projected/cfb87bc5-e50a-4796-b230-d44366b0db4c-kube-api-access-wzzhv\") pod \"community-operators-m2dcv\" (UID: \"cfb87bc5-e50a-4796-b230-d44366b0db4c\") " pod="openshift-marketplace/community-operators-m2dcv" Dec 02 09:34:13 crc kubenswrapper[4781]: I1202 09:34:13.156006 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfb87bc5-e50a-4796-b230-d44366b0db4c-utilities\") pod \"community-operators-m2dcv\" (UID: \"cfb87bc5-e50a-4796-b230-d44366b0db4c\") " pod="openshift-marketplace/community-operators-m2dcv" Dec 02 09:34:13 crc kubenswrapper[4781]: I1202 09:34:13.156098 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfb87bc5-e50a-4796-b230-d44366b0db4c-catalog-content\") pod \"community-operators-m2dcv\" (UID: \"cfb87bc5-e50a-4796-b230-d44366b0db4c\") " pod="openshift-marketplace/community-operators-m2dcv" Dec 02 09:34:13 crc kubenswrapper[4781]: I1202 09:34:13.156603 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfb87bc5-e50a-4796-b230-d44366b0db4c-utilities\") pod \"community-operators-m2dcv\" (UID: \"cfb87bc5-e50a-4796-b230-d44366b0db4c\") " pod="openshift-marketplace/community-operators-m2dcv" Dec 02 09:34:13 crc kubenswrapper[4781]: I1202 09:34:13.177356 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzzhv\" (UniqueName: \"kubernetes.io/projected/cfb87bc5-e50a-4796-b230-d44366b0db4c-kube-api-access-wzzhv\") pod \"community-operators-m2dcv\" (UID: \"cfb87bc5-e50a-4796-b230-d44366b0db4c\") " pod="openshift-marketplace/community-operators-m2dcv" Dec 02 09:34:13 crc kubenswrapper[4781]: I1202 09:34:13.267792 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m2dcv" Dec 02 09:34:13 crc kubenswrapper[4781]: I1202 09:34:13.719718 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m2dcv"] Dec 02 09:34:14 crc kubenswrapper[4781]: I1202 09:34:14.425150 4781 generic.go:334] "Generic (PLEG): container finished" podID="cfb87bc5-e50a-4796-b230-d44366b0db4c" containerID="0eb10cec395fc8b6b902f2cbc36dec53eac483080427c6ed8dcb091eddb70e2f" exitCode=0 Dec 02 09:34:14 crc kubenswrapper[4781]: I1202 09:34:14.425193 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m2dcv" event={"ID":"cfb87bc5-e50a-4796-b230-d44366b0db4c","Type":"ContainerDied","Data":"0eb10cec395fc8b6b902f2cbc36dec53eac483080427c6ed8dcb091eddb70e2f"} Dec 02 09:34:14 crc kubenswrapper[4781]: I1202 09:34:14.425215 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m2dcv" event={"ID":"cfb87bc5-e50a-4796-b230-d44366b0db4c","Type":"ContainerStarted","Data":"ee66f5bd3646d0c16f021dca7dfc427e1c733e291ca001054557aea62c32c01c"} Dec 02 09:34:14 crc kubenswrapper[4781]: I1202 09:34:14.427992 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 09:34:16 crc kubenswrapper[4781]: I1202 09:34:16.437684 4781 generic.go:334] "Generic (PLEG): container finished" podID="cfb87bc5-e50a-4796-b230-d44366b0db4c" containerID="5bcdfcb2e3f4d4f6e13aba3e6ed21c4ecef7c6dc06c303061c35a023218f419e" exitCode=0 Dec 02 09:34:16 crc kubenswrapper[4781]: I1202 09:34:16.437753 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m2dcv" event={"ID":"cfb87bc5-e50a-4796-b230-d44366b0db4c","Type":"ContainerDied","Data":"5bcdfcb2e3f4d4f6e13aba3e6ed21c4ecef7c6dc06c303061c35a023218f419e"} Dec 02 09:34:17 crc kubenswrapper[4781]: I1202 09:34:17.448001 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m2dcv" event={"ID":"cfb87bc5-e50a-4796-b230-d44366b0db4c","Type":"ContainerStarted","Data":"2d0845813d529a29014bef1d06889f365f9cde489f2f0224d26f94882da02d24"} Dec 02 09:34:23 crc kubenswrapper[4781]: I1202 09:34:23.268706 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-m2dcv" Dec 02 09:34:23 crc kubenswrapper[4781]: I1202 09:34:23.268767 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-m2dcv" Dec 02 09:34:23 crc kubenswrapper[4781]: I1202 09:34:23.314800 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-m2dcv" Dec 02 09:34:23 crc kubenswrapper[4781]: I1202 09:34:23.329809 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-m2dcv" podStartSLOduration=8.81388928 podStartE2EDuration="11.329778631s" podCreationTimestamp="2025-12-02 09:34:12 +0000 UTC" firstStartedPulling="2025-12-02 09:34:14.427689213 +0000 UTC m=+817.251563102" lastFinishedPulling="2025-12-02 09:34:16.943578574 +0000 UTC m=+819.767452453" observedRunningTime="2025-12-02 09:34:17.466580773 +0000 UTC m=+820.290454652" watchObservedRunningTime="2025-12-02 09:34:23.329778631 +0000 UTC m=+826.153652550" Dec 02 09:34:23 crc kubenswrapper[4781]: I1202 09:34:23.529376 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/community-operators-m2dcv" Dec 02 09:34:23 crc kubenswrapper[4781]: I1202 09:34:23.578743 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m2dcv"] Dec 02 09:34:25 crc kubenswrapper[4781]: I1202 09:34:25.491387 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-m2dcv" podUID="cfb87bc5-e50a-4796-b230-d44366b0db4c" containerName="registry-server" containerID="cri-o://2d0845813d529a29014bef1d06889f365f9cde489f2f0224d26f94882da02d24" gracePeriod=2 Dec 02 09:34:26 crc kubenswrapper[4781]: I1202 09:34:26.499321 4781 generic.go:334] "Generic (PLEG): container finished" podID="cfb87bc5-e50a-4796-b230-d44366b0db4c" containerID="2d0845813d529a29014bef1d06889f365f9cde489f2f0224d26f94882da02d24" exitCode=0 Dec 02 09:34:26 crc kubenswrapper[4781]: I1202 09:34:26.499412 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m2dcv" event={"ID":"cfb87bc5-e50a-4796-b230-d44366b0db4c","Type":"ContainerDied","Data":"2d0845813d529a29014bef1d06889f365f9cde489f2f0224d26f94882da02d24"} Dec 02 09:34:26 crc kubenswrapper[4781]: I1202 09:34:26.939122 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m2dcv" Dec 02 09:34:27 crc kubenswrapper[4781]: I1202 09:34:27.029136 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfb87bc5-e50a-4796-b230-d44366b0db4c-utilities\") pod \"cfb87bc5-e50a-4796-b230-d44366b0db4c\" (UID: \"cfb87bc5-e50a-4796-b230-d44366b0db4c\") " Dec 02 09:34:27 crc kubenswrapper[4781]: I1202 09:34:27.029245 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzzhv\" (UniqueName: \"kubernetes.io/projected/cfb87bc5-e50a-4796-b230-d44366b0db4c-kube-api-access-wzzhv\") pod \"cfb87bc5-e50a-4796-b230-d44366b0db4c\" (UID: \"cfb87bc5-e50a-4796-b230-d44366b0db4c\") " Dec 02 09:34:27 crc kubenswrapper[4781]: I1202 09:34:27.029324 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfb87bc5-e50a-4796-b230-d44366b0db4c-catalog-content\") pod \"cfb87bc5-e50a-4796-b230-d44366b0db4c\" (UID: \"cfb87bc5-e50a-4796-b230-d44366b0db4c\") " Dec 02 09:34:27 crc kubenswrapper[4781]: I1202 09:34:27.030340 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfb87bc5-e50a-4796-b230-d44366b0db4c-utilities" (OuterVolumeSpecName: "utilities") pod "cfb87bc5-e50a-4796-b230-d44366b0db4c" (UID: "cfb87bc5-e50a-4796-b230-d44366b0db4c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:34:27 crc kubenswrapper[4781]: I1202 09:34:27.035829 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfb87bc5-e50a-4796-b230-d44366b0db4c-kube-api-access-wzzhv" (OuterVolumeSpecName: "kube-api-access-wzzhv") pod "cfb87bc5-e50a-4796-b230-d44366b0db4c" (UID: "cfb87bc5-e50a-4796-b230-d44366b0db4c"). InnerVolumeSpecName "kube-api-access-wzzhv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:34:27 crc kubenswrapper[4781]: I1202 09:34:27.085401 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfb87bc5-e50a-4796-b230-d44366b0db4c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cfb87bc5-e50a-4796-b230-d44366b0db4c" (UID: "cfb87bc5-e50a-4796-b230-d44366b0db4c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:34:27 crc kubenswrapper[4781]: I1202 09:34:27.131446 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfb87bc5-e50a-4796-b230-d44366b0db4c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 09:34:27 crc kubenswrapper[4781]: I1202 09:34:27.131489 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfb87bc5-e50a-4796-b230-d44366b0db4c-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 09:34:27 crc kubenswrapper[4781]: I1202 09:34:27.131500 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzzhv\" (UniqueName: \"kubernetes.io/projected/cfb87bc5-e50a-4796-b230-d44366b0db4c-kube-api-access-wzzhv\") on node \"crc\" DevicePath \"\"" Dec 02 09:34:27 crc kubenswrapper[4781]: I1202 09:34:27.511504 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m2dcv" event={"ID":"cfb87bc5-e50a-4796-b230-d44366b0db4c","Type":"ContainerDied","Data":"ee66f5bd3646d0c16f021dca7dfc427e1c733e291ca001054557aea62c32c01c"} Dec 02 09:34:27 crc kubenswrapper[4781]: I1202 09:34:27.511577 4781 scope.go:117] "RemoveContainer" containerID="2d0845813d529a29014bef1d06889f365f9cde489f2f0224d26f94882da02d24" Dec 02 09:34:27 crc kubenswrapper[4781]: I1202 09:34:27.511747 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m2dcv" Dec 02 09:34:27 crc kubenswrapper[4781]: I1202 09:34:27.540669 4781 scope.go:117] "RemoveContainer" containerID="5bcdfcb2e3f4d4f6e13aba3e6ed21c4ecef7c6dc06c303061c35a023218f419e" Dec 02 09:34:27 crc kubenswrapper[4781]: I1202 09:34:27.569976 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m2dcv"] Dec 02 09:34:27 crc kubenswrapper[4781]: I1202 09:34:27.575117 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-m2dcv"] Dec 02 09:34:27 crc kubenswrapper[4781]: I1202 09:34:27.583085 4781 scope.go:117] "RemoveContainer" containerID="0eb10cec395fc8b6b902f2cbc36dec53eac483080427c6ed8dcb091eddb70e2f" Dec 02 09:34:29 crc kubenswrapper[4781]: I1202 09:34:29.505567 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfb87bc5-e50a-4796-b230-d44366b0db4c" path="/var/lib/kubelet/pods/cfb87bc5-e50a-4796-b230-d44366b0db4c/volumes" Dec 02 09:34:30 crc kubenswrapper[4781]: I1202 09:34:30.412989 4781 patch_prober.go:28] interesting pod/machine-config-daemon-pzntm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 09:34:30 crc kubenswrapper[4781]: I1202 09:34:30.413134 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 09:34:30 crc kubenswrapper[4781]: I1202 09:34:30.413231 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" Dec 02 09:34:30 crc kubenswrapper[4781]: I1202 09:34:30.414472 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a295986289f2c69b34cac9164fd7741bae073fa0ffe0145fa34c6c835c2b0b11"} pod="openshift-machine-config-operator/machine-config-daemon-pzntm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 09:34:30 crc kubenswrapper[4781]: I1202 09:34:30.414595 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" containerID="cri-o://a295986289f2c69b34cac9164fd7741bae073fa0ffe0145fa34c6c835c2b0b11" gracePeriod=600 Dec 02 09:34:31 crc kubenswrapper[4781]: I1202 09:34:31.543239 4781 generic.go:334] "Generic (PLEG): container finished" podID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerID="a295986289f2c69b34cac9164fd7741bae073fa0ffe0145fa34c6c835c2b0b11" exitCode=0 Dec 02 09:34:31 crc kubenswrapper[4781]: I1202 09:34:31.543277 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" event={"ID":"e10258da-dad3-4df8-82c2-9d9438493a3d","Type":"ContainerDied","Data":"a295986289f2c69b34cac9164fd7741bae073fa0ffe0145fa34c6c835c2b0b11"} Dec 02 09:34:31 crc kubenswrapper[4781]: I1202 09:34:31.543662 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-pzntm" event={"ID":"e10258da-dad3-4df8-82c2-9d9438493a3d","Type":"ContainerStarted","Data":"4d8ba3ba707695a318e7a4765e850fbeffac37a0664ff39d2592ecb9000863ef"} Dec 02 09:34:31 crc kubenswrapper[4781]: I1202 09:34:31.543693 4781 scope.go:117] "RemoveContainer" containerID="b4ee060e26e3e3e4831e7dd0ea5a0ab99f2012d5edf1e6348cbe970647009c50" Dec 02 09:34:34 crc kubenswrapper[4781]: I1202 09:34:34.129384 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vptq6"] Dec 02 09:34:34 crc kubenswrapper[4781]: E1202 09:34:34.129916 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfb87bc5-e50a-4796-b230-d44366b0db4c" containerName="extract-utilities" Dec 02 09:34:34 crc kubenswrapper[4781]: I1202 09:34:34.129949 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfb87bc5-e50a-4796-b230-d44366b0db4c" containerName="extract-utilities" Dec 02 09:34:34 crc kubenswrapper[4781]: E1202 09:34:34.129963 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfb87bc5-e50a-4796-b230-d44366b0db4c" containerName="registry-server" Dec 02 09:34:34 crc kubenswrapper[4781]: I1202 09:34:34.129969 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfb87bc5-e50a-4796-b230-d44366b0db4c" containerName="registry-server" Dec 02 09:34:34 crc kubenswrapper[4781]: E1202 09:34:34.129987 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfb87bc5-e50a-4796-b230-d44366b0db4c" containerName="extract-content" Dec 02 09:34:34 crc kubenswrapper[4781]: I1202 09:34:34.129994 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfb87bc5-e50a-4796-b230-d44366b0db4c" containerName="extract-content" Dec 02 09:34:34 crc kubenswrapper[4781]: I1202 09:34:34.130101 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfb87bc5-e50a-4796-b230-d44366b0db4c" containerName="registry-server" Dec 02 09:34:34 crc kubenswrapper[4781]: I1202 09:34:34.130828 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vptq6" Dec 02 09:34:34 crc kubenswrapper[4781]: I1202 09:34:34.142559 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vptq6"] Dec 02 09:34:34 crc kubenswrapper[4781]: I1202 09:34:34.222527 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22552058-79f8-4440-b70f-bae86e95c85d-catalog-content\") pod \"redhat-marketplace-vptq6\" (UID: \"22552058-79f8-4440-b70f-bae86e95c85d\") " pod="openshift-marketplace/redhat-marketplace-vptq6" Dec 02 09:34:34 crc kubenswrapper[4781]: I1202 09:34:34.222611 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbvxz\" (UniqueName: \"kubernetes.io/projected/22552058-79f8-4440-b70f-bae86e95c85d-kube-api-access-vbvxz\") pod \"redhat-marketplace-vptq6\" (UID: \"22552058-79f8-4440-b70f-bae86e95c85d\") " pod="openshift-marketplace/redhat-marketplace-vptq6" Dec 02 09:34:34 crc kubenswrapper[4781]: I1202 09:34:34.222646 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22552058-79f8-4440-b70f-bae86e95c85d-utilities\") pod \"redhat-marketplace-vptq6\" (UID: \"22552058-79f8-4440-b70f-bae86e95c85d\") " pod="openshift-marketplace/redhat-marketplace-vptq6" Dec 02 09:34:34 crc kubenswrapper[4781]: I1202 09:34:34.323777 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22552058-79f8-4440-b70f-bae86e95c85d-catalog-content\") pod \"redhat-marketplace-vptq6\" (UID: \"22552058-79f8-4440-b70f-bae86e95c85d\") " pod="openshift-marketplace/redhat-marketplace-vptq6" Dec 02 09:34:34 crc kubenswrapper[4781]: I1202 09:34:34.323823 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbvxz\" (UniqueName: \"kubernetes.io/projected/22552058-79f8-4440-b70f-bae86e95c85d-kube-api-access-vbvxz\") pod \"redhat-marketplace-vptq6\" (UID: \"22552058-79f8-4440-b70f-bae86e95c85d\") " pod="openshift-marketplace/redhat-marketplace-vptq6" Dec 02 09:34:34 crc kubenswrapper[4781]: I1202 09:34:34.323842 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22552058-79f8-4440-b70f-bae86e95c85d-utilities\") pod \"redhat-marketplace-vptq6\" (UID: \"22552058-79f8-4440-b70f-bae86e95c85d\") " pod="openshift-marketplace/redhat-marketplace-vptq6" Dec 02 09:34:34 crc kubenswrapper[4781]: I1202 09:34:34.324226 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22552058-79f8-4440-b70f-bae86e95c85d-utilities\") pod \"redhat-marketplace-vptq6\" (UID: \"22552058-79f8-4440-b70f-bae86e95c85d\") " pod="openshift-marketplace/redhat-marketplace-vptq6" Dec 02 09:34:34 crc kubenswrapper[4781]: I1202 09:34:34.324250 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22552058-79f8-4440-b70f-bae86e95c85d-catalog-content\") pod \"redhat-marketplace-vptq6\" (UID: \"22552058-79f8-4440-b70f-bae86e95c85d\") " pod="openshift-marketplace/redhat-marketplace-vptq6" Dec 02 09:34:34 crc kubenswrapper[4781]: I1202 09:34:34.343716 4781 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-vbvxz\" (UniqueName: \"kubernetes.io/projected/22552058-79f8-4440-b70f-bae86e95c85d-kube-api-access-vbvxz\") pod \"redhat-marketplace-vptq6\" (UID: \"22552058-79f8-4440-b70f-bae86e95c85d\") " pod="openshift-marketplace/redhat-marketplace-vptq6" Dec 02 09:34:34 crc kubenswrapper[4781]: I1202 09:34:34.459903 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vptq6" Dec 02 09:34:34 crc kubenswrapper[4781]: I1202 09:34:34.665991 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vptq6"] Dec 02 09:34:34 crc kubenswrapper[4781]: W1202 09:34:34.671062 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22552058_79f8_4440_b70f_bae86e95c85d.slice/crio-669d8cd3cb6645d8c997cc4adfdaeac889131c322e0419918fa620de7eb9ee2c WatchSource:0}: Error finding container 669d8cd3cb6645d8c997cc4adfdaeac889131c322e0419918fa620de7eb9ee2c: Status 404 returned error can't find the container with id 669d8cd3cb6645d8c997cc4adfdaeac889131c322e0419918fa620de7eb9ee2c Dec 02 09:34:35 crc kubenswrapper[4781]: I1202 09:34:35.573044 4781 generic.go:334] "Generic (PLEG): container finished" podID="22552058-79f8-4440-b70f-bae86e95c85d" containerID="f58bac62fff127518f50115fb952ce26a03e280fa072c2d696555f59997c502a" exitCode=0 Dec 02 09:34:35 crc kubenswrapper[4781]: I1202 09:34:35.573247 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vptq6" event={"ID":"22552058-79f8-4440-b70f-bae86e95c85d","Type":"ContainerDied","Data":"f58bac62fff127518f50115fb952ce26a03e280fa072c2d696555f59997c502a"} Dec 02 09:34:35 crc kubenswrapper[4781]: I1202 09:34:35.573463 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vptq6" event={"ID":"22552058-79f8-4440-b70f-bae86e95c85d","Type":"ContainerStarted","Data":"669d8cd3cb6645d8c997cc4adfdaeac889131c322e0419918fa620de7eb9ee2c"} Dec 02 09:34:37 crc kubenswrapper[4781]: I1202 09:34:37.584670 4781 generic.go:334] "Generic (PLEG): container finished" podID="22552058-79f8-4440-b70f-bae86e95c85d" containerID="688a76765cc9217ae56d7c0876fa83994ffee8e987ed6e22cb4f7afcc76f3b56" exitCode=0 Dec 02 09:34:37 crc kubenswrapper[4781]: I1202 09:34:37.584778 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vptq6" event={"ID":"22552058-79f8-4440-b70f-bae86e95c85d","Type":"ContainerDied","Data":"688a76765cc9217ae56d7c0876fa83994ffee8e987ed6e22cb4f7afcc76f3b56"} Dec 02 09:34:38 crc kubenswrapper[4781]: I1202 09:34:38.593706 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vptq6" event={"ID":"22552058-79f8-4440-b70f-bae86e95c85d","Type":"ContainerStarted","Data":"0c6f7f871c15c5700b330125c9bd7c9b10f1ca1675572b1fcccbe4273e2a7fe2"} Dec 02 09:34:38 crc kubenswrapper[4781]: I1202 09:34:38.614870 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vptq6" podStartSLOduration=1.8612812989999998 podStartE2EDuration="4.614847799s" podCreationTimestamp="2025-12-02 09:34:34 +0000 UTC" firstStartedPulling="2025-12-02 09:34:35.574918022 +0000 UTC m=+838.398791901" lastFinishedPulling="2025-12-02 09:34:38.328484532 +0000 UTC m=+841.152358401" observedRunningTime="2025-12-02 09:34:38.613621816 +0000 UTC m=+841.437495735" 
watchObservedRunningTime="2025-12-02 09:34:38.614847799 +0000 UTC m=+841.438721678" Dec 02 09:34:44 crc kubenswrapper[4781]: I1202 09:34:44.460912 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vptq6" Dec 02 09:34:44 crc kubenswrapper[4781]: I1202 09:34:44.460995 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vptq6" Dec 02 09:34:44 crc kubenswrapper[4781]: I1202 09:34:44.516950 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vptq6" Dec 02 09:34:44 crc kubenswrapper[4781]: I1202 09:34:44.671459 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vptq6" Dec 02 09:34:44 crc kubenswrapper[4781]: I1202 09:34:44.750999 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vptq6"] Dec 02 09:34:46 crc kubenswrapper[4781]: I1202 09:34:46.643440 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vptq6" podUID="22552058-79f8-4440-b70f-bae86e95c85d" containerName="registry-server" containerID="cri-o://0c6f7f871c15c5700b330125c9bd7c9b10f1ca1675572b1fcccbe4273e2a7fe2" gracePeriod=2 Dec 02 09:34:46 crc kubenswrapper[4781]: I1202 09:34:46.994716 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vptq6" Dec 02 09:34:47 crc kubenswrapper[4781]: I1202 09:34:47.084994 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22552058-79f8-4440-b70f-bae86e95c85d-catalog-content\") pod \"22552058-79f8-4440-b70f-bae86e95c85d\" (UID: \"22552058-79f8-4440-b70f-bae86e95c85d\") " Dec 02 09:34:47 crc kubenswrapper[4781]: I1202 09:34:47.085217 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22552058-79f8-4440-b70f-bae86e95c85d-utilities\") pod \"22552058-79f8-4440-b70f-bae86e95c85d\" (UID: \"22552058-79f8-4440-b70f-bae86e95c85d\") " Dec 02 09:34:47 crc kubenswrapper[4781]: I1202 09:34:47.085335 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbvxz\" (UniqueName: \"kubernetes.io/projected/22552058-79f8-4440-b70f-bae86e95c85d-kube-api-access-vbvxz\") pod \"22552058-79f8-4440-b70f-bae86e95c85d\" (UID: \"22552058-79f8-4440-b70f-bae86e95c85d\") " Dec 02 09:34:47 crc kubenswrapper[4781]: I1202 09:34:47.086131 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22552058-79f8-4440-b70f-bae86e95c85d-utilities" (OuterVolumeSpecName: "utilities") pod "22552058-79f8-4440-b70f-bae86e95c85d" (UID: "22552058-79f8-4440-b70f-bae86e95c85d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:34:47 crc kubenswrapper[4781]: I1202 09:34:47.090501 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22552058-79f8-4440-b70f-bae86e95c85d-kube-api-access-vbvxz" (OuterVolumeSpecName: "kube-api-access-vbvxz") pod "22552058-79f8-4440-b70f-bae86e95c85d" (UID: "22552058-79f8-4440-b70f-bae86e95c85d"). InnerVolumeSpecName "kube-api-access-vbvxz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:34:47 crc kubenswrapper[4781]: I1202 09:34:47.109648 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22552058-79f8-4440-b70f-bae86e95c85d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "22552058-79f8-4440-b70f-bae86e95c85d" (UID: "22552058-79f8-4440-b70f-bae86e95c85d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:34:47 crc kubenswrapper[4781]: I1202 09:34:47.186443 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22552058-79f8-4440-b70f-bae86e95c85d-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 09:34:47 crc kubenswrapper[4781]: I1202 09:34:47.186488 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbvxz\" (UniqueName: \"kubernetes.io/projected/22552058-79f8-4440-b70f-bae86e95c85d-kube-api-access-vbvxz\") on node \"crc\" DevicePath \"\"" Dec 02 09:34:47 crc kubenswrapper[4781]: I1202 09:34:47.186503 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22552058-79f8-4440-b70f-bae86e95c85d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 09:34:47 crc kubenswrapper[4781]: I1202 09:34:47.650158 4781 generic.go:334] "Generic (PLEG): container finished" podID="22552058-79f8-4440-b70f-bae86e95c85d" containerID="0c6f7f871c15c5700b330125c9bd7c9b10f1ca1675572b1fcccbe4273e2a7fe2" exitCode=0 Dec 02 09:34:47 crc kubenswrapper[4781]: I1202 09:34:47.650200 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vptq6" event={"ID":"22552058-79f8-4440-b70f-bae86e95c85d","Type":"ContainerDied","Data":"0c6f7f871c15c5700b330125c9bd7c9b10f1ca1675572b1fcccbe4273e2a7fe2"} Dec 02 09:34:47 crc kubenswrapper[4781]: I1202 09:34:47.650222 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vptq6" event={"ID":"22552058-79f8-4440-b70f-bae86e95c85d","Type":"ContainerDied","Data":"669d8cd3cb6645d8c997cc4adfdaeac889131c322e0419918fa620de7eb9ee2c"} Dec 02 09:34:47 crc kubenswrapper[4781]: I1202 09:34:47.650230 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vptq6" Dec 02 09:34:47 crc kubenswrapper[4781]: I1202 09:34:47.650240 4781 scope.go:117] "RemoveContainer" containerID="0c6f7f871c15c5700b330125c9bd7c9b10f1ca1675572b1fcccbe4273e2a7fe2" Dec 02 09:34:47 crc kubenswrapper[4781]: I1202 09:34:47.672509 4781 scope.go:117] "RemoveContainer" containerID="688a76765cc9217ae56d7c0876fa83994ffee8e987ed6e22cb4f7afcc76f3b56" Dec 02 09:34:47 crc kubenswrapper[4781]: I1202 09:34:47.680917 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vptq6"] Dec 02 09:34:47 crc kubenswrapper[4781]: I1202 09:34:47.685379 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vptq6"] Dec 02 09:34:47 crc kubenswrapper[4781]: I1202 09:34:47.690464 4781 scope.go:117] "RemoveContainer" containerID="f58bac62fff127518f50115fb952ce26a03e280fa072c2d696555f59997c502a" Dec 02 09:34:47 crc kubenswrapper[4781]: I1202 09:34:47.711712 4781 scope.go:117] "RemoveContainer" containerID="0c6f7f871c15c5700b330125c9bd7c9b10f1ca1675572b1fcccbe4273e2a7fe2" Dec 02 09:34:47 crc kubenswrapper[4781]: E1202 09:34:47.712306 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c6f7f871c15c5700b330125c9bd7c9b10f1ca1675572b1fcccbe4273e2a7fe2\": container with ID starting with 0c6f7f871c15c5700b330125c9bd7c9b10f1ca1675572b1fcccbe4273e2a7fe2 not found: ID does not exist" containerID="0c6f7f871c15c5700b330125c9bd7c9b10f1ca1675572b1fcccbe4273e2a7fe2" Dec 02 09:34:47 crc kubenswrapper[4781]: I1202 09:34:47.712346 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c6f7f871c15c5700b330125c9bd7c9b10f1ca1675572b1fcccbe4273e2a7fe2"} err="failed to get container status \"0c6f7f871c15c5700b330125c9bd7c9b10f1ca1675572b1fcccbe4273e2a7fe2\": rpc error: code = NotFound desc = could not find container \"0c6f7f871c15c5700b330125c9bd7c9b10f1ca1675572b1fcccbe4273e2a7fe2\": container with ID starting with 0c6f7f871c15c5700b330125c9bd7c9b10f1ca1675572b1fcccbe4273e2a7fe2 not found: ID does not exist" Dec 02 09:34:47 crc kubenswrapper[4781]: I1202 09:34:47.712373 4781 scope.go:117] "RemoveContainer" containerID="688a76765cc9217ae56d7c0876fa83994ffee8e987ed6e22cb4f7afcc76f3b56" Dec 02 09:34:47 crc kubenswrapper[4781]: E1202 09:34:47.712700 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"688a76765cc9217ae56d7c0876fa83994ffee8e987ed6e22cb4f7afcc76f3b56\": container with ID starting with 688a76765cc9217ae56d7c0876fa83994ffee8e987ed6e22cb4f7afcc76f3b56 not found: ID does not exist" containerID="688a76765cc9217ae56d7c0876fa83994ffee8e987ed6e22cb4f7afcc76f3b56" Dec 02 09:34:47 crc kubenswrapper[4781]: I1202 09:34:47.712730 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"688a76765cc9217ae56d7c0876fa83994ffee8e987ed6e22cb4f7afcc76f3b56"} err="failed to get container status \"688a76765cc9217ae56d7c0876fa83994ffee8e987ed6e22cb4f7afcc76f3b56\": rpc error: code = NotFound desc = could not find container \"688a76765cc9217ae56d7c0876fa83994ffee8e987ed6e22cb4f7afcc76f3b56\": container with ID starting with 688a76765cc9217ae56d7c0876fa83994ffee8e987ed6e22cb4f7afcc76f3b56 not found: ID does not exist" Dec 02 09:34:47 crc kubenswrapper[4781]: I1202 09:34:47.712750 4781 scope.go:117] "RemoveContainer" 
containerID="f58bac62fff127518f50115fb952ce26a03e280fa072c2d696555f59997c502a" Dec 02 09:34:47 crc kubenswrapper[4781]: E1202 09:34:47.713158 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f58bac62fff127518f50115fb952ce26a03e280fa072c2d696555f59997c502a\": container with ID starting with f58bac62fff127518f50115fb952ce26a03e280fa072c2d696555f59997c502a not found: ID does not exist" containerID="f58bac62fff127518f50115fb952ce26a03e280fa072c2d696555f59997c502a" Dec 02 09:34:47 crc kubenswrapper[4781]: I1202 09:34:47.713269 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f58bac62fff127518f50115fb952ce26a03e280fa072c2d696555f59997c502a"} err="failed to get container status \"f58bac62fff127518f50115fb952ce26a03e280fa072c2d696555f59997c502a\": rpc error: code = NotFound desc = could not find container \"f58bac62fff127518f50115fb952ce26a03e280fa072c2d696555f59997c502a\": container with ID starting with f58bac62fff127518f50115fb952ce26a03e280fa072c2d696555f59997c502a not found: ID does not exist" Dec 02 09:34:49 crc kubenswrapper[4781]: I1202 09:34:49.506554 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22552058-79f8-4440-b70f-bae86e95c85d" path="/var/lib/kubelet/pods/22552058-79f8-4440-b70f-bae86e95c85d/volumes" Dec 02 09:35:01 crc kubenswrapper[4781]: I1202 09:35:01.472108 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-5jxwz"] Dec 02 09:35:01 crc kubenswrapper[4781]: E1202 09:35:01.473151 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22552058-79f8-4440-b70f-bae86e95c85d" containerName="registry-server" Dec 02 09:35:01 crc kubenswrapper[4781]: I1202 09:35:01.473172 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="22552058-79f8-4440-b70f-bae86e95c85d" containerName="registry-server" Dec 02 09:35:01 crc kubenswrapper[4781]: E1202 09:35:01.473183 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22552058-79f8-4440-b70f-bae86e95c85d" containerName="extract-content" Dec 02 09:35:01 crc kubenswrapper[4781]: I1202 09:35:01.473191 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="22552058-79f8-4440-b70f-bae86e95c85d" containerName="extract-content" Dec 02 09:35:01 crc kubenswrapper[4781]: E1202 09:35:01.473212 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22552058-79f8-4440-b70f-bae86e95c85d" containerName="extract-utilities" Dec 02 09:35:01 crc kubenswrapper[4781]: I1202 09:35:01.473221 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="22552058-79f8-4440-b70f-bae86e95c85d" containerName="extract-utilities" Dec 02 09:35:01 crc kubenswrapper[4781]: I1202 09:35:01.473390 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="22552058-79f8-4440-b70f-bae86e95c85d" containerName="registry-server" Dec 02 09:35:01 crc kubenswrapper[4781]: I1202 09:35:01.475670 4781 util.go:30] "No sandbox for pod can be found. 
Dec 02 09:35:01 crc kubenswrapper[4781]: I1202 09:35:01.479527 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Dec 02 09:35:01 crc kubenswrapper[4781]: I1202 09:35:01.479998 4781 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-z7t97"
Dec 02 09:35:01 crc kubenswrapper[4781]: I1202 09:35:01.480330 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Dec 02 09:35:01 crc kubenswrapper[4781]: I1202 09:35:01.482559 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-5jxwz"]
Dec 02 09:35:01 crc kubenswrapper[4781]: I1202 09:35:01.492266 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-nhlj8"]
Dec 02 09:35:01 crc kubenswrapper[4781]: I1202 09:35:01.493608 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-nhlj8"
Dec 02 09:35:01 crc kubenswrapper[4781]: I1202 09:35:01.495455 4781 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-hcf8c"
Dec 02 09:35:01 crc kubenswrapper[4781]: I1202 09:35:01.496995 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-zl2kl"]
Dec 02 09:35:01 crc kubenswrapper[4781]: I1202 09:35:01.497764 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-zl2kl"
Dec 02 09:35:01 crc kubenswrapper[4781]: I1202 09:35:01.502207 4781 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-f9x6k"
Dec 02 09:35:01 crc kubenswrapper[4781]: I1202 09:35:01.511837 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-zl2kl"]
Dec 02 09:35:01 crc kubenswrapper[4781]: I1202 09:35:01.516106 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-nhlj8"]
Dec 02 09:35:01 crc kubenswrapper[4781]: I1202 09:35:01.594442 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdm6s\" (UniqueName: \"kubernetes.io/projected/bbeb0421-437b-4e00-a072-ebebf3354bea-kube-api-access-cdm6s\") pod \"cert-manager-cainjector-7f985d654d-5jxwz\" (UID: \"bbeb0421-437b-4e00-a072-ebebf3354bea\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-5jxwz"
Dec 02 09:35:01 crc kubenswrapper[4781]: I1202 09:35:01.594486 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tk42\" (UniqueName: \"kubernetes.io/projected/e57350a9-3e11-4df0-a108-a9d8d446c219-kube-api-access-2tk42\") pod \"cert-manager-5b446d88c5-nhlj8\" (UID: \"e57350a9-3e11-4df0-a108-a9d8d446c219\") " pod="cert-manager/cert-manager-5b446d88c5-nhlj8"
Dec 02 09:35:01 crc kubenswrapper[4781]: I1202 09:35:01.594528 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mdzc\" (UniqueName: \"kubernetes.io/projected/753e9900-fde9-4486-9bc3-fce98f302367-kube-api-access-6mdzc\") pod \"cert-manager-webhook-5655c58dd6-zl2kl\" (UID: \"753e9900-fde9-4486-9bc3-fce98f302367\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-zl2kl"
Dec 02 09:35:01 crc kubenswrapper[4781]: I1202 09:35:01.695851 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdm6s\" (UniqueName: \"kubernetes.io/projected/bbeb0421-437b-4e00-a072-ebebf3354bea-kube-api-access-cdm6s\") pod \"cert-manager-cainjector-7f985d654d-5jxwz\" (UID: \"bbeb0421-437b-4e00-a072-ebebf3354bea\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-5jxwz"
Dec 02 09:35:01 crc kubenswrapper[4781]: I1202 09:35:01.695884 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tk42\" (UniqueName: \"kubernetes.io/projected/e57350a9-3e11-4df0-a108-a9d8d446c219-kube-api-access-2tk42\") pod \"cert-manager-5b446d88c5-nhlj8\" (UID: \"e57350a9-3e11-4df0-a108-a9d8d446c219\") " pod="cert-manager/cert-manager-5b446d88c5-nhlj8"
Dec 02 09:35:01 crc kubenswrapper[4781]: I1202 09:35:01.695934 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mdzc\" (UniqueName: \"kubernetes.io/projected/753e9900-fde9-4486-9bc3-fce98f302367-kube-api-access-6mdzc\") pod \"cert-manager-webhook-5655c58dd6-zl2kl\" (UID: \"753e9900-fde9-4486-9bc3-fce98f302367\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-zl2kl"
Dec 02 09:35:01 crc kubenswrapper[4781]: I1202 09:35:01.714341 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mdzc\" (UniqueName: \"kubernetes.io/projected/753e9900-fde9-4486-9bc3-fce98f302367-kube-api-access-6mdzc\") pod \"cert-manager-webhook-5655c58dd6-zl2kl\" (UID: \"753e9900-fde9-4486-9bc3-fce98f302367\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-zl2kl"
Dec 02 09:35:01 crc kubenswrapper[4781]: I1202 09:35:01.714842 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdm6s\" (UniqueName: \"kubernetes.io/projected/bbeb0421-437b-4e00-a072-ebebf3354bea-kube-api-access-cdm6s\") pod \"cert-manager-cainjector-7f985d654d-5jxwz\" (UID: \"bbeb0421-437b-4e00-a072-ebebf3354bea\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-5jxwz"
Dec 02 09:35:01 crc kubenswrapper[4781]: I1202 09:35:01.722642 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tk42\" (UniqueName: \"kubernetes.io/projected/e57350a9-3e11-4df0-a108-a9d8d446c219-kube-api-access-2tk42\") pod \"cert-manager-5b446d88c5-nhlj8\" (UID: \"e57350a9-3e11-4df0-a108-a9d8d446c219\") " pod="cert-manager/cert-manager-5b446d88c5-nhlj8"
Dec 02 09:35:01 crc kubenswrapper[4781]: I1202 09:35:01.838274 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-5jxwz"
Dec 02 09:35:01 crc kubenswrapper[4781]: I1202 09:35:01.847313 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-nhlj8"
Dec 02 09:35:01 crc kubenswrapper[4781]: I1202 09:35:01.854316 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-zl2kl"
Dec 02 09:35:02 crc kubenswrapper[4781]: I1202 09:35:02.057151 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-5jxwz"]
Dec 02 09:35:02 crc kubenswrapper[4781]: I1202 09:35:02.319549 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-nhlj8"]
Dec 02 09:35:02 crc kubenswrapper[4781]: W1202 09:35:02.327883 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode57350a9_3e11_4df0_a108_a9d8d446c219.slice/crio-3ab12b1add02dcf48d7dcf2e545f5abe332ea31947ec3a5ca8c82bb38d40fddf WatchSource:0}: Error finding container 3ab12b1add02dcf48d7dcf2e545f5abe332ea31947ec3a5ca8c82bb38d40fddf: Status 404 returned error can't find the container with id 3ab12b1add02dcf48d7dcf2e545f5abe332ea31947ec3a5ca8c82bb38d40fddf
Dec 02 09:35:02 crc kubenswrapper[4781]: I1202 09:35:02.328432 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-zl2kl"]
Dec 02 09:35:02 crc kubenswrapper[4781]: I1202 09:35:02.751173 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-5jxwz" event={"ID":"bbeb0421-437b-4e00-a072-ebebf3354bea","Type":"ContainerStarted","Data":"5d065adcf8641223859a355e44dd34e395277b059046ae4de3583b178a104cf1"}
Dec 02 09:35:02 crc kubenswrapper[4781]: I1202 09:35:02.752483 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-zl2kl" event={"ID":"753e9900-fde9-4486-9bc3-fce98f302367","Type":"ContainerStarted","Data":"46129b4e5b17d233ee6d3965265e70636fa989873440a4b355f3fd839e0a2b52"}
Dec 02 09:35:02 crc kubenswrapper[4781]: I1202 09:35:02.753584 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-nhlj8" event={"ID":"e57350a9-3e11-4df0-a108-a9d8d446c219","Type":"ContainerStarted","Data":"3ab12b1add02dcf48d7dcf2e545f5abe332ea31947ec3a5ca8c82bb38d40fddf"}
Dec 02 09:35:05 crc kubenswrapper[4781]: I1202 09:35:05.788179 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-5jxwz" event={"ID":"bbeb0421-437b-4e00-a072-ebebf3354bea","Type":"ContainerStarted","Data":"2fd7ec1e1fedebffe57d3f4fff787cd0435d3a98f12ec6e6adeb16295238e57f"}
Dec 02 09:35:05 crc kubenswrapper[4781]: I1202 09:35:05.793342 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-nhlj8" event={"ID":"e57350a9-3e11-4df0-a108-a9d8d446c219","Type":"ContainerStarted","Data":"09299e836f846b33392cd5fbc679814aa15ecac0e0ad1ce26be34dfec50bef5e"}
Dec 02 09:35:05 crc kubenswrapper[4781]: I1202 09:35:05.801972 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-5jxwz" podStartSLOduration=1.8726165030000002 podStartE2EDuration="4.801955547s" podCreationTimestamp="2025-12-02 09:35:01 +0000 UTC" firstStartedPulling="2025-12-02 09:35:02.067934806 +0000 UTC m=+864.891808685" lastFinishedPulling="2025-12-02 09:35:04.99727385 +0000 UTC m=+867.821147729" observedRunningTime="2025-12-02 09:35:05.800025234 +0000 UTC m=+868.623899113" watchObservedRunningTime="2025-12-02 09:35:05.801955547 +0000 UTC m=+868.625829426"
Dec 02 09:35:05 crc kubenswrapper[4781]: I1202 09:35:05.822571 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-nhlj8" podStartSLOduration=2.152876897 podStartE2EDuration="4.822553264s" podCreationTimestamp="2025-12-02 09:35:01 +0000 UTC" firstStartedPulling="2025-12-02 09:35:02.329776582 +0000 UTC m=+865.153650461" lastFinishedPulling="2025-12-02 09:35:04.999452949 +0000 UTC m=+867.823326828" observedRunningTime="2025-12-02 09:35:05.818557016 +0000 UTC m=+868.642430895" watchObservedRunningTime="2025-12-02 09:35:05.822553264 +0000 UTC m=+868.646427143"
Dec 02 09:35:06 crc kubenswrapper[4781]: I1202 09:35:06.799329 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-zl2kl" event={"ID":"753e9900-fde9-4486-9bc3-fce98f302367","Type":"ContainerStarted","Data":"7b9062742e20c8dcda1c3d9cf1c3ce5653e791358288ffd05944ad393cc83361"}
Dec 02 09:35:06 crc kubenswrapper[4781]: I1202 09:35:06.820902 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-zl2kl" podStartSLOduration=2.144724336 podStartE2EDuration="5.820882141s" podCreationTimestamp="2025-12-02 09:35:01 +0000 UTC" firstStartedPulling="2025-12-02 09:35:02.327940482 +0000 UTC m=+865.151814361" lastFinishedPulling="2025-12-02 09:35:06.004098287 +0000 UTC m=+868.827972166" observedRunningTime="2025-12-02 09:35:06.816648676 +0000 UTC m=+869.640522555" watchObservedRunningTime="2025-12-02 09:35:06.820882141 +0000 UTC m=+869.644756020"
Dec 02 09:35:06 crc kubenswrapper[4781]: I1202 09:35:06.855918 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-zl2kl"
Dec 02 09:35:11 crc kubenswrapper[4781]: I1202 09:35:11.857636 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-zl2kl"
Dec 02 09:35:19 crc kubenswrapper[4781]: I1202 09:35:19.838550 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-x5x7g"]
Dec 02 09:35:19 crc kubenswrapper[4781]: I1202 09:35:19.839378 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerName="ovn-controller" containerID="cri-o://458448bab8d0e21295e859a857d3a3273cc4f430f2fc0fdfb3e10b9a9ab813ec" gracePeriod=30
Dec 02 09:35:19 crc kubenswrapper[4781]: I1202 09:35:19.839725 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerName="kube-rbac-proxy-node" containerID="cri-o://506fd3b326fdaebe64874c208bb15c9cbf5aff36602f221f27f87fe0214ce6f7" gracePeriod=30
Dec 02 09:35:19 crc kubenswrapper[4781]: I1202 09:35:19.839761 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerName="ovn-acl-logging" containerID="cri-o://865e78637e3692046c9c8a24f5238eb9636bc7e388cea6e0fec3c5c68d21cf91" gracePeriod=30
Dec 02 09:35:19 crc kubenswrapper[4781]: I1202 09:35:19.839892 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://7c8c9c60e0354d49255e6b14da6808152f6528f0a51ce992709a7014b9f64d61" gracePeriod=30
Dec 02 09:35:19 crc kubenswrapper[4781]: I1202 09:35:19.839988 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerName="nbdb" containerID="cri-o://e5c85414a97917ad2db6270144c476b16b50ad3256bf6c694c228b8756c379dd" gracePeriod=30
Dec 02 09:35:19 crc kubenswrapper[4781]: I1202 09:35:19.840030 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerName="sbdb" containerID="cri-o://17cbd533b4320d178a3ee1bb3d666d5a1ad0a1d7e9f38e58735acac86c590726" gracePeriod=30
Dec 02 09:35:19 crc kubenswrapper[4781]: I1202 09:35:19.840116 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerName="northd" containerID="cri-o://a6086fa7a75ccf19fcc7b89e83a2dbe12ed15c729bb8a4209c18acd54c2070a1" gracePeriod=30
Dec 02 09:35:19 crc kubenswrapper[4781]: I1202 09:35:19.891151 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerName="ovnkube-controller" containerID="cri-o://6c718dfb2fa2c3cfc45793e7e2018f908f93424f65220ca8e33d960b1df998f0" gracePeriod=30
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.188776 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5x7g_20ba2af9-1f67-4b6d-884a-666ef4f55bf3/ovnkube-controller/3.log"
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.190815 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5x7g_20ba2af9-1f67-4b6d-884a-666ef4f55bf3/ovn-acl-logging/0.log"
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.191305 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5x7g_20ba2af9-1f67-4b6d-884a-666ef4f55bf3/ovn-controller/0.log"
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.191745 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g"
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.248101 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hmqqc"]
Dec 02 09:35:20 crc kubenswrapper[4781]: E1202 09:35:20.248284 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerName="ovnkube-controller"
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.248295 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerName="ovnkube-controller"
Dec 02 09:35:20 crc kubenswrapper[4781]: E1202 09:35:20.248302 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerName="ovn-controller"
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.248308 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerName="ovn-controller"
Dec 02 09:35:20 crc kubenswrapper[4781]: E1202 09:35:20.248321 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerName="ovn-acl-logging"
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.248328 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerName="ovn-acl-logging"
Dec 02 09:35:20 crc kubenswrapper[4781]: E1202 09:35:20.248337 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerName="ovnkube-controller"
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.248343 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerName="ovnkube-controller"
Dec 02 09:35:20 crc kubenswrapper[4781]: E1202 09:35:20.248350 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerName="nbdb"
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.248355 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerName="nbdb"
Dec 02 09:35:20 crc kubenswrapper[4781]: E1202 09:35:20.248361 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerName="ovnkube-controller"
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.248367 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerName="ovnkube-controller"
Dec 02 09:35:20 crc kubenswrapper[4781]: E1202 09:35:20.248375 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerName="kube-rbac-proxy-node"
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.248380 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerName="kube-rbac-proxy-node"
Dec 02 09:35:20 crc kubenswrapper[4781]: E1202 09:35:20.248389 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerName="kubecfg-setup"
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.248394 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerName="kubecfg-setup"
Dec 02 09:35:20 crc kubenswrapper[4781]: E1202 09:35:20.248402 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerName="northd"
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.248407 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerName="northd"
Dec 02 09:35:20 crc kubenswrapper[4781]: E1202 09:35:20.248415 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerName="kube-rbac-proxy-ovn-metrics"
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.248420 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerName="kube-rbac-proxy-ovn-metrics"
Dec 02 09:35:20 crc kubenswrapper[4781]: E1202 09:35:20.248429 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerName="sbdb"
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.248435 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerName="sbdb"
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.248519 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerName="ovnkube-controller"
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.248527 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerName="ovnkube-controller"
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.248536 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerName="sbdb"
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.248545 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerName="ovnkube-controller"
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.248552 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerName="ovn-controller"
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.248562 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerName="ovnkube-controller"
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.248568 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerName="northd"
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.248577 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerName="nbdb"
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.248586 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerName="kube-rbac-proxy-node"
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.248594 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerName="kube-rbac-proxy-ovn-metrics"
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.248602 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerName="ovn-acl-logging"
Dec 02 09:35:20 crc kubenswrapper[4781]: E1202 09:35:20.248688 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerName="ovnkube-controller"
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.248695 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerName="ovnkube-controller"
Dec 02 09:35:20 crc kubenswrapper[4781]: E1202 09:35:20.248707 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerName="ovnkube-controller"
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.248714 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerName="ovnkube-controller"
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.248792 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerName="ovnkube-controller"
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.250628 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc"
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.316878 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-var-lib-openvswitch\") pod \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") "
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.317192 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-etc-openvswitch\") pod \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") "
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.317021 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "20ba2af9-1f67-4b6d-884a-666ef4f55bf3" (UID: "20ba2af9-1f67-4b6d-884a-666ef4f55bf3"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.317217 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-run-systemd\") pod \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") "
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.317246 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "20ba2af9-1f67-4b6d-884a-666ef4f55bf3" (UID: "20ba2af9-1f67-4b6d-884a-666ef4f55bf3"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
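The RemoveStaleState burst above is admission-time housekeeping: before admitting ovnkube-node-hmqqc, the CPU and memory managers drop per-container state left behind by the deleted ovnkube-node-x5x7g pod (one entry per recorded container instance, hence the repeated ovnkube-controller lines), and the E-level severity is just how these messages are logged. Roughly, under that reading (sketch, not kubelet source):

package main

import "fmt"

type key struct{ podUID, containerName string }

// removeStaleState drops resource-manager assignments whose pod is no longer
// active, mirroring the cpu_manager/state_mem entries above.
func removeStaleState(assignments map[key]string, active map[string]bool) {
	for k := range assignments {
		if !active[k.podUID] {
			fmt.Printf("RemoveStaleState: removing container %q of pod %s\n", k.containerName, k.podUID)
			delete(assignments, k) // "Deleted CPUSet assignment"
		}
	}
}

func main() {
	gone := "20ba2af9-1f67-4b6d-884a-666ef4f55bf3" // UID of the deleted pod
	assignments := map[key]string{
		{gone, "northd"}: "0-3", // illustrative CPU sets, not real data
		{gone, "sbdb"}:   "0-3",
	}
	removeStaleState(assignments, map[string]bool{}) // no active pods: both go
}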
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.317348 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-run-openvswitch\") pod \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.317446 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "20ba2af9-1f67-4b6d-884a-666ef4f55bf3" (UID: "20ba2af9-1f67-4b6d-884a-666ef4f55bf3"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.317445 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-host-cni-netd\") pod \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.317476 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-host-kubelet\") pod \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.317482 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "20ba2af9-1f67-4b6d-884a-666ef4f55bf3" (UID: "20ba2af9-1f67-4b6d-884a-666ef4f55bf3"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.317514 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.317533 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-node-log\") pod \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.317572 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "20ba2af9-1f67-4b6d-884a-666ef4f55bf3" (UID: "20ba2af9-1f67-4b6d-884a-666ef4f55bf3"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.317593 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "20ba2af9-1f67-4b6d-884a-666ef4f55bf3" (UID: "20ba2af9-1f67-4b6d-884a-666ef4f55bf3"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.317632 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-ovn-node-metrics-cert\") pod \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.317673 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-node-log" (OuterVolumeSpecName: "node-log") pod "20ba2af9-1f67-4b6d-884a-666ef4f55bf3" (UID: "20ba2af9-1f67-4b6d-884a-666ef4f55bf3"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.317694 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-ovnkube-script-lib\") pod \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.317714 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxmlf\" (UniqueName: \"kubernetes.io/projected/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-kube-api-access-mxmlf\") pod \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.318091 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-host-run-netns\") pod \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.318121 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-host-run-ovn-kubernetes\") pod \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.318142 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-run-ovn\") pod \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.318160 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-systemd-units\") pod \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " Dec 02 09:35:20 crc kubenswrapper[4781]: 
I1202 09:35:20.318179 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-log-socket\") pod \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.318204 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-env-overrides\") pod \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.318223 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-ovnkube-config\") pod \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.318242 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-host-slash\") pod \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.318269 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-host-cni-bin\") pod \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\" (UID: \"20ba2af9-1f67-4b6d-884a-666ef4f55bf3\") " Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.318062 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "20ba2af9-1f67-4b6d-884a-666ef4f55bf3" (UID: "20ba2af9-1f67-4b6d-884a-666ef4f55bf3"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.318280 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "20ba2af9-1f67-4b6d-884a-666ef4f55bf3" (UID: "20ba2af9-1f67-4b6d-884a-666ef4f55bf3"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.318216 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "20ba2af9-1f67-4b6d-884a-666ef4f55bf3" (UID: "20ba2af9-1f67-4b6d-884a-666ef4f55bf3"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.318247 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "20ba2af9-1f67-4b6d-884a-666ef4f55bf3" (UID: "20ba2af9-1f67-4b6d-884a-666ef4f55bf3"). InnerVolumeSpecName "systemd-units". 
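The interleaved UnmountVolume/VerifyControllerAttachedVolume entries around here come from one reconcile pass: volumes still mounted for the deleted pod (UID 20ba2af9-...) are torn down while the same-named volumes of the replacement pod (UID 4de58e6c-...) are set up. A toy version of that diff loop, with illustrative names rather than real volume IDs (sketch, not kubelet source):

package main

import "fmt"

// reconcile unmounts what is mounted but no longer desired and mounts what is
// desired but not yet mounted, mirroring the reconciler_common entries above.
func reconcile(desired, actual map[string]bool) {
	for v := range actual {
		if !desired[v] {
			fmt.Println("UnmountVolume started for volume", v)
		}
	}
	for v := range desired {
		if !actual[v] {
			fmt.Println("MountVolume started for volume", v)
		}
	}
}

func main() {
	actual := map[string]bool{"run-ovn (ovnkube-node-x5x7g)": true}   // old pod's mount
	desired := map[string]bool{"run-ovn (ovnkube-node-hmqqc)": true} // new pod's volume
	reconcile(desired, actual)
}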
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.318342 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-log-socket" (OuterVolumeSpecName: "log-socket") pod "20ba2af9-1f67-4b6d-884a-666ef4f55bf3" (UID: "20ba2af9-1f67-4b6d-884a-666ef4f55bf3"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.318369 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "20ba2af9-1f67-4b6d-884a-666ef4f55bf3" (UID: "20ba2af9-1f67-4b6d-884a-666ef4f55bf3"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.318382 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-host-slash" (OuterVolumeSpecName: "host-slash") pod "20ba2af9-1f67-4b6d-884a-666ef4f55bf3" (UID: "20ba2af9-1f67-4b6d-884a-666ef4f55bf3"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.318426 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "20ba2af9-1f67-4b6d-884a-666ef4f55bf3" (UID: "20ba2af9-1f67-4b6d-884a-666ef4f55bf3"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.318408 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-systemd-units\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.318569 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.318610 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-var-lib-openvswitch\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.318628 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxh8b\" (UniqueName: \"kubernetes.io/projected/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-kube-api-access-qxh8b\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.318655 
4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-host-cni-bin\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.318685 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-host-run-ovn-kubernetes\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.318718 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-env-overrides\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.318746 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-run-ovn\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.318782 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-run-openvswitch\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.318808 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-log-socket\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.318827 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-node-log\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.318844 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-ovn-node-metrics-cert\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.318798 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "20ba2af9-1f67-4b6d-884a-666ef4f55bf3" (UID: "20ba2af9-1f67-4b6d-884a-666ef4f55bf3"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.318819 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "20ba2af9-1f67-4b6d-884a-666ef4f55bf3" (UID: "20ba2af9-1f67-4b6d-884a-666ef4f55bf3"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.318985 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-run-systemd\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.319042 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-etc-openvswitch\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.319078 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-host-slash\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.319142 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-host-run-netns\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.319166 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-host-cni-netd\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.319210 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-host-kubelet\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.319236 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-ovnkube-script-lib\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.319261 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-ovnkube-config\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.319319 4781 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.319334 4781 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.319349 4781 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.319362 4781 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.319373 4781 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.319388 4781 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.319401 4781 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-node-log\") on node \"crc\" DevicePath \"\"" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.319415 4781 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.319427 4781 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.319437 4781 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.319450 4781 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.319460 4781 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.319470 4781 
reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-log-socket\") on node \"crc\" DevicePath \"\"" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.319480 4781 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.319491 4781 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.319501 4781 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-host-slash\") on node \"crc\" DevicePath \"\"" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.319511 4781 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.322300 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "20ba2af9-1f67-4b6d-884a-666ef4f55bf3" (UID: "20ba2af9-1f67-4b6d-884a-666ef4f55bf3"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.322370 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-kube-api-access-mxmlf" (OuterVolumeSpecName: "kube-api-access-mxmlf") pod "20ba2af9-1f67-4b6d-884a-666ef4f55bf3" (UID: "20ba2af9-1f67-4b6d-884a-666ef4f55bf3"). InnerVolumeSpecName "kube-api-access-mxmlf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.329550 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "20ba2af9-1f67-4b6d-884a-666ef4f55bf3" (UID: "20ba2af9-1f67-4b6d-884a-666ef4f55bf3"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.420156 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-run-openvswitch\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.420221 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-log-socket\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.420246 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-node-log\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.420271 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-ovn-node-metrics-cert\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.420298 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-run-systemd\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.420320 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-etc-openvswitch\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.420340 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-host-slash\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.420374 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-host-run-netns\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.420365 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-node-log\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.420428 4781 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-host-cni-netd\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.420396 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-host-cni-netd\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.420466 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-run-systemd\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.420294 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-run-openvswitch\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.420499 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-host-kubelet\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.420528 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-host-slash\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.420534 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-ovnkube-script-lib\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.420555 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-host-run-netns\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.420504 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-etc-openvswitch\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.420567 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-ovnkube-config\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.420590 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-host-kubelet\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.420600 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-systemd-units\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.420645 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.420690 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-var-lib-openvswitch\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.420725 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxh8b\" (UniqueName: \"kubernetes.io/projected/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-kube-api-access-qxh8b\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.420759 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-host-cni-bin\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.420796 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-host-run-ovn-kubernetes\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.420837 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-env-overrides\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.420873 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-run-ovn\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.420967 4781 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.420990 4781 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.421009 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxmlf\" (UniqueName: \"kubernetes.io/projected/20ba2af9-1f67-4b6d-884a-666ef4f55bf3-kube-api-access-mxmlf\") on node \"crc\" DevicePath \"\"" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.421056 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-run-ovn\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.421103 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.421146 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-var-lib-openvswitch\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.421566 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-host-cni-bin\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.421615 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-host-run-ovn-kubernetes\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.420794 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-systemd-units\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.421680 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-ovnkube-script-lib\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.421680 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-ovnkube-config\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.422341 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-env-overrides\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.422454 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-log-socket\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.425488 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-ovn-node-metrics-cert\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.441444 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxh8b\" (UniqueName: \"kubernetes.io/projected/4de58e6c-4cd7-4987-bdca-aa99fb83bf43-kube-api-access-qxh8b\") pod \"ovnkube-node-hmqqc\" (UID: \"4de58e6c-4cd7-4987-bdca-aa99fb83bf43\") " pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.568297 4781 util.go:30] "No sandbox for pod can be found. 
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.882675 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5x7g_20ba2af9-1f67-4b6d-884a-666ef4f55bf3/ovnkube-controller/3.log"
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.885018 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5x7g_20ba2af9-1f67-4b6d-884a-666ef4f55bf3/ovn-acl-logging/0.log"
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.885814 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5x7g_20ba2af9-1f67-4b6d-884a-666ef4f55bf3/ovn-controller/0.log"
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.886200 4781 generic.go:334] "Generic (PLEG): container finished" podID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerID="6c718dfb2fa2c3cfc45793e7e2018f908f93424f65220ca8e33d960b1df998f0" exitCode=0
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.886220 4781 generic.go:334] "Generic (PLEG): container finished" podID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerID="17cbd533b4320d178a3ee1bb3d666d5a1ad0a1d7e9f38e58735acac86c590726" exitCode=0
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.886227 4781 generic.go:334] "Generic (PLEG): container finished" podID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerID="e5c85414a97917ad2db6270144c476b16b50ad3256bf6c694c228b8756c379dd" exitCode=0
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.886233 4781 generic.go:334] "Generic (PLEG): container finished" podID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerID="a6086fa7a75ccf19fcc7b89e83a2dbe12ed15c729bb8a4209c18acd54c2070a1" exitCode=0
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.886239 4781 generic.go:334] "Generic (PLEG): container finished" podID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerID="7c8c9c60e0354d49255e6b14da6808152f6528f0a51ce992709a7014b9f64d61" exitCode=0
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.886246 4781 generic.go:334] "Generic (PLEG): container finished" podID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerID="506fd3b326fdaebe64874c208bb15c9cbf5aff36602f221f27f87fe0214ce6f7" exitCode=0
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.886253 4781 generic.go:334] "Generic (PLEG): container finished" podID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerID="865e78637e3692046c9c8a24f5238eb9636bc7e388cea6e0fec3c5c68d21cf91" exitCode=143
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.886259 4781 generic.go:334] "Generic (PLEG): container finished" podID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" containerID="458448bab8d0e21295e859a857d3a3273cc4f430f2fc0fdfb3e10b9a9ab813ec" exitCode=143
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.886298 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" event={"ID":"20ba2af9-1f67-4b6d-884a-666ef4f55bf3","Type":"ContainerDied","Data":"6c718dfb2fa2c3cfc45793e7e2018f908f93424f65220ca8e33d960b1df998f0"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.886323 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" event={"ID":"20ba2af9-1f67-4b6d-884a-666ef4f55bf3","Type":"ContainerDied","Data":"17cbd533b4320d178a3ee1bb3d666d5a1ad0a1d7e9f38e58735acac86c590726"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.886324 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g"
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.886346 4781 scope.go:117] "RemoveContainer" containerID="6c718dfb2fa2c3cfc45793e7e2018f908f93424f65220ca8e33d960b1df998f0"
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.886334 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" event={"ID":"20ba2af9-1f67-4b6d-884a-666ef4f55bf3","Type":"ContainerDied","Data":"e5c85414a97917ad2db6270144c476b16b50ad3256bf6c694c228b8756c379dd"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.886485 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" event={"ID":"20ba2af9-1f67-4b6d-884a-666ef4f55bf3","Type":"ContainerDied","Data":"a6086fa7a75ccf19fcc7b89e83a2dbe12ed15c729bb8a4209c18acd54c2070a1"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.886530 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" event={"ID":"20ba2af9-1f67-4b6d-884a-666ef4f55bf3","Type":"ContainerDied","Data":"7c8c9c60e0354d49255e6b14da6808152f6528f0a51ce992709a7014b9f64d61"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.886559 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" event={"ID":"20ba2af9-1f67-4b6d-884a-666ef4f55bf3","Type":"ContainerDied","Data":"506fd3b326fdaebe64874c208bb15c9cbf5aff36602f221f27f87fe0214ce6f7"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.886586 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"355b92d44b96773d00b0e645123bf2086d26aa043b6aef98a6c90d25c7b52e3d"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.886610 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"17cbd533b4320d178a3ee1bb3d666d5a1ad0a1d7e9f38e58735acac86c590726"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.886625 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e5c85414a97917ad2db6270144c476b16b50ad3256bf6c694c228b8756c379dd"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.886640 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a6086fa7a75ccf19fcc7b89e83a2dbe12ed15c729bb8a4209c18acd54c2070a1"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.886655 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7c8c9c60e0354d49255e6b14da6808152f6528f0a51ce992709a7014b9f64d61"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.886669 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"506fd3b326fdaebe64874c208bb15c9cbf5aff36602f221f27f87fe0214ce6f7"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.886683 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"865e78637e3692046c9c8a24f5238eb9636bc7e388cea6e0fec3c5c68d21cf91"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.886697 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"458448bab8d0e21295e859a857d3a3273cc4f430f2fc0fdfb3e10b9a9ab813ec"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.886711 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.886731 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" event={"ID":"20ba2af9-1f67-4b6d-884a-666ef4f55bf3","Type":"ContainerDied","Data":"865e78637e3692046c9c8a24f5238eb9636bc7e388cea6e0fec3c5c68d21cf91"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.886756 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6c718dfb2fa2c3cfc45793e7e2018f908f93424f65220ca8e33d960b1df998f0"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.886773 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"355b92d44b96773d00b0e645123bf2086d26aa043b6aef98a6c90d25c7b52e3d"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.886788 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"17cbd533b4320d178a3ee1bb3d666d5a1ad0a1d7e9f38e58735acac86c590726"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.886802 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e5c85414a97917ad2db6270144c476b16b50ad3256bf6c694c228b8756c379dd"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.886817 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a6086fa7a75ccf19fcc7b89e83a2dbe12ed15c729bb8a4209c18acd54c2070a1"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.886833 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7c8c9c60e0354d49255e6b14da6808152f6528f0a51ce992709a7014b9f64d61"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.886849 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"506fd3b326fdaebe64874c208bb15c9cbf5aff36602f221f27f87fe0214ce6f7"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.886864 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"865e78637e3692046c9c8a24f5238eb9636bc7e388cea6e0fec3c5c68d21cf91"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.886878 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"458448bab8d0e21295e859a857d3a3273cc4f430f2fc0fdfb3e10b9a9ab813ec"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.886896 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.886917 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" event={"ID":"20ba2af9-1f67-4b6d-884a-666ef4f55bf3","Type":"ContainerDied","Data":"458448bab8d0e21295e859a857d3a3273cc4f430f2fc0fdfb3e10b9a9ab813ec"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.886977 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6c718dfb2fa2c3cfc45793e7e2018f908f93424f65220ca8e33d960b1df998f0"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.886997 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"355b92d44b96773d00b0e645123bf2086d26aa043b6aef98a6c90d25c7b52e3d"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.887013 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"17cbd533b4320d178a3ee1bb3d666d5a1ad0a1d7e9f38e58735acac86c590726"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.887027 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e5c85414a97917ad2db6270144c476b16b50ad3256bf6c694c228b8756c379dd"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.887041 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a6086fa7a75ccf19fcc7b89e83a2dbe12ed15c729bb8a4209c18acd54c2070a1"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.887055 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7c8c9c60e0354d49255e6b14da6808152f6528f0a51ce992709a7014b9f64d61"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.887069 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"506fd3b326fdaebe64874c208bb15c9cbf5aff36602f221f27f87fe0214ce6f7"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.887083 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"865e78637e3692046c9c8a24f5238eb9636bc7e388cea6e0fec3c5c68d21cf91"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.887097 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"458448bab8d0e21295e859a857d3a3273cc4f430f2fc0fdfb3e10b9a9ab813ec"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.887111 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.887135 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5x7g" event={"ID":"20ba2af9-1f67-4b6d-884a-666ef4f55bf3","Type":"ContainerDied","Data":"b8eacffb0f7f6a79c2f334742927c660a6811b6721ff20d186ed2a5024c1209c"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.887161 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6c718dfb2fa2c3cfc45793e7e2018f908f93424f65220ca8e33d960b1df998f0"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.887181 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"355b92d44b96773d00b0e645123bf2086d26aa043b6aef98a6c90d25c7b52e3d"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.887197 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"17cbd533b4320d178a3ee1bb3d666d5a1ad0a1d7e9f38e58735acac86c590726"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.887212 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e5c85414a97917ad2db6270144c476b16b50ad3256bf6c694c228b8756c379dd"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.887227 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a6086fa7a75ccf19fcc7b89e83a2dbe12ed15c729bb8a4209c18acd54c2070a1"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.887241 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7c8c9c60e0354d49255e6b14da6808152f6528f0a51ce992709a7014b9f64d61"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.887256 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"506fd3b326fdaebe64874c208bb15c9cbf5aff36602f221f27f87fe0214ce6f7"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.887270 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"865e78637e3692046c9c8a24f5238eb9636bc7e388cea6e0fec3c5c68d21cf91"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.887284 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"458448bab8d0e21295e859a857d3a3273cc4f430f2fc0fdfb3e10b9a9ab813ec"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.887299 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.888027 4781 generic.go:334] "Generic (PLEG): container finished" podID="4de58e6c-4cd7-4987-bdca-aa99fb83bf43" containerID="ced801bed55e548a8b4ebe7cb431d77fca7f6a58ef7503f281478acf83c6eb87" exitCode=0
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.888119 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" event={"ID":"4de58e6c-4cd7-4987-bdca-aa99fb83bf43","Type":"ContainerDied","Data":"ced801bed55e548a8b4ebe7cb431d77fca7f6a58ef7503f281478acf83c6eb87"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.888262 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" event={"ID":"4de58e6c-4cd7-4987-bdca-aa99fb83bf43","Type":"ContainerStarted","Data":"fd4e80cd4b4d80ece1c44133426f7a482df0b2edad4de16d92863b6be74ac189"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.891755 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8b6p8_d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650/kube-multus/2.log"
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.894115 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8b6p8_d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650/kube-multus/1.log"
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.894259 4781 generic.go:334] "Generic (PLEG): container finished" podID="d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650" containerID="dae5efab5f322eb124f4544f407cadfb40f0be369a438e63b0294e811913a024" exitCode=2
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.894394 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8b6p8" event={"ID":"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650","Type":"ContainerDied","Data":"dae5efab5f322eb124f4544f407cadfb40f0be369a438e63b0294e811913a024"}
Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.894512 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a029cea19e689697b3e833a0bd6c745475464429e227868f4cabed7c7fe3ec71"}
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a029cea19e689697b3e833a0bd6c745475464429e227868f4cabed7c7fe3ec71"} Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.895631 4781 scope.go:117] "RemoveContainer" containerID="dae5efab5f322eb124f4544f407cadfb40f0be369a438e63b0294e811913a024" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.915774 4781 scope.go:117] "RemoveContainer" containerID="355b92d44b96773d00b0e645123bf2086d26aa043b6aef98a6c90d25c7b52e3d" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.941242 4781 scope.go:117] "RemoveContainer" containerID="17cbd533b4320d178a3ee1bb3d666d5a1ad0a1d7e9f38e58735acac86c590726" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.971982 4781 scope.go:117] "RemoveContainer" containerID="e5c85414a97917ad2db6270144c476b16b50ad3256bf6c694c228b8756c379dd" Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.982144 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-x5x7g"] Dec 02 09:35:20 crc kubenswrapper[4781]: I1202 09:35:20.989336 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-x5x7g"] Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.022882 4781 scope.go:117] "RemoveContainer" containerID="a6086fa7a75ccf19fcc7b89e83a2dbe12ed15c729bb8a4209c18acd54c2070a1" Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.051788 4781 scope.go:117] "RemoveContainer" containerID="7c8c9c60e0354d49255e6b14da6808152f6528f0a51ce992709a7014b9f64d61" Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.063739 4781 scope.go:117] "RemoveContainer" containerID="506fd3b326fdaebe64874c208bb15c9cbf5aff36602f221f27f87fe0214ce6f7" Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.075301 4781 scope.go:117] "RemoveContainer" containerID="865e78637e3692046c9c8a24f5238eb9636bc7e388cea6e0fec3c5c68d21cf91" Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.087449 4781 scope.go:117] "RemoveContainer" containerID="458448bab8d0e21295e859a857d3a3273cc4f430f2fc0fdfb3e10b9a9ab813ec" Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.098743 4781 scope.go:117] "RemoveContainer" containerID="b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9" Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.109807 4781 scope.go:117] "RemoveContainer" containerID="6c718dfb2fa2c3cfc45793e7e2018f908f93424f65220ca8e33d960b1df998f0" Dec 02 09:35:21 crc kubenswrapper[4781]: E1202 09:35:21.112278 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c718dfb2fa2c3cfc45793e7e2018f908f93424f65220ca8e33d960b1df998f0\": container with ID starting with 6c718dfb2fa2c3cfc45793e7e2018f908f93424f65220ca8e33d960b1df998f0 not found: ID does not exist" containerID="6c718dfb2fa2c3cfc45793e7e2018f908f93424f65220ca8e33d960b1df998f0" Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.112306 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c718dfb2fa2c3cfc45793e7e2018f908f93424f65220ca8e33d960b1df998f0"} err="failed to get container status \"6c718dfb2fa2c3cfc45793e7e2018f908f93424f65220ca8e33d960b1df998f0\": rpc error: code = NotFound desc = could not find container \"6c718dfb2fa2c3cfc45793e7e2018f908f93424f65220ca8e33d960b1df998f0\": container with ID starting with 6c718dfb2fa2c3cfc45793e7e2018f908f93424f65220ca8e33d960b1df998f0 not found: ID does not exist" Dec 02 
09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.112328 4781 scope.go:117] "RemoveContainer" containerID="355b92d44b96773d00b0e645123bf2086d26aa043b6aef98a6c90d25c7b52e3d" Dec 02 09:35:21 crc kubenswrapper[4781]: E1202 09:35:21.112590 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"355b92d44b96773d00b0e645123bf2086d26aa043b6aef98a6c90d25c7b52e3d\": container with ID starting with 355b92d44b96773d00b0e645123bf2086d26aa043b6aef98a6c90d25c7b52e3d not found: ID does not exist" containerID="355b92d44b96773d00b0e645123bf2086d26aa043b6aef98a6c90d25c7b52e3d" Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.112631 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"355b92d44b96773d00b0e645123bf2086d26aa043b6aef98a6c90d25c7b52e3d"} err="failed to get container status \"355b92d44b96773d00b0e645123bf2086d26aa043b6aef98a6c90d25c7b52e3d\": rpc error: code = NotFound desc = could not find container \"355b92d44b96773d00b0e645123bf2086d26aa043b6aef98a6c90d25c7b52e3d\": container with ID starting with 355b92d44b96773d00b0e645123bf2086d26aa043b6aef98a6c90d25c7b52e3d not found: ID does not exist" Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.112659 4781 scope.go:117] "RemoveContainer" containerID="17cbd533b4320d178a3ee1bb3d666d5a1ad0a1d7e9f38e58735acac86c590726" Dec 02 09:35:21 crc kubenswrapper[4781]: E1202 09:35:21.112994 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17cbd533b4320d178a3ee1bb3d666d5a1ad0a1d7e9f38e58735acac86c590726\": container with ID starting with 17cbd533b4320d178a3ee1bb3d666d5a1ad0a1d7e9f38e58735acac86c590726 not found: ID does not exist" containerID="17cbd533b4320d178a3ee1bb3d666d5a1ad0a1d7e9f38e58735acac86c590726" Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.113019 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17cbd533b4320d178a3ee1bb3d666d5a1ad0a1d7e9f38e58735acac86c590726"} err="failed to get container status \"17cbd533b4320d178a3ee1bb3d666d5a1ad0a1d7e9f38e58735acac86c590726\": rpc error: code = NotFound desc = could not find container \"17cbd533b4320d178a3ee1bb3d666d5a1ad0a1d7e9f38e58735acac86c590726\": container with ID starting with 17cbd533b4320d178a3ee1bb3d666d5a1ad0a1d7e9f38e58735acac86c590726 not found: ID does not exist" Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.113037 4781 scope.go:117] "RemoveContainer" containerID="e5c85414a97917ad2db6270144c476b16b50ad3256bf6c694c228b8756c379dd" Dec 02 09:35:21 crc kubenswrapper[4781]: E1202 09:35:21.113301 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5c85414a97917ad2db6270144c476b16b50ad3256bf6c694c228b8756c379dd\": container with ID starting with e5c85414a97917ad2db6270144c476b16b50ad3256bf6c694c228b8756c379dd not found: ID does not exist" containerID="e5c85414a97917ad2db6270144c476b16b50ad3256bf6c694c228b8756c379dd" Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.113339 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5c85414a97917ad2db6270144c476b16b50ad3256bf6c694c228b8756c379dd"} err="failed to get container status \"e5c85414a97917ad2db6270144c476b16b50ad3256bf6c694c228b8756c379dd\": rpc error: code = NotFound desc = could not find container 
\"e5c85414a97917ad2db6270144c476b16b50ad3256bf6c694c228b8756c379dd\": container with ID starting with e5c85414a97917ad2db6270144c476b16b50ad3256bf6c694c228b8756c379dd not found: ID does not exist" Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.113362 4781 scope.go:117] "RemoveContainer" containerID="a6086fa7a75ccf19fcc7b89e83a2dbe12ed15c729bb8a4209c18acd54c2070a1" Dec 02 09:35:21 crc kubenswrapper[4781]: E1202 09:35:21.113630 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6086fa7a75ccf19fcc7b89e83a2dbe12ed15c729bb8a4209c18acd54c2070a1\": container with ID starting with a6086fa7a75ccf19fcc7b89e83a2dbe12ed15c729bb8a4209c18acd54c2070a1 not found: ID does not exist" containerID="a6086fa7a75ccf19fcc7b89e83a2dbe12ed15c729bb8a4209c18acd54c2070a1" Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.113650 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6086fa7a75ccf19fcc7b89e83a2dbe12ed15c729bb8a4209c18acd54c2070a1"} err="failed to get container status \"a6086fa7a75ccf19fcc7b89e83a2dbe12ed15c729bb8a4209c18acd54c2070a1\": rpc error: code = NotFound desc = could not find container \"a6086fa7a75ccf19fcc7b89e83a2dbe12ed15c729bb8a4209c18acd54c2070a1\": container with ID starting with a6086fa7a75ccf19fcc7b89e83a2dbe12ed15c729bb8a4209c18acd54c2070a1 not found: ID does not exist" Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.113664 4781 scope.go:117] "RemoveContainer" containerID="7c8c9c60e0354d49255e6b14da6808152f6528f0a51ce992709a7014b9f64d61" Dec 02 09:35:21 crc kubenswrapper[4781]: E1202 09:35:21.114057 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c8c9c60e0354d49255e6b14da6808152f6528f0a51ce992709a7014b9f64d61\": container with ID starting with 7c8c9c60e0354d49255e6b14da6808152f6528f0a51ce992709a7014b9f64d61 not found: ID does not exist" containerID="7c8c9c60e0354d49255e6b14da6808152f6528f0a51ce992709a7014b9f64d61" Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.114075 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c8c9c60e0354d49255e6b14da6808152f6528f0a51ce992709a7014b9f64d61"} err="failed to get container status \"7c8c9c60e0354d49255e6b14da6808152f6528f0a51ce992709a7014b9f64d61\": rpc error: code = NotFound desc = could not find container \"7c8c9c60e0354d49255e6b14da6808152f6528f0a51ce992709a7014b9f64d61\": container with ID starting with 7c8c9c60e0354d49255e6b14da6808152f6528f0a51ce992709a7014b9f64d61 not found: ID does not exist" Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.114088 4781 scope.go:117] "RemoveContainer" containerID="506fd3b326fdaebe64874c208bb15c9cbf5aff36602f221f27f87fe0214ce6f7" Dec 02 09:35:21 crc kubenswrapper[4781]: E1202 09:35:21.114384 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"506fd3b326fdaebe64874c208bb15c9cbf5aff36602f221f27f87fe0214ce6f7\": container with ID starting with 506fd3b326fdaebe64874c208bb15c9cbf5aff36602f221f27f87fe0214ce6f7 not found: ID does not exist" containerID="506fd3b326fdaebe64874c208bb15c9cbf5aff36602f221f27f87fe0214ce6f7" Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.114410 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"506fd3b326fdaebe64874c208bb15c9cbf5aff36602f221f27f87fe0214ce6f7"} 
err="failed to get container status \"506fd3b326fdaebe64874c208bb15c9cbf5aff36602f221f27f87fe0214ce6f7\": rpc error: code = NotFound desc = could not find container \"506fd3b326fdaebe64874c208bb15c9cbf5aff36602f221f27f87fe0214ce6f7\": container with ID starting with 506fd3b326fdaebe64874c208bb15c9cbf5aff36602f221f27f87fe0214ce6f7 not found: ID does not exist" Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.114424 4781 scope.go:117] "RemoveContainer" containerID="865e78637e3692046c9c8a24f5238eb9636bc7e388cea6e0fec3c5c68d21cf91" Dec 02 09:35:21 crc kubenswrapper[4781]: E1202 09:35:21.114708 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"865e78637e3692046c9c8a24f5238eb9636bc7e388cea6e0fec3c5c68d21cf91\": container with ID starting with 865e78637e3692046c9c8a24f5238eb9636bc7e388cea6e0fec3c5c68d21cf91 not found: ID does not exist" containerID="865e78637e3692046c9c8a24f5238eb9636bc7e388cea6e0fec3c5c68d21cf91" Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.114731 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"865e78637e3692046c9c8a24f5238eb9636bc7e388cea6e0fec3c5c68d21cf91"} err="failed to get container status \"865e78637e3692046c9c8a24f5238eb9636bc7e388cea6e0fec3c5c68d21cf91\": rpc error: code = NotFound desc = could not find container \"865e78637e3692046c9c8a24f5238eb9636bc7e388cea6e0fec3c5c68d21cf91\": container with ID starting with 865e78637e3692046c9c8a24f5238eb9636bc7e388cea6e0fec3c5c68d21cf91 not found: ID does not exist" Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.114746 4781 scope.go:117] "RemoveContainer" containerID="458448bab8d0e21295e859a857d3a3273cc4f430f2fc0fdfb3e10b9a9ab813ec" Dec 02 09:35:21 crc kubenswrapper[4781]: E1202 09:35:21.114973 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"458448bab8d0e21295e859a857d3a3273cc4f430f2fc0fdfb3e10b9a9ab813ec\": container with ID starting with 458448bab8d0e21295e859a857d3a3273cc4f430f2fc0fdfb3e10b9a9ab813ec not found: ID does not exist" containerID="458448bab8d0e21295e859a857d3a3273cc4f430f2fc0fdfb3e10b9a9ab813ec" Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.114993 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"458448bab8d0e21295e859a857d3a3273cc4f430f2fc0fdfb3e10b9a9ab813ec"} err="failed to get container status \"458448bab8d0e21295e859a857d3a3273cc4f430f2fc0fdfb3e10b9a9ab813ec\": rpc error: code = NotFound desc = could not find container \"458448bab8d0e21295e859a857d3a3273cc4f430f2fc0fdfb3e10b9a9ab813ec\": container with ID starting with 458448bab8d0e21295e859a857d3a3273cc4f430f2fc0fdfb3e10b9a9ab813ec not found: ID does not exist" Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.115007 4781 scope.go:117] "RemoveContainer" containerID="b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9" Dec 02 09:35:21 crc kubenswrapper[4781]: E1202 09:35:21.115218 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\": container with ID starting with b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9 not found: ID does not exist" containerID="b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9" Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.115251 4781 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9"} err="failed to get container status \"b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\": rpc error: code = NotFound desc = could not find container \"b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9\": container with ID starting with b7d67d739a501e5bfc7ab476f40d99fc1e85891dd1e52ed3888c89f6dc5f95e9 not found: ID does not exist" Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.115269 4781 scope.go:117] "RemoveContainer" containerID="6c718dfb2fa2c3cfc45793e7e2018f908f93424f65220ca8e33d960b1df998f0" Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.115495 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c718dfb2fa2c3cfc45793e7e2018f908f93424f65220ca8e33d960b1df998f0"} err="failed to get container status \"6c718dfb2fa2c3cfc45793e7e2018f908f93424f65220ca8e33d960b1df998f0\": rpc error: code = NotFound desc = could not find container \"6c718dfb2fa2c3cfc45793e7e2018f908f93424f65220ca8e33d960b1df998f0\": container with ID starting with 6c718dfb2fa2c3cfc45793e7e2018f908f93424f65220ca8e33d960b1df998f0 not found: ID does not exist" Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.115513 4781 scope.go:117] "RemoveContainer" containerID="355b92d44b96773d00b0e645123bf2086d26aa043b6aef98a6c90d25c7b52e3d" Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.115793 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"355b92d44b96773d00b0e645123bf2086d26aa043b6aef98a6c90d25c7b52e3d"} err="failed to get container status \"355b92d44b96773d00b0e645123bf2086d26aa043b6aef98a6c90d25c7b52e3d\": rpc error: code = NotFound desc = could not find container \"355b92d44b96773d00b0e645123bf2086d26aa043b6aef98a6c90d25c7b52e3d\": container with ID starting with 355b92d44b96773d00b0e645123bf2086d26aa043b6aef98a6c90d25c7b52e3d not found: ID does not exist" Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.115811 4781 scope.go:117] "RemoveContainer" containerID="17cbd533b4320d178a3ee1bb3d666d5a1ad0a1d7e9f38e58735acac86c590726" Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.116207 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17cbd533b4320d178a3ee1bb3d666d5a1ad0a1d7e9f38e58735acac86c590726"} err="failed to get container status \"17cbd533b4320d178a3ee1bb3d666d5a1ad0a1d7e9f38e58735acac86c590726\": rpc error: code = NotFound desc = could not find container \"17cbd533b4320d178a3ee1bb3d666d5a1ad0a1d7e9f38e58735acac86c590726\": container with ID starting with 17cbd533b4320d178a3ee1bb3d666d5a1ad0a1d7e9f38e58735acac86c590726 not found: ID does not exist" Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.116226 4781 scope.go:117] "RemoveContainer" containerID="e5c85414a97917ad2db6270144c476b16b50ad3256bf6c694c228b8756c379dd" Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.116561 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5c85414a97917ad2db6270144c476b16b50ad3256bf6c694c228b8756c379dd"} err="failed to get container status \"e5c85414a97917ad2db6270144c476b16b50ad3256bf6c694c228b8756c379dd\": rpc error: code = NotFound desc = could not find container \"e5c85414a97917ad2db6270144c476b16b50ad3256bf6c694c228b8756c379dd\": container with ID starting with 
e5c85414a97917ad2db6270144c476b16b50ad3256bf6c694c228b8756c379dd not found: ID does not exist" Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.116577 4781 scope.go:117] "RemoveContainer" containerID="a6086fa7a75ccf19fcc7b89e83a2dbe12ed15c729bb8a4209c18acd54c2070a1" Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.116805 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6086fa7a75ccf19fcc7b89e83a2dbe12ed15c729bb8a4209c18acd54c2070a1"} err="failed to get container status \"a6086fa7a75ccf19fcc7b89e83a2dbe12ed15c729bb8a4209c18acd54c2070a1\": rpc error: code = NotFound desc = could not find container \"a6086fa7a75ccf19fcc7b89e83a2dbe12ed15c729bb8a4209c18acd54c2070a1\": container with ID starting with a6086fa7a75ccf19fcc7b89e83a2dbe12ed15c729bb8a4209c18acd54c2070a1 not found: ID does not exist" Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.116825 4781 scope.go:117] "RemoveContainer" containerID="7c8c9c60e0354d49255e6b14da6808152f6528f0a51ce992709a7014b9f64d61" Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.117098 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c8c9c60e0354d49255e6b14da6808152f6528f0a51ce992709a7014b9f64d61"} err="failed to get container status \"7c8c9c60e0354d49255e6b14da6808152f6528f0a51ce992709a7014b9f64d61\": rpc error: code = NotFound desc = could not find container \"7c8c9c60e0354d49255e6b14da6808152f6528f0a51ce992709a7014b9f64d61\": container with ID starting with 7c8c9c60e0354d49255e6b14da6808152f6528f0a51ce992709a7014b9f64d61 not found: ID does not exist" Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.117118 4781 scope.go:117] "RemoveContainer" containerID="506fd3b326fdaebe64874c208bb15c9cbf5aff36602f221f27f87fe0214ce6f7" Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.117374 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"506fd3b326fdaebe64874c208bb15c9cbf5aff36602f221f27f87fe0214ce6f7"} err="failed to get container status \"506fd3b326fdaebe64874c208bb15c9cbf5aff36602f221f27f87fe0214ce6f7\": rpc error: code = NotFound desc = could not find container \"506fd3b326fdaebe64874c208bb15c9cbf5aff36602f221f27f87fe0214ce6f7\": container with ID starting with 506fd3b326fdaebe64874c208bb15c9cbf5aff36602f221f27f87fe0214ce6f7 not found: ID does not exist" Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.117394 4781 scope.go:117] "RemoveContainer" containerID="865e78637e3692046c9c8a24f5238eb9636bc7e388cea6e0fec3c5c68d21cf91" Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.117590 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"865e78637e3692046c9c8a24f5238eb9636bc7e388cea6e0fec3c5c68d21cf91"} err="failed to get container status \"865e78637e3692046c9c8a24f5238eb9636bc7e388cea6e0fec3c5c68d21cf91\": rpc error: code = NotFound desc = could not find container \"865e78637e3692046c9c8a24f5238eb9636bc7e388cea6e0fec3c5c68d21cf91\": container with ID starting with 865e78637e3692046c9c8a24f5238eb9636bc7e388cea6e0fec3c5c68d21cf91 not found: ID does not exist" Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.117609 4781 scope.go:117] "RemoveContainer" containerID="458448bab8d0e21295e859a857d3a3273cc4f430f2fc0fdfb3e10b9a9ab813ec" Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.117813 4781 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"458448bab8d0e21295e859a857d3a3273cc4f430f2fc0fdfb3e10b9a9ab813ec"} err="failed to get container status \"458448bab8d0e21295e859a857d3a3273cc4f430f2fc0fdfb3e10b9a9ab813ec\": rpc error: code = NotFound desc = could not find container \"458448bab8d0e21295e859a857d3a3273cc4f430f2fc0fdfb3e10b9a9ab813ec\": container with ID starting with 458448bab8d0e21295e859a857d3a3273cc4f430f2fc0fdfb3e10b9a9ab813ec not found: ID does not exist" Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.513745 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20ba2af9-1f67-4b6d-884a-666ef4f55bf3" path="/var/lib/kubelet/pods/20ba2af9-1f67-4b6d-884a-666ef4f55bf3/volumes"
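The long run of "RemoveContainer" / "DeleteContainer returned error" pairs above is the kubelet clearing out references to containers that CRI-O has already purged: every status lookup comes back NotFound, which in this context means the desired end state (container gone) already holds, so the errors are noisy but benign. Below is a minimal sketch of that idempotent-delete pattern, assuming a toy runtime; fakeRuntime, removeIfPresent, and ErrNotFound are illustrative stand-ins, not the kubelet's actual types.

```go
package main

import (
	"errors"
	"fmt"
)

// ErrNotFound stands in for a CRI "NotFound" gRPC status; illustrative only.
var ErrNotFound = errors.New("container not found: ID does not exist")

// fakeRuntime mimics a runtime that may have already forgotten a container.
type fakeRuntime struct{ known map[string]bool }

func (r *fakeRuntime) RemoveContainer(id string) error {
	if !r.known[id] {
		return fmt.Errorf("failed to get container status %q: %w", id, ErrNotFound)
	}
	delete(r.known, id)
	return nil
}

// removeIfPresent treats NotFound as "already deleted", so repeated cleanup
// passes over the same ID (as in the log above) stay harmless.
func removeIfPresent(r *fakeRuntime, id string) error {
	if err := r.RemoveContainer(id); err != nil {
		if errors.Is(err, ErrNotFound) {
			return nil // already gone; nothing left to do
		}
		return err
	}
	return nil
}

func main() {
	rt := &fakeRuntime{known: map[string]bool{"b7d67d73": true}}
	for i := 1; i <= 3; i++ { // repeated passes, like the kubelet's cleanup loop
		if err := removeIfPresent(rt, "b7d67d73"); err != nil {
			fmt.Println("pass", i, "error:", err)
			continue
		}
		fmt.Println("pass", i, "ok")
	}
}
```

Treating NotFound as "already done" is what lets the kubelet repeat these cleanup passes over the same IDs without escalating.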
Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.902159 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8b6p8_d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650/kube-multus/2.log" Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.902792 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8b6p8_d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650/kube-multus/1.log" Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.902877 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8b6p8" event={"ID":"d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650","Type":"ContainerStarted","Data":"094a8a9387c8cd69b59eb4db0d713238406bbc891e806b4b9d9c65e020091c83"} Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.908056 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" event={"ID":"4de58e6c-4cd7-4987-bdca-aa99fb83bf43","Type":"ContainerStarted","Data":"b3916ae1b015ce29188f93fbcd1c733f2bf3c34b07aa44e8cf33b3ddf2476c13"} Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.908088 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" event={"ID":"4de58e6c-4cd7-4987-bdca-aa99fb83bf43","Type":"ContainerStarted","Data":"344ae5588524089604cb92cfbee9a883874bea06dcae0ed46cecf368998f87c2"} Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.908102 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" event={"ID":"4de58e6c-4cd7-4987-bdca-aa99fb83bf43","Type":"ContainerStarted","Data":"f832a222bad9f4fb6bd5ea6060ae360d5b5fd2908be17d202d966bb5605f9428"} Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.908115 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" event={"ID":"4de58e6c-4cd7-4987-bdca-aa99fb83bf43","Type":"ContainerStarted","Data":"17ef2b28ce23b4edb4e37a364617136b7476414c631b310dbcea42cf6c9de018"} Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.908128 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" event={"ID":"4de58e6c-4cd7-4987-bdca-aa99fb83bf43","Type":"ContainerStarted","Data":"d91613716d9b2fd74bf63accab6f863f697230ab81acf46789ae2c158e0367e8"} Dec 02 09:35:21 crc kubenswrapper[4781]: I1202 09:35:21.908139 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" event={"ID":"4de58e6c-4cd7-4987-bdca-aa99fb83bf43","Type":"ContainerStarted","Data":"5959f4dd60c3240c9179258609baba9fdf3bbbe4bdba2551fe05d5635181e69e"} Dec 02 09:35:24 crc kubenswrapper[4781]: I1202 09:35:24.927400 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" event={"ID":"4de58e6c-4cd7-4987-bdca-aa99fb83bf43","Type":"ContainerStarted","Data":"c45ec1bbec0353c8c86a0eec0813cdc1128b9bdcade49ba4afcc352ca9a23ba9"} Dec 02 09:35:26 crc kubenswrapper[4781]: I1202 09:35:26.942365 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" event={"ID":"4de58e6c-4cd7-4987-bdca-aa99fb83bf43","Type":"ContainerStarted","Data":"57cc3be5322758bc3a22f6ac9318826c02c2c5d6e1372924421e126971693e21"} Dec 02 09:35:26 crc kubenswrapper[4781]: I1202 09:35:26.942835 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:26 crc kubenswrapper[4781]: I1202 09:35:26.942936 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:26 crc kubenswrapper[4781]: I1202 09:35:26.942949 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:26 crc kubenswrapper[4781]: I1202 09:35:26.984778 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" podStartSLOduration=6.98475108 podStartE2EDuration="6.98475108s" podCreationTimestamp="2025-12-02 09:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:35:26.978534912 +0000 UTC m=+889.802408811" watchObservedRunningTime="2025-12-02 09:35:26.98475108 +0000 UTC m=+889.808624999" Dec 02 09:35:26 crc kubenswrapper[4781]: I1202 09:35:26.988617 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:26 crc kubenswrapper[4781]: I1202 09:35:26.991780 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:39 crc kubenswrapper[4781]: I1202 09:35:39.222385 4781 scope.go:117] "RemoveContainer" containerID="a029cea19e689697b3e833a0bd6c745475464429e227868f4cabed7c7fe3ec71" Dec 02 09:35:41 crc kubenswrapper[4781]: I1202 09:35:41.248479 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8b6p8_d24e1112-c2d8-4a8e-8fe5-8a5b6a41c650/kube-multus/2.log" Dec 02 09:35:50 crc kubenswrapper[4781]: I1202 09:35:50.606062 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hmqqc" Dec 02 09:35:54 crc kubenswrapper[4781]: I1202 09:35:54.775011 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fl4k6c"]
Dec 02 09:35:54 crc kubenswrapper[4781]: I1202 09:35:54.776319 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fl4k6c" Dec 02 09:35:54 crc kubenswrapper[4781]: I1202 09:35:54.777847 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 02 09:35:54 crc kubenswrapper[4781]: I1202 09:35:54.783845 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fl4k6c"] Dec 02 09:35:54 crc kubenswrapper[4781]: I1202 09:35:54.910169 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8add967d-359e-4b2f-8181-27a4e32cd3d1-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fl4k6c\" (UID: \"8add967d-359e-4b2f-8181-27a4e32cd3d1\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fl4k6c" Dec 02 09:35:54 crc kubenswrapper[4781]: I1202 09:35:54.910263 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8add967d-359e-4b2f-8181-27a4e32cd3d1-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fl4k6c\" (UID: \"8add967d-359e-4b2f-8181-27a4e32cd3d1\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fl4k6c" Dec 02 09:35:54 crc kubenswrapper[4781]: I1202 09:35:54.910370 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n57c4\" (UniqueName: \"kubernetes.io/projected/8add967d-359e-4b2f-8181-27a4e32cd3d1-kube-api-access-n57c4\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fl4k6c\" (UID: \"8add967d-359e-4b2f-8181-27a4e32cd3d1\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fl4k6c" Dec 02 09:35:55 crc kubenswrapper[4781]: I1202 09:35:55.011147 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8add967d-359e-4b2f-8181-27a4e32cd3d1-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fl4k6c\" (UID: \"8add967d-359e-4b2f-8181-27a4e32cd3d1\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fl4k6c" Dec 02 09:35:55 crc kubenswrapper[4781]: I1202 09:35:55.011235 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n57c4\" (UniqueName: \"kubernetes.io/projected/8add967d-359e-4b2f-8181-27a4e32cd3d1-kube-api-access-n57c4\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fl4k6c\" (UID: \"8add967d-359e-4b2f-8181-27a4e32cd3d1\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fl4k6c" Dec 02 09:35:55 crc kubenswrapper[4781]: I1202 09:35:55.011308 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8add967d-359e-4b2f-8181-27a4e32cd3d1-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fl4k6c\" (UID: \"8add967d-359e-4b2f-8181-27a4e32cd3d1\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fl4k6c"
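The reconciler_common records above show the volume manager's usual two-step flow for a new pod: VerifyControllerAttachedVolume registers each volume, then MountVolume runs SetUp to make the mounted state match the pod spec. The sketch below is a toy desired-state/actual-state loop in that spirit; the types, the printed messages, and the instantly-succeeding SetUp are simplifications for illustration, not the kubelet's real reconciler.

```go
package main

import "fmt"

type volume struct{ pod, name string }

// desired is what the pod specs demand; actual is what is currently mounted.
type reconciler struct {
	desired map[volume]bool
	actual  map[volume]bool
}

// reconcile mounts anything desired-but-absent and unmounts anything
// mounted-but-no-longer-desired, mirroring the started/succeeded pairs
// (and the later UnmountVolume records) in this log.
func (r *reconciler) reconcile() {
	for v := range r.desired {
		if !r.actual[v] {
			fmt.Printf("MountVolume started for volume %q pod %q\n", v.name, v.pod)
			r.actual[v] = true // pretend SetUp succeeded immediately
			fmt.Printf("MountVolume.SetUp succeeded for volume %q pod %q\n", v.name, v.pod)
		}
	}
	for v := range r.actual {
		if !r.desired[v] {
			fmt.Printf("UnmountVolume started for volume %q pod %q\n", v.name, v.pod)
			delete(r.actual, v)
		}
	}
}

func main() {
	r := &reconciler{
		desired: map[volume]bool{{"redhat-operators-gszsj", "utilities"}: true},
		actual:  map[volume]bool{},
	}
	r.reconcile() // first pass: mounts "utilities"
	delete(r.desired, volume{"redhat-operators-gszsj", "utilities"})
	r.reconcile() // second pass: the pod is gone, so it unmounts again
}
```

Running it mounts the pod's volume on the first pass and unmounts it on the second, which is the same converge-toward-desired-state behavior the mount and teardown records in this section trace for real pods.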
Dec 02 09:35:55 crc kubenswrapper[4781]: I1202 09:35:55.011616 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8add967d-359e-4b2f-8181-27a4e32cd3d1-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fl4k6c\" (UID: \"8add967d-359e-4b2f-8181-27a4e32cd3d1\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fl4k6c" Dec 02 09:35:55 crc kubenswrapper[4781]: I1202 09:35:55.011775 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8add967d-359e-4b2f-8181-27a4e32cd3d1-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fl4k6c\" (UID: \"8add967d-359e-4b2f-8181-27a4e32cd3d1\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fl4k6c" Dec 02 09:35:55 crc kubenswrapper[4781]: I1202 09:35:55.031186 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n57c4\" (UniqueName: \"kubernetes.io/projected/8add967d-359e-4b2f-8181-27a4e32cd3d1-kube-api-access-n57c4\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fl4k6c\" (UID: \"8add967d-359e-4b2f-8181-27a4e32cd3d1\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fl4k6c" Dec 02 09:35:55 crc kubenswrapper[4781]: I1202 09:35:55.095316 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fl4k6c" Dec 02 09:35:55 crc kubenswrapper[4781]: I1202 09:35:55.326636 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fl4k6c"] Dec 02 09:35:56 crc kubenswrapper[4781]: I1202 09:35:56.332036 4781 generic.go:334] "Generic (PLEG): container finished" podID="8add967d-359e-4b2f-8181-27a4e32cd3d1" containerID="132b8c6202768b003d8d1d00f3f6c71d39041af4056891286461426ea799e151" exitCode=0 Dec 02 09:35:56 crc kubenswrapper[4781]: I1202 09:35:56.332084 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fl4k6c" event={"ID":"8add967d-359e-4b2f-8181-27a4e32cd3d1","Type":"ContainerDied","Data":"132b8c6202768b003d8d1d00f3f6c71d39041af4056891286461426ea799e151"} Dec 02 09:35:56 crc kubenswrapper[4781]: I1202 09:35:56.332223 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fl4k6c" event={"ID":"8add967d-359e-4b2f-8181-27a4e32cd3d1","Type":"ContainerStarted","Data":"2db2b25007bafaf39924fa4d76fa05f3a23625c6c0de85cb3dfeccbedb96b110"} Dec 02 09:35:57 crc kubenswrapper[4781]: I1202 09:35:57.034536 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gszsj"]
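The "Generic (PLEG): container finished" and "SyncLoop (PLEG): event for pod" records above come from the pod lifecycle event generator relaying container state changes into the sync loop. When tracing a pod's history it helps to extract just those events from the journal; here is a small stdlib-only sketch whose regular expression is tuned to the exact klog formatting shown in this log (an assumption, not an official parser).

```go
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// pleg matches records like:
//   kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="ns/name"
//   event={"ID":"...","Type":"ContainerStarted","Data":"..."}
var pleg = regexp.MustCompile(`"SyncLoop \(PLEG\): event for pod" pod="([^"]+)" event=\{"ID":"([^"]+)","Type":"([^"]+)","Data":"([^"]+)"\}`)

func main() {
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 1024*1024), 1024*1024) // journal lines can be very long
	for sc.Scan() {
		for _, m := range pleg.FindAllStringSubmatch(sc.Text(), -1) {
			// m[1]=pod, m[3]=event type, m[4]=container or sandbox ID
			id := m[4]
			if len(id) > 12 {
				id = id[:12] // abbreviate the 64-char ID for readability
			}
			fmt.Printf("%-16s %s %s\n", m[3], m[1], id)
		}
	}
}
```

Piping the kubelet journal through it (for example `journalctl -u kubelet | go run pleggrep.go`; the unit name may differ per distribution) prints one line per event: the event type, the pod, and the first 12 hex characters of the container ID.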
Dec 02 09:35:57 crc kubenswrapper[4781]: I1202 09:35:57.035831 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gszsj" Dec 02 09:35:57 crc kubenswrapper[4781]: I1202 09:35:57.046150 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gszsj"] Dec 02 09:35:57 crc kubenswrapper[4781]: I1202 09:35:57.234465 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10c35de0-81dc-4710-9a2e-17cf6e37ed2f-catalog-content\") pod \"redhat-operators-gszsj\" (UID: \"10c35de0-81dc-4710-9a2e-17cf6e37ed2f\") " pod="openshift-marketplace/redhat-operators-gszsj" Dec 02 09:35:57 crc kubenswrapper[4781]: I1202 09:35:57.234560 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nkn9\" (UniqueName: \"kubernetes.io/projected/10c35de0-81dc-4710-9a2e-17cf6e37ed2f-kube-api-access-9nkn9\") pod \"redhat-operators-gszsj\" (UID: \"10c35de0-81dc-4710-9a2e-17cf6e37ed2f\") " pod="openshift-marketplace/redhat-operators-gszsj" Dec 02 09:35:57 crc kubenswrapper[4781]: I1202 09:35:57.234676 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10c35de0-81dc-4710-9a2e-17cf6e37ed2f-utilities\") pod \"redhat-operators-gszsj\" (UID: \"10c35de0-81dc-4710-9a2e-17cf6e37ed2f\") " pod="openshift-marketplace/redhat-operators-gszsj" Dec 02 09:35:57 crc kubenswrapper[4781]: I1202 09:35:57.335552 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10c35de0-81dc-4710-9a2e-17cf6e37ed2f-catalog-content\") pod \"redhat-operators-gszsj\" (UID: \"10c35de0-81dc-4710-9a2e-17cf6e37ed2f\") " pod="openshift-marketplace/redhat-operators-gszsj" Dec 02 09:35:57 crc kubenswrapper[4781]: I1202 09:35:57.335611 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nkn9\" (UniqueName: \"kubernetes.io/projected/10c35de0-81dc-4710-9a2e-17cf6e37ed2f-kube-api-access-9nkn9\") pod \"redhat-operators-gszsj\" (UID: \"10c35de0-81dc-4710-9a2e-17cf6e37ed2f\") " pod="openshift-marketplace/redhat-operators-gszsj" Dec 02 09:35:57 crc kubenswrapper[4781]: I1202 09:35:57.335653 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10c35de0-81dc-4710-9a2e-17cf6e37ed2f-utilities\") pod \"redhat-operators-gszsj\" (UID: \"10c35de0-81dc-4710-9a2e-17cf6e37ed2f\") " pod="openshift-marketplace/redhat-operators-gszsj" Dec 02 09:35:57 crc kubenswrapper[4781]: I1202 09:35:57.336306 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10c35de0-81dc-4710-9a2e-17cf6e37ed2f-utilities\") pod \"redhat-operators-gszsj\" (UID: \"10c35de0-81dc-4710-9a2e-17cf6e37ed2f\") " pod="openshift-marketplace/redhat-operators-gszsj" Dec 02 09:35:57 crc kubenswrapper[4781]: I1202 09:35:57.336412 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10c35de0-81dc-4710-9a2e-17cf6e37ed2f-catalog-content\") pod \"redhat-operators-gszsj\" (UID: \"10c35de0-81dc-4710-9a2e-17cf6e37ed2f\") " pod="openshift-marketplace/redhat-operators-gszsj" Dec 02 09:35:57 crc kubenswrapper[4781]: I1202 09:35:57.357046 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9nkn9\" (UniqueName: \"kubernetes.io/projected/10c35de0-81dc-4710-9a2e-17cf6e37ed2f-kube-api-access-9nkn9\") pod \"redhat-operators-gszsj\" (UID: \"10c35de0-81dc-4710-9a2e-17cf6e37ed2f\") " pod="openshift-marketplace/redhat-operators-gszsj" Dec 02 09:35:57 crc kubenswrapper[4781]: I1202 09:35:57.656668 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gszsj" Dec 02 09:35:57 crc kubenswrapper[4781]: I1202 09:35:57.886494 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gszsj"] Dec 02 09:35:57 crc kubenswrapper[4781]: W1202 09:35:57.947823 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10c35de0_81dc_4710_9a2e_17cf6e37ed2f.slice/crio-9cd58f6dc0a0ec470ef76dcc2c1a9a4ac07eb41e2b1a444ff62042148205c8f2 WatchSource:0}: Error finding container 9cd58f6dc0a0ec470ef76dcc2c1a9a4ac07eb41e2b1a444ff62042148205c8f2: Status 404 returned error can't find the container with id 9cd58f6dc0a0ec470ef76dcc2c1a9a4ac07eb41e2b1a444ff62042148205c8f2 Dec 02 09:35:58 crc kubenswrapper[4781]: I1202 09:35:58.354678 4781 generic.go:334] "Generic (PLEG): container finished" podID="10c35de0-81dc-4710-9a2e-17cf6e37ed2f" containerID="cfed86336a5497dd7d0f028f0e512f6c3ec9842c6467b4d3bb33d4dc2455026f" exitCode=0 Dec 02 09:35:58 crc kubenswrapper[4781]: I1202 09:35:58.354735 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gszsj" event={"ID":"10c35de0-81dc-4710-9a2e-17cf6e37ed2f","Type":"ContainerDied","Data":"cfed86336a5497dd7d0f028f0e512f6c3ec9842c6467b4d3bb33d4dc2455026f"} Dec 02 09:35:58 crc kubenswrapper[4781]: I1202 09:35:58.354783 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gszsj" event={"ID":"10c35de0-81dc-4710-9a2e-17cf6e37ed2f","Type":"ContainerStarted","Data":"9cd58f6dc0a0ec470ef76dcc2c1a9a4ac07eb41e2b1a444ff62042148205c8f2"} Dec 02 09:35:58 crc kubenswrapper[4781]: I1202 09:35:58.358965 4781 generic.go:334] "Generic (PLEG): container finished" podID="8add967d-359e-4b2f-8181-27a4e32cd3d1" containerID="61f62cce905fc1e9581ddf6169f1a9be588d2e753f0066707c7de14b0d775c41" exitCode=0 Dec 02 09:35:58 crc kubenswrapper[4781]: I1202 09:35:58.360088 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fl4k6c" event={"ID":"8add967d-359e-4b2f-8181-27a4e32cd3d1","Type":"ContainerDied","Data":"61f62cce905fc1e9581ddf6169f1a9be588d2e753f0066707c7de14b0d775c41"} Dec 02 09:35:59 crc kubenswrapper[4781]: I1202 09:35:59.367003 4781 generic.go:334] "Generic (PLEG): container finished" podID="8add967d-359e-4b2f-8181-27a4e32cd3d1" containerID="72c914bcdd9efddd6ff7cafab7ca4fc5ef87d138bcbcf314ee89341b996d2bb8" exitCode=0 Dec 02 09:35:59 crc kubenswrapper[4781]: I1202 09:35:59.367043 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fl4k6c" event={"ID":"8add967d-359e-4b2f-8181-27a4e32cd3d1","Type":"ContainerDied","Data":"72c914bcdd9efddd6ff7cafab7ca4fc5ef87d138bcbcf314ee89341b996d2bb8"} Dec 02 09:36:00 crc kubenswrapper[4781]: I1202 09:36:00.376367 4781 generic.go:334] "Generic (PLEG): container finished" podID="10c35de0-81dc-4710-9a2e-17cf6e37ed2f" containerID="71c673cebca85746b28a6bef11117463af7e80b18522595d105e4e895918d6b8" exitCode=0 Dec 02 09:36:00 
crc kubenswrapper[4781]: I1202 09:36:00.376435 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gszsj" event={"ID":"10c35de0-81dc-4710-9a2e-17cf6e37ed2f","Type":"ContainerDied","Data":"71c673cebca85746b28a6bef11117463af7e80b18522595d105e4e895918d6b8"} Dec 02 09:36:00 crc kubenswrapper[4781]: I1202 09:36:00.638378 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fl4k6c" Dec 02 09:36:00 crc kubenswrapper[4781]: I1202 09:36:00.787859 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8add967d-359e-4b2f-8181-27a4e32cd3d1-bundle\") pod \"8add967d-359e-4b2f-8181-27a4e32cd3d1\" (UID: \"8add967d-359e-4b2f-8181-27a4e32cd3d1\") " Dec 02 09:36:00 crc kubenswrapper[4781]: I1202 09:36:00.788031 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8add967d-359e-4b2f-8181-27a4e32cd3d1-util\") pod \"8add967d-359e-4b2f-8181-27a4e32cd3d1\" (UID: \"8add967d-359e-4b2f-8181-27a4e32cd3d1\") " Dec 02 09:36:00 crc kubenswrapper[4781]: I1202 09:36:00.788117 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n57c4\" (UniqueName: \"kubernetes.io/projected/8add967d-359e-4b2f-8181-27a4e32cd3d1-kube-api-access-n57c4\") pod \"8add967d-359e-4b2f-8181-27a4e32cd3d1\" (UID: \"8add967d-359e-4b2f-8181-27a4e32cd3d1\") " Dec 02 09:36:00 crc kubenswrapper[4781]: I1202 09:36:00.792646 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8add967d-359e-4b2f-8181-27a4e32cd3d1-bundle" (OuterVolumeSpecName: "bundle") pod "8add967d-359e-4b2f-8181-27a4e32cd3d1" (UID: "8add967d-359e-4b2f-8181-27a4e32cd3d1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:36:00 crc kubenswrapper[4781]: I1202 09:36:00.800871 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8add967d-359e-4b2f-8181-27a4e32cd3d1-util" (OuterVolumeSpecName: "util") pod "8add967d-359e-4b2f-8181-27a4e32cd3d1" (UID: "8add967d-359e-4b2f-8181-27a4e32cd3d1"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:36:00 crc kubenswrapper[4781]: I1202 09:36:00.804134 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8add967d-359e-4b2f-8181-27a4e32cd3d1-kube-api-access-n57c4" (OuterVolumeSpecName: "kube-api-access-n57c4") pod "8add967d-359e-4b2f-8181-27a4e32cd3d1" (UID: "8add967d-359e-4b2f-8181-27a4e32cd3d1"). InnerVolumeSpecName "kube-api-access-n57c4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:36:00 crc kubenswrapper[4781]: I1202 09:36:00.890467 4781 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8add967d-359e-4b2f-8181-27a4e32cd3d1-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:36:00 crc kubenswrapper[4781]: I1202 09:36:00.890495 4781 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8add967d-359e-4b2f-8181-27a4e32cd3d1-util\") on node \"crc\" DevicePath \"\"" Dec 02 09:36:00 crc kubenswrapper[4781]: I1202 09:36:00.890505 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n57c4\" (UniqueName: \"kubernetes.io/projected/8add967d-359e-4b2f-8181-27a4e32cd3d1-kube-api-access-n57c4\") on node \"crc\" DevicePath \"\"" Dec 02 09:36:01 crc kubenswrapper[4781]: I1202 09:36:01.386086 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fl4k6c" event={"ID":"8add967d-359e-4b2f-8181-27a4e32cd3d1","Type":"ContainerDied","Data":"2db2b25007bafaf39924fa4d76fa05f3a23625c6c0de85cb3dfeccbedb96b110"} Dec 02 09:36:01 crc kubenswrapper[4781]: I1202 09:36:01.386143 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2db2b25007bafaf39924fa4d76fa05f3a23625c6c0de85cb3dfeccbedb96b110" Dec 02 09:36:01 crc kubenswrapper[4781]: I1202 09:36:01.386110 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fl4k6c" Dec 02 09:36:01 crc kubenswrapper[4781]: I1202 09:36:01.388597 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gszsj" event={"ID":"10c35de0-81dc-4710-9a2e-17cf6e37ed2f","Type":"ContainerStarted","Data":"db1840359698fd4d213059895ebbb0f178bd7826b57f01b5dec60289c1d4e0d4"} Dec 02 09:36:01 crc kubenswrapper[4781]: I1202 09:36:01.411773 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gszsj" podStartSLOduration=1.884162848 podStartE2EDuration="4.411755321s" podCreationTimestamp="2025-12-02 09:35:57 +0000 UTC" firstStartedPulling="2025-12-02 09:35:58.357051443 +0000 UTC m=+921.180925332" lastFinishedPulling="2025-12-02 09:36:00.884643926 +0000 UTC m=+923.708517805" observedRunningTime="2025-12-02 09:36:01.404723981 +0000 UTC m=+924.228597900" watchObservedRunningTime="2025-12-02 09:36:01.411755321 +0000 UTC m=+924.235629210" Dec 02 09:36:02 crc kubenswrapper[4781]: I1202 09:36:02.728959 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-99vk9"] Dec 02 09:36:02 crc kubenswrapper[4781]: E1202 09:36:02.731350 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8add967d-359e-4b2f-8181-27a4e32cd3d1" containerName="pull" Dec 02 09:36:02 crc kubenswrapper[4781]: I1202 09:36:02.731536 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="8add967d-359e-4b2f-8181-27a4e32cd3d1" containerName="pull" Dec 02 09:36:02 crc kubenswrapper[4781]: E1202 09:36:02.731712 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8add967d-359e-4b2f-8181-27a4e32cd3d1" containerName="extract" Dec 02 09:36:02 crc kubenswrapper[4781]: I1202 09:36:02.731822 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="8add967d-359e-4b2f-8181-27a4e32cd3d1" containerName="extract" Dec 02 09:36:02 crc 
kubenswrapper[4781]: E1202 09:36:02.731961 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8add967d-359e-4b2f-8181-27a4e32cd3d1" containerName="util" Dec 02 09:36:02 crc kubenswrapper[4781]: I1202 09:36:02.732087 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="8add967d-359e-4b2f-8181-27a4e32cd3d1" containerName="util" Dec 02 09:36:02 crc kubenswrapper[4781]: I1202 09:36:02.732350 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="8add967d-359e-4b2f-8181-27a4e32cd3d1" containerName="extract" Dec 02 09:36:02 crc kubenswrapper[4781]: I1202 09:36:02.733112 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-99vk9" Dec 02 09:36:02 crc kubenswrapper[4781]: I1202 09:36:02.739612 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 02 09:36:02 crc kubenswrapper[4781]: I1202 09:36:02.740001 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 02 09:36:02 crc kubenswrapper[4781]: I1202 09:36:02.740094 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-cjth2" Dec 02 09:36:02 crc kubenswrapper[4781]: I1202 09:36:02.748255 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-99vk9"] Dec 02 09:36:02 crc kubenswrapper[4781]: I1202 09:36:02.914344 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67f6v\" (UniqueName: \"kubernetes.io/projected/11d94cf1-4ea7-43cb-b7e5-6fc0be34760f-kube-api-access-67f6v\") pod \"nmstate-operator-5b5b58f5c8-99vk9\" (UID: \"11d94cf1-4ea7-43cb-b7e5-6fc0be34760f\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-99vk9" Dec 02 09:36:03 crc kubenswrapper[4781]: I1202 09:36:03.015289 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67f6v\" (UniqueName: \"kubernetes.io/projected/11d94cf1-4ea7-43cb-b7e5-6fc0be34760f-kube-api-access-67f6v\") pod \"nmstate-operator-5b5b58f5c8-99vk9\" (UID: \"11d94cf1-4ea7-43cb-b7e5-6fc0be34760f\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-99vk9" Dec 02 09:36:03 crc kubenswrapper[4781]: I1202 09:36:03.046904 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67f6v\" (UniqueName: \"kubernetes.io/projected/11d94cf1-4ea7-43cb-b7e5-6fc0be34760f-kube-api-access-67f6v\") pod \"nmstate-operator-5b5b58f5c8-99vk9\" (UID: \"11d94cf1-4ea7-43cb-b7e5-6fc0be34760f\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-99vk9" Dec 02 09:36:03 crc kubenswrapper[4781]: I1202 09:36:03.083842 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-99vk9" Dec 02 09:36:03 crc kubenswrapper[4781]: I1202 09:36:03.462812 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-99vk9"] Dec 02 09:36:04 crc kubenswrapper[4781]: I1202 09:36:04.407057 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-99vk9" event={"ID":"11d94cf1-4ea7-43cb-b7e5-6fc0be34760f","Type":"ContainerStarted","Data":"a08b7c8867a66906f8a3ec7a50d7cee92332a27b863ebf6a57d1f48f5445759a"} Dec 02 09:36:07 crc kubenswrapper[4781]: I1202 09:36:07.424539 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-99vk9" event={"ID":"11d94cf1-4ea7-43cb-b7e5-6fc0be34760f","Type":"ContainerStarted","Data":"34cda6cc0db20c9bb9e3f0b6a9f50fb1cfa10b7feacf07169c0a7568be76305e"} Dec 02 09:36:07 crc kubenswrapper[4781]: I1202 09:36:07.448651 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-99vk9" podStartSLOduration=2.108940733 podStartE2EDuration="5.448618931s" podCreationTimestamp="2025-12-02 09:36:02 +0000 UTC" firstStartedPulling="2025-12-02 09:36:03.473910368 +0000 UTC m=+926.297784247" lastFinishedPulling="2025-12-02 09:36:06.813588576 +0000 UTC m=+929.637462445" observedRunningTime="2025-12-02 09:36:07.443336579 +0000 UTC m=+930.267210488" watchObservedRunningTime="2025-12-02 09:36:07.448618931 +0000 UTC m=+930.272492840" Dec 02 09:36:07 crc kubenswrapper[4781]: I1202 09:36:07.657838 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gszsj" Dec 02 09:36:07 crc kubenswrapper[4781]: I1202 09:36:07.657944 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gszsj" Dec 02 09:36:07 crc kubenswrapper[4781]: I1202 09:36:07.706860 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gszsj" Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.466466 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-b4bs8"] Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.467509 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-b4bs8" Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.469539 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-z6r4l" Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.487523 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-dq5sv"] Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.488771 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-dq5sv" Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.492407 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.493848 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-b4bs8"] Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.496366 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gszsj" Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.503490 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-h2jq9"] Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.504382 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-h2jq9" Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.524633 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-dq5sv"] Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.577258 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gc6bq\" (UniqueName: \"kubernetes.io/projected/71f173d9-47d0-4576-991f-0eeffb003596-kube-api-access-gc6bq\") pod \"nmstate-metrics-7f946cbc9-b4bs8\" (UID: \"71f173d9-47d0-4576-991f-0eeffb003596\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-b4bs8" Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.610752 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-xqj9x"] Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.611575 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-xqj9x" Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.620414 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-xqj9x"] Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.623874 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-ngswp" Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.624283 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.624503 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.678728 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/acd034d2-af9c-49ff-a584-f7f0ef482c10-dbus-socket\") pod \"nmstate-handler-h2jq9\" (UID: \"acd034d2-af9c-49ff-a584-f7f0ef482c10\") " pod="openshift-nmstate/nmstate-handler-h2jq9" Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.678818 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gc6bq\" (UniqueName: \"kubernetes.io/projected/71f173d9-47d0-4576-991f-0eeffb003596-kube-api-access-gc6bq\") pod \"nmstate-metrics-7f946cbc9-b4bs8\" (UID: \"71f173d9-47d0-4576-991f-0eeffb003596\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-b4bs8" Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.678860 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/acd034d2-af9c-49ff-a584-f7f0ef482c10-ovs-socket\") pod \"nmstate-handler-h2jq9\" (UID: \"acd034d2-af9c-49ff-a584-f7f0ef482c10\") " pod="openshift-nmstate/nmstate-handler-h2jq9" Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.678888 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/acd034d2-af9c-49ff-a584-f7f0ef482c10-nmstate-lock\") pod \"nmstate-handler-h2jq9\" (UID: \"acd034d2-af9c-49ff-a584-f7f0ef482c10\") " pod="openshift-nmstate/nmstate-handler-h2jq9" Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.678913 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m45vf\" (UniqueName: \"kubernetes.io/projected/de15e7ea-d9ce-4713-bb63-87db8b3c5afd-kube-api-access-m45vf\") pod \"nmstate-webhook-5f6d4c5ccb-dq5sv\" (UID: \"de15e7ea-d9ce-4713-bb63-87db8b3c5afd\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-dq5sv" Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.679205 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq7bb\" (UniqueName: \"kubernetes.io/projected/acd034d2-af9c-49ff-a584-f7f0ef482c10-kube-api-access-lq7bb\") pod \"nmstate-handler-h2jq9\" (UID: \"acd034d2-af9c-49ff-a584-f7f0ef482c10\") " pod="openshift-nmstate/nmstate-handler-h2jq9" Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.679317 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/de15e7ea-d9ce-4713-bb63-87db8b3c5afd-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-dq5sv\" 
(UID: \"de15e7ea-d9ce-4713-bb63-87db8b3c5afd\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-dq5sv" Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.700763 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gc6bq\" (UniqueName: \"kubernetes.io/projected/71f173d9-47d0-4576-991f-0eeffb003596-kube-api-access-gc6bq\") pod \"nmstate-metrics-7f946cbc9-b4bs8\" (UID: \"71f173d9-47d0-4576-991f-0eeffb003596\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-b4bs8" Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.780131 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/de15e7ea-d9ce-4713-bb63-87db8b3c5afd-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-dq5sv\" (UID: \"de15e7ea-d9ce-4713-bb63-87db8b3c5afd\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-dq5sv" Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.780189 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/9f057938-ae0d-4ce5-a920-b34905dcab9a-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-xqj9x\" (UID: \"9f057938-ae0d-4ce5-a920-b34905dcab9a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-xqj9x" Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.780223 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/acd034d2-af9c-49ff-a584-f7f0ef482c10-dbus-socket\") pod \"nmstate-handler-h2jq9\" (UID: \"acd034d2-af9c-49ff-a584-f7f0ef482c10\") " pod="openshift-nmstate/nmstate-handler-h2jq9" Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.780247 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kwj7\" (UniqueName: \"kubernetes.io/projected/9f057938-ae0d-4ce5-a920-b34905dcab9a-kube-api-access-7kwj7\") pod \"nmstate-console-plugin-7fbb5f6569-xqj9x\" (UID: \"9f057938-ae0d-4ce5-a920-b34905dcab9a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-xqj9x" Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.780286 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/acd034d2-af9c-49ff-a584-f7f0ef482c10-ovs-socket\") pod \"nmstate-handler-h2jq9\" (UID: \"acd034d2-af9c-49ff-a584-f7f0ef482c10\") " pod="openshift-nmstate/nmstate-handler-h2jq9" Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.780308 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/acd034d2-af9c-49ff-a584-f7f0ef482c10-nmstate-lock\") pod \"nmstate-handler-h2jq9\" (UID: \"acd034d2-af9c-49ff-a584-f7f0ef482c10\") " pod="openshift-nmstate/nmstate-handler-h2jq9" Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.780327 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m45vf\" (UniqueName: \"kubernetes.io/projected/de15e7ea-d9ce-4713-bb63-87db8b3c5afd-kube-api-access-m45vf\") pod \"nmstate-webhook-5f6d4c5ccb-dq5sv\" (UID: \"de15e7ea-d9ce-4713-bb63-87db8b3c5afd\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-dq5sv" Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.780348 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/9f057938-ae0d-4ce5-a920-b34905dcab9a-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-xqj9x\" (UID: \"9f057938-ae0d-4ce5-a920-b34905dcab9a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-xqj9x" Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.780369 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq7bb\" (UniqueName: \"kubernetes.io/projected/acd034d2-af9c-49ff-a584-f7f0ef482c10-kube-api-access-lq7bb\") pod \"nmstate-handler-h2jq9\" (UID: \"acd034d2-af9c-49ff-a584-f7f0ef482c10\") " pod="openshift-nmstate/nmstate-handler-h2jq9" Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.780790 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/acd034d2-af9c-49ff-a584-f7f0ef482c10-nmstate-lock\") pod \"nmstate-handler-h2jq9\" (UID: \"acd034d2-af9c-49ff-a584-f7f0ef482c10\") " pod="openshift-nmstate/nmstate-handler-h2jq9" Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.780874 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/acd034d2-af9c-49ff-a584-f7f0ef482c10-ovs-socket\") pod \"nmstate-handler-h2jq9\" (UID: \"acd034d2-af9c-49ff-a584-f7f0ef482c10\") " pod="openshift-nmstate/nmstate-handler-h2jq9" Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.781033 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/acd034d2-af9c-49ff-a584-f7f0ef482c10-dbus-socket\") pod \"nmstate-handler-h2jq9\" (UID: \"acd034d2-af9c-49ff-a584-f7f0ef482c10\") " pod="openshift-nmstate/nmstate-handler-h2jq9" Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.783387 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-b4bs8" Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.783810 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/de15e7ea-d9ce-4713-bb63-87db8b3c5afd-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-dq5sv\" (UID: \"de15e7ea-d9ce-4713-bb63-87db8b3c5afd\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-dq5sv" Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.812633 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m45vf\" (UniqueName: \"kubernetes.io/projected/de15e7ea-d9ce-4713-bb63-87db8b3c5afd-kube-api-access-m45vf\") pod \"nmstate-webhook-5f6d4c5ccb-dq5sv\" (UID: \"de15e7ea-d9ce-4713-bb63-87db8b3c5afd\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-dq5sv" Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.815173 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-dq5sv" Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.818353 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq7bb\" (UniqueName: \"kubernetes.io/projected/acd034d2-af9c-49ff-a584-f7f0ef482c10-kube-api-access-lq7bb\") pod \"nmstate-handler-h2jq9\" (UID: \"acd034d2-af9c-49ff-a584-f7f0ef482c10\") " pod="openshift-nmstate/nmstate-handler-h2jq9" Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.824415 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-h2jq9" Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.826282 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-66d6b49755-vk46c"] Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.827237 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66d6b49755-vk46c" Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.853717 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66d6b49755-vk46c"] Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.882296 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/9f057938-ae0d-4ce5-a920-b34905dcab9a-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-xqj9x\" (UID: \"9f057938-ae0d-4ce5-a920-b34905dcab9a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-xqj9x" Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.882651 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kwj7\" (UniqueName: \"kubernetes.io/projected/9f057938-ae0d-4ce5-a920-b34905dcab9a-kube-api-access-7kwj7\") pod \"nmstate-console-plugin-7fbb5f6569-xqj9x\" (UID: \"9f057938-ae0d-4ce5-a920-b34905dcab9a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-xqj9x" Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.882829 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9f057938-ae0d-4ce5-a920-b34905dcab9a-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-xqj9x\" (UID: \"9f057938-ae0d-4ce5-a920-b34905dcab9a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-xqj9x" Dec 02 09:36:08 crc kubenswrapper[4781]: E1202 09:36:08.882476 4781 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Dec 02 09:36:08 crc kubenswrapper[4781]: E1202 09:36:08.883028 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f057938-ae0d-4ce5-a920-b34905dcab9a-plugin-serving-cert podName:9f057938-ae0d-4ce5-a920-b34905dcab9a nodeName:}" failed. No retries permitted until 2025-12-02 09:36:09.383007269 +0000 UTC m=+932.206881148 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/9f057938-ae0d-4ce5-a920-b34905dcab9a-plugin-serving-cert") pod "nmstate-console-plugin-7fbb5f6569-xqj9x" (UID: "9f057938-ae0d-4ce5-a920-b34905dcab9a") : secret "plugin-serving-cert" not found Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.884250 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9f057938-ae0d-4ce5-a920-b34905dcab9a-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-xqj9x\" (UID: \"9f057938-ae0d-4ce5-a920-b34905dcab9a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-xqj9x" Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.900850 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kwj7\" (UniqueName: \"kubernetes.io/projected/9f057938-ae0d-4ce5-a920-b34905dcab9a-kube-api-access-7kwj7\") pod \"nmstate-console-plugin-7fbb5f6569-xqj9x\" (UID: \"9f057938-ae0d-4ce5-a920-b34905dcab9a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-xqj9x" Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.984977 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/083e89f8-c2ae-42a1-8112-38744cf35f01-service-ca\") pod \"console-66d6b49755-vk46c\" (UID: \"083e89f8-c2ae-42a1-8112-38744cf35f01\") " pod="openshift-console/console-66d6b49755-vk46c" Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.985451 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/083e89f8-c2ae-42a1-8112-38744cf35f01-console-config\") pod \"console-66d6b49755-vk46c\" (UID: \"083e89f8-c2ae-42a1-8112-38744cf35f01\") " pod="openshift-console/console-66d6b49755-vk46c" Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.985528 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8299f\" (UniqueName: \"kubernetes.io/projected/083e89f8-c2ae-42a1-8112-38744cf35f01-kube-api-access-8299f\") pod \"console-66d6b49755-vk46c\" (UID: \"083e89f8-c2ae-42a1-8112-38744cf35f01\") " pod="openshift-console/console-66d6b49755-vk46c" Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.985571 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/083e89f8-c2ae-42a1-8112-38744cf35f01-console-serving-cert\") pod \"console-66d6b49755-vk46c\" (UID: \"083e89f8-c2ae-42a1-8112-38744cf35f01\") " pod="openshift-console/console-66d6b49755-vk46c" Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.985593 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/083e89f8-c2ae-42a1-8112-38744cf35f01-console-oauth-config\") pod \"console-66d6b49755-vk46c\" (UID: \"083e89f8-c2ae-42a1-8112-38744cf35f01\") " pod="openshift-console/console-66d6b49755-vk46c" Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.985613 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/083e89f8-c2ae-42a1-8112-38744cf35f01-trusted-ca-bundle\") pod \"console-66d6b49755-vk46c\" (UID: \"083e89f8-c2ae-42a1-8112-38744cf35f01\") " 
pod="openshift-console/console-66d6b49755-vk46c" Dec 02 09:36:08 crc kubenswrapper[4781]: I1202 09:36:08.985635 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/083e89f8-c2ae-42a1-8112-38744cf35f01-oauth-serving-cert\") pod \"console-66d6b49755-vk46c\" (UID: \"083e89f8-c2ae-42a1-8112-38744cf35f01\") " pod="openshift-console/console-66d6b49755-vk46c" Dec 02 09:36:09 crc kubenswrapper[4781]: I1202 09:36:09.045776 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-b4bs8"] Dec 02 09:36:09 crc kubenswrapper[4781]: W1202 09:36:09.051295 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71f173d9_47d0_4576_991f_0eeffb003596.slice/crio-84449873fd8b9e2f6ac186963999449a4c52e89c10e84a74e9280d0c80b56e97 WatchSource:0}: Error finding container 84449873fd8b9e2f6ac186963999449a4c52e89c10e84a74e9280d0c80b56e97: Status 404 returned error can't find the container with id 84449873fd8b9e2f6ac186963999449a4c52e89c10e84a74e9280d0c80b56e97 Dec 02 09:36:09 crc kubenswrapper[4781]: I1202 09:36:09.087477 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/083e89f8-c2ae-42a1-8112-38744cf35f01-console-serving-cert\") pod \"console-66d6b49755-vk46c\" (UID: \"083e89f8-c2ae-42a1-8112-38744cf35f01\") " pod="openshift-console/console-66d6b49755-vk46c" Dec 02 09:36:09 crc kubenswrapper[4781]: I1202 09:36:09.087603 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/083e89f8-c2ae-42a1-8112-38744cf35f01-console-oauth-config\") pod \"console-66d6b49755-vk46c\" (UID: \"083e89f8-c2ae-42a1-8112-38744cf35f01\") " pod="openshift-console/console-66d6b49755-vk46c" Dec 02 09:36:09 crc kubenswrapper[4781]: I1202 09:36:09.087644 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/083e89f8-c2ae-42a1-8112-38744cf35f01-trusted-ca-bundle\") pod \"console-66d6b49755-vk46c\" (UID: \"083e89f8-c2ae-42a1-8112-38744cf35f01\") " pod="openshift-console/console-66d6b49755-vk46c" Dec 02 09:36:09 crc kubenswrapper[4781]: I1202 09:36:09.087708 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/083e89f8-c2ae-42a1-8112-38744cf35f01-oauth-serving-cert\") pod \"console-66d6b49755-vk46c\" (UID: \"083e89f8-c2ae-42a1-8112-38744cf35f01\") " pod="openshift-console/console-66d6b49755-vk46c" Dec 02 09:36:09 crc kubenswrapper[4781]: I1202 09:36:09.087893 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/083e89f8-c2ae-42a1-8112-38744cf35f01-service-ca\") pod \"console-66d6b49755-vk46c\" (UID: \"083e89f8-c2ae-42a1-8112-38744cf35f01\") " pod="openshift-console/console-66d6b49755-vk46c" Dec 02 09:36:09 crc kubenswrapper[4781]: I1202 09:36:09.087974 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/083e89f8-c2ae-42a1-8112-38744cf35f01-console-config\") pod \"console-66d6b49755-vk46c\" (UID: \"083e89f8-c2ae-42a1-8112-38744cf35f01\") " pod="openshift-console/console-66d6b49755-vk46c" Dec 02 09:36:09 
crc kubenswrapper[4781]: I1202 09:36:09.088113 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8299f\" (UniqueName: \"kubernetes.io/projected/083e89f8-c2ae-42a1-8112-38744cf35f01-kube-api-access-8299f\") pod \"console-66d6b49755-vk46c\" (UID: \"083e89f8-c2ae-42a1-8112-38744cf35f01\") " pod="openshift-console/console-66d6b49755-vk46c" Dec 02 09:36:09 crc kubenswrapper[4781]: I1202 09:36:09.087897 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-dq5sv"] Dec 02 09:36:09 crc kubenswrapper[4781]: I1202 09:36:09.089286 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/083e89f8-c2ae-42a1-8112-38744cf35f01-oauth-serving-cert\") pod \"console-66d6b49755-vk46c\" (UID: \"083e89f8-c2ae-42a1-8112-38744cf35f01\") " pod="openshift-console/console-66d6b49755-vk46c" Dec 02 09:36:09 crc kubenswrapper[4781]: I1202 09:36:09.089423 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/083e89f8-c2ae-42a1-8112-38744cf35f01-trusted-ca-bundle\") pod \"console-66d6b49755-vk46c\" (UID: \"083e89f8-c2ae-42a1-8112-38744cf35f01\") " pod="openshift-console/console-66d6b49755-vk46c" Dec 02 09:36:09 crc kubenswrapper[4781]: I1202 09:36:09.091253 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/083e89f8-c2ae-42a1-8112-38744cf35f01-service-ca\") pod \"console-66d6b49755-vk46c\" (UID: \"083e89f8-c2ae-42a1-8112-38744cf35f01\") " pod="openshift-console/console-66d6b49755-vk46c" Dec 02 09:36:09 crc kubenswrapper[4781]: I1202 09:36:09.093147 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/083e89f8-c2ae-42a1-8112-38744cf35f01-console-serving-cert\") pod \"console-66d6b49755-vk46c\" (UID: \"083e89f8-c2ae-42a1-8112-38744cf35f01\") " pod="openshift-console/console-66d6b49755-vk46c" Dec 02 09:36:09 crc kubenswrapper[4781]: I1202 09:36:09.093154 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/083e89f8-c2ae-42a1-8112-38744cf35f01-console-oauth-config\") pod \"console-66d6b49755-vk46c\" (UID: \"083e89f8-c2ae-42a1-8112-38744cf35f01\") " pod="openshift-console/console-66d6b49755-vk46c" Dec 02 09:36:09 crc kubenswrapper[4781]: I1202 09:36:09.094749 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/083e89f8-c2ae-42a1-8112-38744cf35f01-console-config\") pod \"console-66d6b49755-vk46c\" (UID: \"083e89f8-c2ae-42a1-8112-38744cf35f01\") " pod="openshift-console/console-66d6b49755-vk46c" Dec 02 09:36:09 crc kubenswrapper[4781]: I1202 09:36:09.107770 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8299f\" (UniqueName: \"kubernetes.io/projected/083e89f8-c2ae-42a1-8112-38744cf35f01-kube-api-access-8299f\") pod \"console-66d6b49755-vk46c\" (UID: \"083e89f8-c2ae-42a1-8112-38744cf35f01\") " pod="openshift-console/console-66d6b49755-vk46c" Dec 02 09:36:09 crc kubenswrapper[4781]: I1202 09:36:09.191341 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-66d6b49755-vk46c" Dec 02 09:36:09 crc kubenswrapper[4781]: I1202 09:36:09.382007 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66d6b49755-vk46c"] Dec 02 09:36:09 crc kubenswrapper[4781]: W1202 09:36:09.386821 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod083e89f8_c2ae_42a1_8112_38744cf35f01.slice/crio-2835da4ba894a3a7ba8332f1c66bceae5c2fd87220a8fef91f67c02ee079bd69 WatchSource:0}: Error finding container 2835da4ba894a3a7ba8332f1c66bceae5c2fd87220a8fef91f67c02ee079bd69: Status 404 returned error can't find the container with id 2835da4ba894a3a7ba8332f1c66bceae5c2fd87220a8fef91f67c02ee079bd69 Dec 02 09:36:09 crc kubenswrapper[4781]: I1202 09:36:09.392061 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/9f057938-ae0d-4ce5-a920-b34905dcab9a-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-xqj9x\" (UID: \"9f057938-ae0d-4ce5-a920-b34905dcab9a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-xqj9x" Dec 02 09:36:09 crc kubenswrapper[4781]: I1202 09:36:09.398578 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/9f057938-ae0d-4ce5-a920-b34905dcab9a-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-xqj9x\" (UID: \"9f057938-ae0d-4ce5-a920-b34905dcab9a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-xqj9x" Dec 02 09:36:09 crc kubenswrapper[4781]: I1202 09:36:09.451579 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-b4bs8" event={"ID":"71f173d9-47d0-4576-991f-0eeffb003596","Type":"ContainerStarted","Data":"84449873fd8b9e2f6ac186963999449a4c52e89c10e84a74e9280d0c80b56e97"} Dec 02 09:36:09 crc kubenswrapper[4781]: I1202 09:36:09.452498 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66d6b49755-vk46c" event={"ID":"083e89f8-c2ae-42a1-8112-38744cf35f01","Type":"ContainerStarted","Data":"2835da4ba894a3a7ba8332f1c66bceae5c2fd87220a8fef91f67c02ee079bd69"} Dec 02 09:36:09 crc kubenswrapper[4781]: I1202 09:36:09.453375 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-dq5sv" event={"ID":"de15e7ea-d9ce-4713-bb63-87db8b3c5afd","Type":"ContainerStarted","Data":"f6839d69311c7bd757503d87d0221f5dab23d9a0328bf884d64eb1965d303c2a"} Dec 02 09:36:09 crc kubenswrapper[4781]: I1202 09:36:09.455404 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-h2jq9" event={"ID":"acd034d2-af9c-49ff-a584-f7f0ef482c10","Type":"ContainerStarted","Data":"320d4715db323c18ad4deb608ac6a827e35425b10e6fd5e0e2199c09dc22e0c0"} Dec 02 09:36:09 crc kubenswrapper[4781]: I1202 09:36:09.524849 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-xqj9x" Dec 02 09:36:09 crc kubenswrapper[4781]: I1202 09:36:09.743605 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-xqj9x"] Dec 02 09:36:09 crc kubenswrapper[4781]: W1202 09:36:09.752100 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f057938_ae0d_4ce5_a920_b34905dcab9a.slice/crio-ca1846d539ed6b9921d88caae85af0a5203bc0da8aa87a777fdba5ab71b8480f WatchSource:0}: Error finding container ca1846d539ed6b9921d88caae85af0a5203bc0da8aa87a777fdba5ab71b8480f: Status 404 returned error can't find the container with id ca1846d539ed6b9921d88caae85af0a5203bc0da8aa87a777fdba5ab71b8480f Dec 02 09:36:09 crc kubenswrapper[4781]: I1202 09:36:09.827647 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gszsj"] Dec 02 09:36:10 crc kubenswrapper[4781]: I1202 09:36:10.462140 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-xqj9x" event={"ID":"9f057938-ae0d-4ce5-a920-b34905dcab9a","Type":"ContainerStarted","Data":"ca1846d539ed6b9921d88caae85af0a5203bc0da8aa87a777fdba5ab71b8480f"} Dec 02 09:36:10 crc kubenswrapper[4781]: I1202 09:36:10.463783 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66d6b49755-vk46c" event={"ID":"083e89f8-c2ae-42a1-8112-38744cf35f01","Type":"ContainerStarted","Data":"3dd0e43a71818094431bc7fe9e7c746c1c86f8687b70a20703e438bd353f54ee"} Dec 02 09:36:10 crc kubenswrapper[4781]: I1202 09:36:10.463887 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gszsj" podUID="10c35de0-81dc-4710-9a2e-17cf6e37ed2f" containerName="registry-server" containerID="cri-o://db1840359698fd4d213059895ebbb0f178bd7826b57f01b5dec60289c1d4e0d4" gracePeriod=2 Dec 02 09:36:10 crc kubenswrapper[4781]: I1202 09:36:10.483269 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-66d6b49755-vk46c" podStartSLOduration=2.483255515 podStartE2EDuration="2.483255515s" podCreationTimestamp="2025-12-02 09:36:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:36:10.4804482 +0000 UTC m=+933.304322099" watchObservedRunningTime="2025-12-02 09:36:10.483255515 +0000 UTC m=+933.307129394" Dec 02 09:36:12 crc kubenswrapper[4781]: I1202 09:36:12.483708 4781 generic.go:334] "Generic (PLEG): container finished" podID="10c35de0-81dc-4710-9a2e-17cf6e37ed2f" containerID="db1840359698fd4d213059895ebbb0f178bd7826b57f01b5dec60289c1d4e0d4" exitCode=0 Dec 02 09:36:12 crc kubenswrapper[4781]: I1202 09:36:12.483881 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gszsj" event={"ID":"10c35de0-81dc-4710-9a2e-17cf6e37ed2f","Type":"ContainerDied","Data":"db1840359698fd4d213059895ebbb0f178bd7826b57f01b5dec60289c1d4e0d4"} Dec 02 09:36:12 crc kubenswrapper[4781]: I1202 09:36:12.644258 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gszsj" Dec 02 09:36:12 crc kubenswrapper[4781]: I1202 09:36:12.729443 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10c35de0-81dc-4710-9a2e-17cf6e37ed2f-catalog-content\") pod \"10c35de0-81dc-4710-9a2e-17cf6e37ed2f\" (UID: \"10c35de0-81dc-4710-9a2e-17cf6e37ed2f\") " Dec 02 09:36:12 crc kubenswrapper[4781]: I1202 09:36:12.729621 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10c35de0-81dc-4710-9a2e-17cf6e37ed2f-utilities\") pod \"10c35de0-81dc-4710-9a2e-17cf6e37ed2f\" (UID: \"10c35de0-81dc-4710-9a2e-17cf6e37ed2f\") " Dec 02 09:36:12 crc kubenswrapper[4781]: I1202 09:36:12.729668 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nkn9\" (UniqueName: \"kubernetes.io/projected/10c35de0-81dc-4710-9a2e-17cf6e37ed2f-kube-api-access-9nkn9\") pod \"10c35de0-81dc-4710-9a2e-17cf6e37ed2f\" (UID: \"10c35de0-81dc-4710-9a2e-17cf6e37ed2f\") " Dec 02 09:36:12 crc kubenswrapper[4781]: I1202 09:36:12.732119 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10c35de0-81dc-4710-9a2e-17cf6e37ed2f-utilities" (OuterVolumeSpecName: "utilities") pod "10c35de0-81dc-4710-9a2e-17cf6e37ed2f" (UID: "10c35de0-81dc-4710-9a2e-17cf6e37ed2f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:36:12 crc kubenswrapper[4781]: I1202 09:36:12.734940 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10c35de0-81dc-4710-9a2e-17cf6e37ed2f-kube-api-access-9nkn9" (OuterVolumeSpecName: "kube-api-access-9nkn9") pod "10c35de0-81dc-4710-9a2e-17cf6e37ed2f" (UID: "10c35de0-81dc-4710-9a2e-17cf6e37ed2f"). InnerVolumeSpecName "kube-api-access-9nkn9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:36:12 crc kubenswrapper[4781]: I1202 09:36:12.831476 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10c35de0-81dc-4710-9a2e-17cf6e37ed2f-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 09:36:12 crc kubenswrapper[4781]: I1202 09:36:12.831505 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nkn9\" (UniqueName: \"kubernetes.io/projected/10c35de0-81dc-4710-9a2e-17cf6e37ed2f-kube-api-access-9nkn9\") on node \"crc\" DevicePath \"\"" Dec 02 09:36:12 crc kubenswrapper[4781]: I1202 09:36:12.844698 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10c35de0-81dc-4710-9a2e-17cf6e37ed2f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "10c35de0-81dc-4710-9a2e-17cf6e37ed2f" (UID: "10c35de0-81dc-4710-9a2e-17cf6e37ed2f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:36:12 crc kubenswrapper[4781]: I1202 09:36:12.933057 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10c35de0-81dc-4710-9a2e-17cf6e37ed2f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 09:36:13 crc kubenswrapper[4781]: I1202 09:36:13.492762 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gszsj" event={"ID":"10c35de0-81dc-4710-9a2e-17cf6e37ed2f","Type":"ContainerDied","Data":"9cd58f6dc0a0ec470ef76dcc2c1a9a4ac07eb41e2b1a444ff62042148205c8f2"} Dec 02 09:36:13 crc kubenswrapper[4781]: I1202 09:36:13.492788 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gszsj" Dec 02 09:36:13 crc kubenswrapper[4781]: I1202 09:36:13.492815 4781 scope.go:117] "RemoveContainer" containerID="db1840359698fd4d213059895ebbb0f178bd7826b57f01b5dec60289c1d4e0d4" Dec 02 09:36:13 crc kubenswrapper[4781]: I1202 09:36:13.495466 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-xqj9x" event={"ID":"9f057938-ae0d-4ce5-a920-b34905dcab9a","Type":"ContainerStarted","Data":"50cd8f184a1d92e8d22fff3a98a55477ab58fd3509e03cf53fe1536c575ee077"} Dec 02 09:36:13 crc kubenswrapper[4781]: I1202 09:36:13.515397 4781 scope.go:117] "RemoveContainer" containerID="71c673cebca85746b28a6bef11117463af7e80b18522595d105e4e895918d6b8" Dec 02 09:36:13 crc kubenswrapper[4781]: I1202 09:36:13.517375 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-dq5sv" event={"ID":"de15e7ea-d9ce-4713-bb63-87db8b3c5afd","Type":"ContainerStarted","Data":"12e18822a3caa30e914cfa8d03e2b8c7b0d4fa17b9d570c5d72004b52b013e5b"} Dec 02 09:36:13 crc kubenswrapper[4781]: I1202 09:36:13.517405 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-h2jq9" event={"ID":"acd034d2-af9c-49ff-a584-f7f0ef482c10","Type":"ContainerStarted","Data":"899cd28d4aecf5689d48e31246cd17a82221b23a1cb989738ff323077197a04f"} Dec 02 09:36:13 crc kubenswrapper[4781]: I1202 09:36:13.517419 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-dq5sv" Dec 02 09:36:13 crc kubenswrapper[4781]: I1202 09:36:13.517429 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-b4bs8" event={"ID":"71f173d9-47d0-4576-991f-0eeffb003596","Type":"ContainerStarted","Data":"f5a54f9e6adc1576886907f4a1d359252d69e62cf85316cb8b52dc613a470da2"} Dec 02 09:36:13 crc kubenswrapper[4781]: I1202 09:36:13.517441 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-h2jq9" Dec 02 09:36:13 crc kubenswrapper[4781]: I1202 09:36:13.518182 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-xqj9x" podStartSLOduration=2.565154231 podStartE2EDuration="5.518169756s" podCreationTimestamp="2025-12-02 09:36:08 +0000 UTC" firstStartedPulling="2025-12-02 09:36:09.753831606 +0000 UTC m=+932.577705485" lastFinishedPulling="2025-12-02 09:36:12.706847131 +0000 UTC m=+935.530721010" observedRunningTime="2025-12-02 09:36:13.510139849 +0000 UTC m=+936.334013738" watchObservedRunningTime="2025-12-02 09:36:13.518169756 +0000 UTC m=+936.342043645" Dec 02 09:36:13 crc kubenswrapper[4781]: 
I1202 09:36:13.536061 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-dq5sv" podStartSLOduration=1.9277880330000001 podStartE2EDuration="5.53603203s" podCreationTimestamp="2025-12-02 09:36:08 +0000 UTC" firstStartedPulling="2025-12-02 09:36:09.103185547 +0000 UTC m=+931.927059436" lastFinishedPulling="2025-12-02 09:36:12.711429554 +0000 UTC m=+935.535303433" observedRunningTime="2025-12-02 09:36:13.533810399 +0000 UTC m=+936.357684328" watchObservedRunningTime="2025-12-02 09:36:13.53603203 +0000 UTC m=+936.359905949" Dec 02 09:36:13 crc kubenswrapper[4781]: I1202 09:36:13.549104 4781 scope.go:117] "RemoveContainer" containerID="cfed86336a5497dd7d0f028f0e512f6c3ec9842c6467b4d3bb33d4dc2455026f" Dec 02 09:36:13 crc kubenswrapper[4781]: I1202 09:36:13.557070 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-h2jq9" podStartSLOduration=1.743994349 podStartE2EDuration="5.556973357s" podCreationTimestamp="2025-12-02 09:36:08 +0000 UTC" firstStartedPulling="2025-12-02 09:36:08.892380192 +0000 UTC m=+931.716254071" lastFinishedPulling="2025-12-02 09:36:12.70535921 +0000 UTC m=+935.529233079" observedRunningTime="2025-12-02 09:36:13.550422269 +0000 UTC m=+936.374296168" watchObservedRunningTime="2025-12-02 09:36:13.556973357 +0000 UTC m=+936.380847276" Dec 02 09:36:13 crc kubenswrapper[4781]: I1202 09:36:13.574714 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gszsj"] Dec 02 09:36:13 crc kubenswrapper[4781]: I1202 09:36:13.574766 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gszsj"] Dec 02 09:36:15 crc kubenswrapper[4781]: I1202 09:36:15.508657 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10c35de0-81dc-4710-9a2e-17cf6e37ed2f" path="/var/lib/kubelet/pods/10c35de0-81dc-4710-9a2e-17cf6e37ed2f/volumes" Dec 02 09:36:16 crc kubenswrapper[4781]: I1202 09:36:16.527367 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-b4bs8" event={"ID":"71f173d9-47d0-4576-991f-0eeffb003596","Type":"ContainerStarted","Data":"b9953b0d34b36ba60dd899b6c3f4d932dd1cddd0c3198cbcddfafe7586f17938"} Dec 02 09:36:16 crc kubenswrapper[4781]: I1202 09:36:16.550408 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-b4bs8" podStartSLOduration=1.945437351 podStartE2EDuration="8.550378735s" podCreationTimestamp="2025-12-02 09:36:08 +0000 UTC" firstStartedPulling="2025-12-02 09:36:09.053972446 +0000 UTC m=+931.877846315" lastFinishedPulling="2025-12-02 09:36:15.65891383 +0000 UTC m=+938.482787699" observedRunningTime="2025-12-02 09:36:16.544242718 +0000 UTC m=+939.368116617" watchObservedRunningTime="2025-12-02 09:36:16.550378735 +0000 UTC m=+939.374252624" Dec 02 09:36:18 crc kubenswrapper[4781]: I1202 09:36:18.856651 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-h2jq9" Dec 02 09:36:19 crc kubenswrapper[4781]: I1202 09:36:19.192297 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-66d6b49755-vk46c" Dec 02 09:36:19 crc kubenswrapper[4781]: I1202 09:36:19.192692 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-66d6b49755-vk46c" Dec 02 09:36:19 crc kubenswrapper[4781]: I1202 
09:36:19.197438 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-66d6b49755-vk46c" Dec 02 09:36:19 crc kubenswrapper[4781]: I1202 09:36:19.553630 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-66d6b49755-vk46c" Dec 02 09:36:19 crc kubenswrapper[4781]: I1202 09:36:19.630400 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-krhf5"] Dec 02 09:36:28 crc kubenswrapper[4781]: I1202 09:36:28.823349 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-dq5sv" Dec 02 09:36:30 crc kubenswrapper[4781]: I1202 09:36:30.412567 4781 patch_prober.go:28] interesting pod/machine-config-daemon-pzntm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 09:36:30 crc kubenswrapper[4781]: I1202 09:36:30.412892 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 09:36:42 crc kubenswrapper[4781]: I1202 09:36:42.529249 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836fnxt"] Dec 02 09:36:42 crc kubenswrapper[4781]: E1202 09:36:42.530273 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10c35de0-81dc-4710-9a2e-17cf6e37ed2f" containerName="extract-content" Dec 02 09:36:42 crc kubenswrapper[4781]: I1202 09:36:42.530289 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="10c35de0-81dc-4710-9a2e-17cf6e37ed2f" containerName="extract-content" Dec 02 09:36:42 crc kubenswrapper[4781]: E1202 09:36:42.530311 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10c35de0-81dc-4710-9a2e-17cf6e37ed2f" containerName="extract-utilities" Dec 02 09:36:42 crc kubenswrapper[4781]: I1202 09:36:42.530319 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="10c35de0-81dc-4710-9a2e-17cf6e37ed2f" containerName="extract-utilities" Dec 02 09:36:42 crc kubenswrapper[4781]: E1202 09:36:42.530340 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10c35de0-81dc-4710-9a2e-17cf6e37ed2f" containerName="registry-server" Dec 02 09:36:42 crc kubenswrapper[4781]: I1202 09:36:42.530348 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="10c35de0-81dc-4710-9a2e-17cf6e37ed2f" containerName="registry-server" Dec 02 09:36:42 crc kubenswrapper[4781]: I1202 09:36:42.530457 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="10c35de0-81dc-4710-9a2e-17cf6e37ed2f" containerName="registry-server" Dec 02 09:36:42 crc kubenswrapper[4781]: I1202 09:36:42.531501 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836fnxt" Dec 02 09:36:42 crc kubenswrapper[4781]: I1202 09:36:42.533773 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 02 09:36:42 crc kubenswrapper[4781]: I1202 09:36:42.541111 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836fnxt"] Dec 02 09:36:42 crc kubenswrapper[4781]: I1202 09:36:42.665407 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a30d0113-e2bc-4a14-b7b0-49363876c6b2-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836fnxt\" (UID: \"a30d0113-e2bc-4a14-b7b0-49363876c6b2\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836fnxt" Dec 02 09:36:42 crc kubenswrapper[4781]: I1202 09:36:42.665483 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvtdx\" (UniqueName: \"kubernetes.io/projected/a30d0113-e2bc-4a14-b7b0-49363876c6b2-kube-api-access-nvtdx\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836fnxt\" (UID: \"a30d0113-e2bc-4a14-b7b0-49363876c6b2\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836fnxt" Dec 02 09:36:42 crc kubenswrapper[4781]: I1202 09:36:42.665524 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a30d0113-e2bc-4a14-b7b0-49363876c6b2-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836fnxt\" (UID: \"a30d0113-e2bc-4a14-b7b0-49363876c6b2\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836fnxt" Dec 02 09:36:42 crc kubenswrapper[4781]: I1202 09:36:42.766876 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a30d0113-e2bc-4a14-b7b0-49363876c6b2-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836fnxt\" (UID: \"a30d0113-e2bc-4a14-b7b0-49363876c6b2\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836fnxt" Dec 02 09:36:42 crc kubenswrapper[4781]: I1202 09:36:42.767267 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvtdx\" (UniqueName: \"kubernetes.io/projected/a30d0113-e2bc-4a14-b7b0-49363876c6b2-kube-api-access-nvtdx\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836fnxt\" (UID: \"a30d0113-e2bc-4a14-b7b0-49363876c6b2\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836fnxt" Dec 02 09:36:42 crc kubenswrapper[4781]: I1202 09:36:42.767312 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a30d0113-e2bc-4a14-b7b0-49363876c6b2-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836fnxt\" (UID: \"a30d0113-e2bc-4a14-b7b0-49363876c6b2\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836fnxt" Dec 02 09:36:42 crc kubenswrapper[4781]: I1202 09:36:42.767349 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/a30d0113-e2bc-4a14-b7b0-49363876c6b2-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836fnxt\" (UID: \"a30d0113-e2bc-4a14-b7b0-49363876c6b2\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836fnxt" Dec 02 09:36:42 crc kubenswrapper[4781]: I1202 09:36:42.768044 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a30d0113-e2bc-4a14-b7b0-49363876c6b2-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836fnxt\" (UID: \"a30d0113-e2bc-4a14-b7b0-49363876c6b2\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836fnxt" Dec 02 09:36:42 crc kubenswrapper[4781]: I1202 09:36:42.788754 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvtdx\" (UniqueName: \"kubernetes.io/projected/a30d0113-e2bc-4a14-b7b0-49363876c6b2-kube-api-access-nvtdx\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836fnxt\" (UID: \"a30d0113-e2bc-4a14-b7b0-49363876c6b2\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836fnxt" Dec 02 09:36:42 crc kubenswrapper[4781]: I1202 09:36:42.850078 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836fnxt" Dec 02 09:36:43 crc kubenswrapper[4781]: I1202 09:36:43.067462 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836fnxt"] Dec 02 09:36:43 crc kubenswrapper[4781]: I1202 09:36:43.735064 4781 generic.go:334] "Generic (PLEG): container finished" podID="a30d0113-e2bc-4a14-b7b0-49363876c6b2" containerID="3c8462a0559cd312fb5653545f762721beaeb3391b9de8c8b205e913bd13029c" exitCode=0 Dec 02 09:36:43 crc kubenswrapper[4781]: I1202 09:36:43.735193 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836fnxt" event={"ID":"a30d0113-e2bc-4a14-b7b0-49363876c6b2","Type":"ContainerDied","Data":"3c8462a0559cd312fb5653545f762721beaeb3391b9de8c8b205e913bd13029c"} Dec 02 09:36:43 crc kubenswrapper[4781]: I1202 09:36:43.735420 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836fnxt" event={"ID":"a30d0113-e2bc-4a14-b7b0-49363876c6b2","Type":"ContainerStarted","Data":"234e11dd19e7a8177e6a0123eef0382bbdf436a4842603b0eefbb683dd72da40"} Dec 02 09:36:44 crc kubenswrapper[4781]: I1202 09:36:44.684245 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-krhf5" podUID="f87709bf-590d-4318-abdb-7ecb8a86e303" containerName="console" containerID="cri-o://89cd9afe5ba4030ad382157d1ed2db54bcc27a0cc186aadbef63b7f471db47b1" gracePeriod=15 Dec 02 09:36:45 crc kubenswrapper[4781]: I1202 09:36:45.554897 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-krhf5_f87709bf-590d-4318-abdb-7ecb8a86e303/console/0.log" Dec 02 09:36:45 crc kubenswrapper[4781]: I1202 09:36:45.555291 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-krhf5" Dec 02 09:36:45 crc kubenswrapper[4781]: I1202 09:36:45.612513 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f87709bf-590d-4318-abdb-7ecb8a86e303-trusted-ca-bundle\") pod \"f87709bf-590d-4318-abdb-7ecb8a86e303\" (UID: \"f87709bf-590d-4318-abdb-7ecb8a86e303\") " Dec 02 09:36:45 crc kubenswrapper[4781]: I1202 09:36:45.612567 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f87709bf-590d-4318-abdb-7ecb8a86e303-service-ca\") pod \"f87709bf-590d-4318-abdb-7ecb8a86e303\" (UID: \"f87709bf-590d-4318-abdb-7ecb8a86e303\") " Dec 02 09:36:45 crc kubenswrapper[4781]: I1202 09:36:45.612616 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f87709bf-590d-4318-abdb-7ecb8a86e303-console-oauth-config\") pod \"f87709bf-590d-4318-abdb-7ecb8a86e303\" (UID: \"f87709bf-590d-4318-abdb-7ecb8a86e303\") " Dec 02 09:36:45 crc kubenswrapper[4781]: I1202 09:36:45.612691 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f87709bf-590d-4318-abdb-7ecb8a86e303-console-serving-cert\") pod \"f87709bf-590d-4318-abdb-7ecb8a86e303\" (UID: \"f87709bf-590d-4318-abdb-7ecb8a86e303\") " Dec 02 09:36:45 crc kubenswrapper[4781]: I1202 09:36:45.612723 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f87709bf-590d-4318-abdb-7ecb8a86e303-oauth-serving-cert\") pod \"f87709bf-590d-4318-abdb-7ecb8a86e303\" (UID: \"f87709bf-590d-4318-abdb-7ecb8a86e303\") " Dec 02 09:36:45 crc kubenswrapper[4781]: I1202 09:36:45.612746 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f87709bf-590d-4318-abdb-7ecb8a86e303-console-config\") pod \"f87709bf-590d-4318-abdb-7ecb8a86e303\" (UID: \"f87709bf-590d-4318-abdb-7ecb8a86e303\") " Dec 02 09:36:45 crc kubenswrapper[4781]: I1202 09:36:45.612822 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvnlc\" (UniqueName: \"kubernetes.io/projected/f87709bf-590d-4318-abdb-7ecb8a86e303-kube-api-access-hvnlc\") pod \"f87709bf-590d-4318-abdb-7ecb8a86e303\" (UID: \"f87709bf-590d-4318-abdb-7ecb8a86e303\") " Dec 02 09:36:45 crc kubenswrapper[4781]: I1202 09:36:45.613594 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f87709bf-590d-4318-abdb-7ecb8a86e303-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f87709bf-590d-4318-abdb-7ecb8a86e303" (UID: "f87709bf-590d-4318-abdb-7ecb8a86e303"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:36:45 crc kubenswrapper[4781]: I1202 09:36:45.613672 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f87709bf-590d-4318-abdb-7ecb8a86e303-service-ca" (OuterVolumeSpecName: "service-ca") pod "f87709bf-590d-4318-abdb-7ecb8a86e303" (UID: "f87709bf-590d-4318-abdb-7ecb8a86e303"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:36:45 crc kubenswrapper[4781]: I1202 09:36:45.613700 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f87709bf-590d-4318-abdb-7ecb8a86e303-console-config" (OuterVolumeSpecName: "console-config") pod "f87709bf-590d-4318-abdb-7ecb8a86e303" (UID: "f87709bf-590d-4318-abdb-7ecb8a86e303"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:36:45 crc kubenswrapper[4781]: I1202 09:36:45.614220 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f87709bf-590d-4318-abdb-7ecb8a86e303-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f87709bf-590d-4318-abdb-7ecb8a86e303" (UID: "f87709bf-590d-4318-abdb-7ecb8a86e303"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:36:45 crc kubenswrapper[4781]: I1202 09:36:45.618520 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f87709bf-590d-4318-abdb-7ecb8a86e303-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f87709bf-590d-4318-abdb-7ecb8a86e303" (UID: "f87709bf-590d-4318-abdb-7ecb8a86e303"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:36:45 crc kubenswrapper[4781]: I1202 09:36:45.618547 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f87709bf-590d-4318-abdb-7ecb8a86e303-kube-api-access-hvnlc" (OuterVolumeSpecName: "kube-api-access-hvnlc") pod "f87709bf-590d-4318-abdb-7ecb8a86e303" (UID: "f87709bf-590d-4318-abdb-7ecb8a86e303"). InnerVolumeSpecName "kube-api-access-hvnlc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:36:45 crc kubenswrapper[4781]: I1202 09:36:45.619194 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f87709bf-590d-4318-abdb-7ecb8a86e303-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f87709bf-590d-4318-abdb-7ecb8a86e303" (UID: "f87709bf-590d-4318-abdb-7ecb8a86e303"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:36:45 crc kubenswrapper[4781]: I1202 09:36:45.713836 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvnlc\" (UniqueName: \"kubernetes.io/projected/f87709bf-590d-4318-abdb-7ecb8a86e303-kube-api-access-hvnlc\") on node \"crc\" DevicePath \"\"" Dec 02 09:36:45 crc kubenswrapper[4781]: I1202 09:36:45.713881 4781 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f87709bf-590d-4318-abdb-7ecb8a86e303-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:36:45 crc kubenswrapper[4781]: I1202 09:36:45.713890 4781 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f87709bf-590d-4318-abdb-7ecb8a86e303-service-ca\") on node \"crc\" DevicePath \"\"" Dec 02 09:36:45 crc kubenswrapper[4781]: I1202 09:36:45.713899 4781 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f87709bf-590d-4318-abdb-7ecb8a86e303-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:36:45 crc kubenswrapper[4781]: I1202 09:36:45.713907 4781 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f87709bf-590d-4318-abdb-7ecb8a86e303-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 09:36:45 crc kubenswrapper[4781]: I1202 09:36:45.713915 4781 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f87709bf-590d-4318-abdb-7ecb8a86e303-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 09:36:45 crc kubenswrapper[4781]: I1202 09:36:45.713941 4781 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f87709bf-590d-4318-abdb-7ecb8a86e303-console-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:36:45 crc kubenswrapper[4781]: I1202 09:36:45.748910 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-krhf5_f87709bf-590d-4318-abdb-7ecb8a86e303/console/0.log" Dec 02 09:36:45 crc kubenswrapper[4781]: I1202 09:36:45.748993 4781 generic.go:334] "Generic (PLEG): container finished" podID="f87709bf-590d-4318-abdb-7ecb8a86e303" containerID="89cd9afe5ba4030ad382157d1ed2db54bcc27a0cc186aadbef63b7f471db47b1" exitCode=2 Dec 02 09:36:45 crc kubenswrapper[4781]: I1202 09:36:45.749031 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-krhf5" event={"ID":"f87709bf-590d-4318-abdb-7ecb8a86e303","Type":"ContainerDied","Data":"89cd9afe5ba4030ad382157d1ed2db54bcc27a0cc186aadbef63b7f471db47b1"} Dec 02 09:36:45 crc kubenswrapper[4781]: I1202 09:36:45.749065 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-krhf5" event={"ID":"f87709bf-590d-4318-abdb-7ecb8a86e303","Type":"ContainerDied","Data":"2f59cb79d7444fe7bdcf62475d54e7a5fec4e3054a42f8373566ab4764a318ce"} Dec 02 09:36:45 crc kubenswrapper[4781]: I1202 09:36:45.749089 4781 scope.go:117] "RemoveContainer" containerID="89cd9afe5ba4030ad382157d1ed2db54bcc27a0cc186aadbef63b7f471db47b1" Dec 02 09:36:45 crc kubenswrapper[4781]: I1202 09:36:45.749127 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-krhf5"
Dec 02 09:36:45 crc kubenswrapper[4781]: I1202 09:36:45.767987 4781 scope.go:117] "RemoveContainer" containerID="89cd9afe5ba4030ad382157d1ed2db54bcc27a0cc186aadbef63b7f471db47b1"
Dec 02 09:36:45 crc kubenswrapper[4781]: E1202 09:36:45.768625 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89cd9afe5ba4030ad382157d1ed2db54bcc27a0cc186aadbef63b7f471db47b1\": container with ID starting with 89cd9afe5ba4030ad382157d1ed2db54bcc27a0cc186aadbef63b7f471db47b1 not found: ID does not exist" containerID="89cd9afe5ba4030ad382157d1ed2db54bcc27a0cc186aadbef63b7f471db47b1"
Dec 02 09:36:45 crc kubenswrapper[4781]: I1202 09:36:45.768668 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89cd9afe5ba4030ad382157d1ed2db54bcc27a0cc186aadbef63b7f471db47b1"} err="failed to get container status \"89cd9afe5ba4030ad382157d1ed2db54bcc27a0cc186aadbef63b7f471db47b1\": rpc error: code = NotFound desc = could not find container \"89cd9afe5ba4030ad382157d1ed2db54bcc27a0cc186aadbef63b7f471db47b1\": container with ID starting with 89cd9afe5ba4030ad382157d1ed2db54bcc27a0cc186aadbef63b7f471db47b1 not found: ID does not exist"
Dec 02 09:36:45 crc kubenswrapper[4781]: I1202 09:36:45.798694 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-krhf5"]
Dec 02 09:36:45 crc kubenswrapper[4781]: I1202 09:36:45.812013 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-krhf5"]
Dec 02 09:36:46 crc kubenswrapper[4781]: I1202 09:36:46.761144 4781 generic.go:334] "Generic (PLEG): container finished" podID="a30d0113-e2bc-4a14-b7b0-49363876c6b2" containerID="f0ac60a813b094846572270d252fb52c6ddb62f95b0d947764da098f51a74195" exitCode=0
Dec 02 09:36:46 crc kubenswrapper[4781]: I1202 09:36:46.761208 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836fnxt" event={"ID":"a30d0113-e2bc-4a14-b7b0-49363876c6b2","Type":"ContainerDied","Data":"f0ac60a813b094846572270d252fb52c6ddb62f95b0d947764da098f51a74195"}
Dec 02 09:36:47 crc kubenswrapper[4781]: I1202 09:36:47.510892 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f87709bf-590d-4318-abdb-7ecb8a86e303" path="/var/lib/kubelet/pods/f87709bf-590d-4318-abdb-7ecb8a86e303/volumes"
Dec 02 09:36:51 crc kubenswrapper[4781]: I1202 09:36:51.809986 4781 generic.go:334] "Generic (PLEG): container finished" podID="a30d0113-e2bc-4a14-b7b0-49363876c6b2" containerID="f50a62c4f9d4c33f4696ebc4ebccae97099a06037380bee165a5f57507f966ee" exitCode=0
Dec 02 09:36:51 crc kubenswrapper[4781]: I1202 09:36:51.810062 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836fnxt" event={"ID":"a30d0113-e2bc-4a14-b7b0-49363876c6b2","Type":"ContainerDied","Data":"f50a62c4f9d4c33f4696ebc4ebccae97099a06037380bee165a5f57507f966ee"}
Dec 02 09:36:53 crc kubenswrapper[4781]: I1202 09:36:53.064049 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836fnxt"
Dec 02 09:36:53 crc kubenswrapper[4781]: I1202 09:36:53.107378 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvtdx\" (UniqueName: \"kubernetes.io/projected/a30d0113-e2bc-4a14-b7b0-49363876c6b2-kube-api-access-nvtdx\") pod \"a30d0113-e2bc-4a14-b7b0-49363876c6b2\" (UID: \"a30d0113-e2bc-4a14-b7b0-49363876c6b2\") "
Dec 02 09:36:53 crc kubenswrapper[4781]: I1202 09:36:53.107515 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a30d0113-e2bc-4a14-b7b0-49363876c6b2-util\") pod \"a30d0113-e2bc-4a14-b7b0-49363876c6b2\" (UID: \"a30d0113-e2bc-4a14-b7b0-49363876c6b2\") "
Dec 02 09:36:53 crc kubenswrapper[4781]: I1202 09:36:53.107569 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a30d0113-e2bc-4a14-b7b0-49363876c6b2-bundle\") pod \"a30d0113-e2bc-4a14-b7b0-49363876c6b2\" (UID: \"a30d0113-e2bc-4a14-b7b0-49363876c6b2\") "
Dec 02 09:36:53 crc kubenswrapper[4781]: I1202 09:36:53.108628 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a30d0113-e2bc-4a14-b7b0-49363876c6b2-bundle" (OuterVolumeSpecName: "bundle") pod "a30d0113-e2bc-4a14-b7b0-49363876c6b2" (UID: "a30d0113-e2bc-4a14-b7b0-49363876c6b2"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 09:36:53 crc kubenswrapper[4781]: I1202 09:36:53.116557 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a30d0113-e2bc-4a14-b7b0-49363876c6b2-kube-api-access-nvtdx" (OuterVolumeSpecName: "kube-api-access-nvtdx") pod "a30d0113-e2bc-4a14-b7b0-49363876c6b2" (UID: "a30d0113-e2bc-4a14-b7b0-49363876c6b2"). InnerVolumeSpecName "kube-api-access-nvtdx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 09:36:53 crc kubenswrapper[4781]: I1202 09:36:53.118639 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a30d0113-e2bc-4a14-b7b0-49363876c6b2-util" (OuterVolumeSpecName: "util") pod "a30d0113-e2bc-4a14-b7b0-49363876c6b2" (UID: "a30d0113-e2bc-4a14-b7b0-49363876c6b2"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 09:36:53 crc kubenswrapper[4781]: I1202 09:36:53.211176 4781 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a30d0113-e2bc-4a14-b7b0-49363876c6b2-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 09:36:53 crc kubenswrapper[4781]: I1202 09:36:53.211280 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvtdx\" (UniqueName: \"kubernetes.io/projected/a30d0113-e2bc-4a14-b7b0-49363876c6b2-kube-api-access-nvtdx\") on node \"crc\" DevicePath \"\""
Dec 02 09:36:53 crc kubenswrapper[4781]: I1202 09:36:53.211295 4781 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a30d0113-e2bc-4a14-b7b0-49363876c6b2-util\") on node \"crc\" DevicePath \"\""
Dec 02 09:36:53 crc kubenswrapper[4781]: I1202 09:36:53.823979 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836fnxt" event={"ID":"a30d0113-e2bc-4a14-b7b0-49363876c6b2","Type":"ContainerDied","Data":"234e11dd19e7a8177e6a0123eef0382bbdf436a4842603b0eefbb683dd72da40"}
Dec 02 09:36:53 crc kubenswrapper[4781]: I1202 09:36:53.824043 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="234e11dd19e7a8177e6a0123eef0382bbdf436a4842603b0eefbb683dd72da40"
Dec 02 09:36:53 crc kubenswrapper[4781]: I1202 09:36:53.824017 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836fnxt"
Dec 02 09:37:00 crc kubenswrapper[4781]: I1202 09:37:00.412313 4781 patch_prober.go:28] interesting pod/machine-config-daemon-pzntm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 09:37:00 crc kubenswrapper[4781]: I1202 09:37:00.412865 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 09:37:03 crc kubenswrapper[4781]: I1202 09:37:03.304573 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-89jz4"]
Dec 02 09:37:03 crc kubenswrapper[4781]: E1202 09:37:03.305049 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f87709bf-590d-4318-abdb-7ecb8a86e303" containerName="console"
Dec 02 09:37:03 crc kubenswrapper[4781]: I1202 09:37:03.305063 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f87709bf-590d-4318-abdb-7ecb8a86e303" containerName="console"
Dec 02 09:37:03 crc kubenswrapper[4781]: E1202 09:37:03.305077 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a30d0113-e2bc-4a14-b7b0-49363876c6b2" containerName="util"
Dec 02 09:37:03 crc kubenswrapper[4781]: I1202 09:37:03.305084 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a30d0113-e2bc-4a14-b7b0-49363876c6b2" containerName="util"
Dec 02 09:37:03 crc kubenswrapper[4781]: E1202 09:37:03.305101 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a30d0113-e2bc-4a14-b7b0-49363876c6b2" containerName="extract"
Dec 02 09:37:03 crc kubenswrapper[4781]: I1202 09:37:03.305114 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a30d0113-e2bc-4a14-b7b0-49363876c6b2" containerName="extract"
Dec 02 09:37:03 crc kubenswrapper[4781]: E1202 09:37:03.305129 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a30d0113-e2bc-4a14-b7b0-49363876c6b2" containerName="pull"
Dec 02 09:37:03 crc kubenswrapper[4781]: I1202 09:37:03.305137 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a30d0113-e2bc-4a14-b7b0-49363876c6b2" containerName="pull"
Dec 02 09:37:03 crc kubenswrapper[4781]: I1202 09:37:03.305240 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="a30d0113-e2bc-4a14-b7b0-49363876c6b2" containerName="extract"
Dec 02 09:37:03 crc kubenswrapper[4781]: I1202 09:37:03.305253 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f87709bf-590d-4318-abdb-7ecb8a86e303" containerName="console"
Dec 02 09:37:03 crc kubenswrapper[4781]: I1202 09:37:03.306025 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-89jz4"
Dec 02 09:37:03 crc kubenswrapper[4781]: I1202 09:37:03.316211 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-89jz4"]
Dec 02 09:37:03 crc kubenswrapper[4781]: I1202 09:37:03.431223 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbtfr\" (UniqueName: \"kubernetes.io/projected/4ca549c4-0dcd-4b93-874b-f49ba5c9df2d-kube-api-access-dbtfr\") pod \"certified-operators-89jz4\" (UID: \"4ca549c4-0dcd-4b93-874b-f49ba5c9df2d\") " pod="openshift-marketplace/certified-operators-89jz4"
Dec 02 09:37:03 crc kubenswrapper[4781]: I1202 09:37:03.431299 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ca549c4-0dcd-4b93-874b-f49ba5c9df2d-utilities\") pod \"certified-operators-89jz4\" (UID: \"4ca549c4-0dcd-4b93-874b-f49ba5c9df2d\") " pod="openshift-marketplace/certified-operators-89jz4"
Dec 02 09:37:03 crc kubenswrapper[4781]: I1202 09:37:03.431335 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ca549c4-0dcd-4b93-874b-f49ba5c9df2d-catalog-content\") pod \"certified-operators-89jz4\" (UID: \"4ca549c4-0dcd-4b93-874b-f49ba5c9df2d\") " pod="openshift-marketplace/certified-operators-89jz4"
Dec 02 09:37:03 crc kubenswrapper[4781]: I1202 09:37:03.532401 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ca549c4-0dcd-4b93-874b-f49ba5c9df2d-utilities\") pod \"certified-operators-89jz4\" (UID: \"4ca549c4-0dcd-4b93-874b-f49ba5c9df2d\") " pod="openshift-marketplace/certified-operators-89jz4"
Dec 02 09:37:03 crc kubenswrapper[4781]: I1202 09:37:03.532461 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ca549c4-0dcd-4b93-874b-f49ba5c9df2d-catalog-content\") pod \"certified-operators-89jz4\" (UID: \"4ca549c4-0dcd-4b93-874b-f49ba5c9df2d\") " pod="openshift-marketplace/certified-operators-89jz4"
Dec 02 09:37:03 crc kubenswrapper[4781]: I1202 09:37:03.532556 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbtfr\" (UniqueName: \"kubernetes.io/projected/4ca549c4-0dcd-4b93-874b-f49ba5c9df2d-kube-api-access-dbtfr\") pod \"certified-operators-89jz4\" (UID: \"4ca549c4-0dcd-4b93-874b-f49ba5c9df2d\") " pod="openshift-marketplace/certified-operators-89jz4"
Dec 02 09:37:03 crc kubenswrapper[4781]: I1202 09:37:03.533015 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ca549c4-0dcd-4b93-874b-f49ba5c9df2d-utilities\") pod \"certified-operators-89jz4\" (UID: \"4ca549c4-0dcd-4b93-874b-f49ba5c9df2d\") " pod="openshift-marketplace/certified-operators-89jz4"
Dec 02 09:37:03 crc kubenswrapper[4781]: I1202 09:37:03.533047 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ca549c4-0dcd-4b93-874b-f49ba5c9df2d-catalog-content\") pod \"certified-operators-89jz4\" (UID: \"4ca549c4-0dcd-4b93-874b-f49ba5c9df2d\") " pod="openshift-marketplace/certified-operators-89jz4"
Dec 02 09:37:03 crc kubenswrapper[4781]: I1202 09:37:03.551638 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbtfr\" (UniqueName: \"kubernetes.io/projected/4ca549c4-0dcd-4b93-874b-f49ba5c9df2d-kube-api-access-dbtfr\") pod \"certified-operators-89jz4\" (UID: \"4ca549c4-0dcd-4b93-874b-f49ba5c9df2d\") " pod="openshift-marketplace/certified-operators-89jz4"
Dec 02 09:37:03 crc kubenswrapper[4781]: I1202 09:37:03.623486 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-89jz4"
Dec 02 09:37:03 crc kubenswrapper[4781]: I1202 09:37:03.836138 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-89jz4"]
Dec 02 09:37:03 crc kubenswrapper[4781]: I1202 09:37:03.897013 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-89jz4" event={"ID":"4ca549c4-0dcd-4b93-874b-f49ba5c9df2d","Type":"ContainerStarted","Data":"34e2f1657edda1e61676e32091392be88315b8d6a132bb0e9dff9d88b696ed21"}
Dec 02 09:37:04 crc kubenswrapper[4781]: I1202 09:37:04.904075 4781 generic.go:334] "Generic (PLEG): container finished" podID="4ca549c4-0dcd-4b93-874b-f49ba5c9df2d" containerID="42a69885bbe011f24183246f8b32305977669b56c24e7cd98c478baf7772316a" exitCode=0
Dec 02 09:37:04 crc kubenswrapper[4781]: I1202 09:37:04.904126 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-89jz4" event={"ID":"4ca549c4-0dcd-4b93-874b-f49ba5c9df2d","Type":"ContainerDied","Data":"42a69885bbe011f24183246f8b32305977669b56c24e7cd98c478baf7772316a"}
Dec 02 09:37:05 crc kubenswrapper[4781]: I1202 09:37:05.660140 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-75f8895565-ttkrd"]
Dec 02 09:37:05 crc kubenswrapper[4781]: I1202 09:37:05.661279 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-75f8895565-ttkrd"
Dec 02 09:37:05 crc kubenswrapper[4781]: I1202 09:37:05.663524 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-r4nrr"
Dec 02 09:37:05 crc kubenswrapper[4781]: I1202 09:37:05.663739 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Dec 02 09:37:05 crc kubenswrapper[4781]: I1202 09:37:05.664111 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Dec 02 09:37:05 crc kubenswrapper[4781]: I1202 09:37:05.666667 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Dec 02 09:37:05 crc kubenswrapper[4781]: I1202 09:37:05.667341 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Dec 02 09:37:05 crc kubenswrapper[4781]: I1202 09:37:05.685608 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-75f8895565-ttkrd"]
Dec 02 09:37:05 crc kubenswrapper[4781]: I1202 09:37:05.862243 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/60dee968-291d-4e9d-b2a5-40b67457b003-apiservice-cert\") pod \"metallb-operator-controller-manager-75f8895565-ttkrd\" (UID: \"60dee968-291d-4e9d-b2a5-40b67457b003\") " pod="metallb-system/metallb-operator-controller-manager-75f8895565-ttkrd"
Dec 02 09:37:05 crc kubenswrapper[4781]: I1202 09:37:05.862314 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/60dee968-291d-4e9d-b2a5-40b67457b003-webhook-cert\") pod \"metallb-operator-controller-manager-75f8895565-ttkrd\" (UID: \"60dee968-291d-4e9d-b2a5-40b67457b003\") " pod="metallb-system/metallb-operator-controller-manager-75f8895565-ttkrd"
Dec 02 09:37:05 crc kubenswrapper[4781]: I1202 09:37:05.862342 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75rsb\" (UniqueName: \"kubernetes.io/projected/60dee968-291d-4e9d-b2a5-40b67457b003-kube-api-access-75rsb\") pod \"metallb-operator-controller-manager-75f8895565-ttkrd\" (UID: \"60dee968-291d-4e9d-b2a5-40b67457b003\") " pod="metallb-system/metallb-operator-controller-manager-75f8895565-ttkrd"
Dec 02 09:37:05 crc kubenswrapper[4781]: I1202 09:37:05.963087 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/60dee968-291d-4e9d-b2a5-40b67457b003-apiservice-cert\") pod \"metallb-operator-controller-manager-75f8895565-ttkrd\" (UID: \"60dee968-291d-4e9d-b2a5-40b67457b003\") " pod="metallb-system/metallb-operator-controller-manager-75f8895565-ttkrd"
Dec 02 09:37:05 crc kubenswrapper[4781]: I1202 09:37:05.963187 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/60dee968-291d-4e9d-b2a5-40b67457b003-webhook-cert\") pod \"metallb-operator-controller-manager-75f8895565-ttkrd\" (UID: \"60dee968-291d-4e9d-b2a5-40b67457b003\") " pod="metallb-system/metallb-operator-controller-manager-75f8895565-ttkrd"
Dec 02 09:37:05 crc kubenswrapper[4781]: I1202 09:37:05.963250 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75rsb\" (UniqueName: \"kubernetes.io/projected/60dee968-291d-4e9d-b2a5-40b67457b003-kube-api-access-75rsb\") pod \"metallb-operator-controller-manager-75f8895565-ttkrd\" (UID: \"60dee968-291d-4e9d-b2a5-40b67457b003\") " pod="metallb-system/metallb-operator-controller-manager-75f8895565-ttkrd"
Dec 02 09:37:05 crc kubenswrapper[4781]: I1202 09:37:05.970270 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/60dee968-291d-4e9d-b2a5-40b67457b003-webhook-cert\") pod \"metallb-operator-controller-manager-75f8895565-ttkrd\" (UID: \"60dee968-291d-4e9d-b2a5-40b67457b003\") " pod="metallb-system/metallb-operator-controller-manager-75f8895565-ttkrd"
Dec 02 09:37:05 crc kubenswrapper[4781]: I1202 09:37:05.980779 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/60dee968-291d-4e9d-b2a5-40b67457b003-apiservice-cert\") pod \"metallb-operator-controller-manager-75f8895565-ttkrd\" (UID: \"60dee968-291d-4e9d-b2a5-40b67457b003\") " pod="metallb-system/metallb-operator-controller-manager-75f8895565-ttkrd"
Dec 02 09:37:05 crc kubenswrapper[4781]: I1202 09:37:05.984564 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-6f95b97b7b-7p5dk"]
Dec 02 09:37:05 crc kubenswrapper[4781]: I1202 09:37:05.985224 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6f95b97b7b-7p5dk"
Dec 02 09:37:05 crc kubenswrapper[4781]: I1202 09:37:05.988570 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Dec 02 09:37:05 crc kubenswrapper[4781]: I1202 09:37:05.989173 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Dec 02 09:37:05 crc kubenswrapper[4781]: I1202 09:37:05.989440 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-xfwck"
Dec 02 09:37:06 crc kubenswrapper[4781]: I1202 09:37:06.001628 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75rsb\" (UniqueName: \"kubernetes.io/projected/60dee968-291d-4e9d-b2a5-40b67457b003-kube-api-access-75rsb\") pod \"metallb-operator-controller-manager-75f8895565-ttkrd\" (UID: \"60dee968-291d-4e9d-b2a5-40b67457b003\") " pod="metallb-system/metallb-operator-controller-manager-75f8895565-ttkrd"
Dec 02 09:37:06 crc kubenswrapper[4781]: I1202 09:37:06.006989 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6f95b97b7b-7p5dk"]
Dec 02 09:37:06 crc kubenswrapper[4781]: I1202 09:37:06.064302 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/804d6dfe-f063-4edb-b276-9a386bae049a-webhook-cert\") pod \"metallb-operator-webhook-server-6f95b97b7b-7p5dk\" (UID: \"804d6dfe-f063-4edb-b276-9a386bae049a\") " pod="metallb-system/metallb-operator-webhook-server-6f95b97b7b-7p5dk"
Dec 02 09:37:06 crc kubenswrapper[4781]: I1202 09:37:06.064452 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/804d6dfe-f063-4edb-b276-9a386bae049a-apiservice-cert\") pod \"metallb-operator-webhook-server-6f95b97b7b-7p5dk\" (UID: \"804d6dfe-f063-4edb-b276-9a386bae049a\") " pod="metallb-system/metallb-operator-webhook-server-6f95b97b7b-7p5dk"
Dec 02 09:37:06 crc kubenswrapper[4781]: I1202 09:37:06.064495 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pct6v\" (UniqueName: \"kubernetes.io/projected/804d6dfe-f063-4edb-b276-9a386bae049a-kube-api-access-pct6v\") pod \"metallb-operator-webhook-server-6f95b97b7b-7p5dk\" (UID: \"804d6dfe-f063-4edb-b276-9a386bae049a\") " pod="metallb-system/metallb-operator-webhook-server-6f95b97b7b-7p5dk"
Dec 02 09:37:06 crc kubenswrapper[4781]: I1202 09:37:06.165516 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/804d6dfe-f063-4edb-b276-9a386bae049a-webhook-cert\") pod \"metallb-operator-webhook-server-6f95b97b7b-7p5dk\" (UID: \"804d6dfe-f063-4edb-b276-9a386bae049a\") " pod="metallb-system/metallb-operator-webhook-server-6f95b97b7b-7p5dk"
Dec 02 09:37:06 crc kubenswrapper[4781]: I1202 09:37:06.165820 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/804d6dfe-f063-4edb-b276-9a386bae049a-apiservice-cert\") pod \"metallb-operator-webhook-server-6f95b97b7b-7p5dk\" (UID: \"804d6dfe-f063-4edb-b276-9a386bae049a\") " pod="metallb-system/metallb-operator-webhook-server-6f95b97b7b-7p5dk"
Dec 02 09:37:06 crc kubenswrapper[4781]: I1202 09:37:06.165862 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pct6v\" (UniqueName: \"kubernetes.io/projected/804d6dfe-f063-4edb-b276-9a386bae049a-kube-api-access-pct6v\") pod \"metallb-operator-webhook-server-6f95b97b7b-7p5dk\" (UID: \"804d6dfe-f063-4edb-b276-9a386bae049a\") " pod="metallb-system/metallb-operator-webhook-server-6f95b97b7b-7p5dk"
Dec 02 09:37:06 crc kubenswrapper[4781]: I1202 09:37:06.170742 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/804d6dfe-f063-4edb-b276-9a386bae049a-webhook-cert\") pod \"metallb-operator-webhook-server-6f95b97b7b-7p5dk\" (UID: \"804d6dfe-f063-4edb-b276-9a386bae049a\") " pod="metallb-system/metallb-operator-webhook-server-6f95b97b7b-7p5dk"
Dec 02 09:37:06 crc kubenswrapper[4781]: I1202 09:37:06.171536 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/804d6dfe-f063-4edb-b276-9a386bae049a-apiservice-cert\") pod \"metallb-operator-webhook-server-6f95b97b7b-7p5dk\" (UID: \"804d6dfe-f063-4edb-b276-9a386bae049a\") " pod="metallb-system/metallb-operator-webhook-server-6f95b97b7b-7p5dk"
Dec 02 09:37:06 crc kubenswrapper[4781]: I1202 09:37:06.217850 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pct6v\" (UniqueName: \"kubernetes.io/projected/804d6dfe-f063-4edb-b276-9a386bae049a-kube-api-access-pct6v\") pod \"metallb-operator-webhook-server-6f95b97b7b-7p5dk\" (UID: \"804d6dfe-f063-4edb-b276-9a386bae049a\") " pod="metallb-system/metallb-operator-webhook-server-6f95b97b7b-7p5dk"
Dec 02 09:37:06 crc kubenswrapper[4781]: I1202 09:37:06.279570 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-75f8895565-ttkrd"
Dec 02 09:37:06 crc kubenswrapper[4781]: I1202 09:37:06.367081 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6f95b97b7b-7p5dk"
Dec 02 09:37:06 crc kubenswrapper[4781]: I1202 09:37:06.638416 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6f95b97b7b-7p5dk"]
Dec 02 09:37:06 crc kubenswrapper[4781]: W1202 09:37:06.646051 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod804d6dfe_f063_4edb_b276_9a386bae049a.slice/crio-7b7a69d159c4ea088b3b7fdf7f4c94b1bdda3bb11dfe04bf42687a91579ee6be WatchSource:0}: Error finding container 7b7a69d159c4ea088b3b7fdf7f4c94b1bdda3bb11dfe04bf42687a91579ee6be: Status 404 returned error can't find the container with id 7b7a69d159c4ea088b3b7fdf7f4c94b1bdda3bb11dfe04bf42687a91579ee6be
Dec 02 09:37:06 crc kubenswrapper[4781]: I1202 09:37:06.709539 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-75f8895565-ttkrd"]
Dec 02 09:37:06 crc kubenswrapper[4781]: W1202 09:37:06.709709 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60dee968_291d_4e9d_b2a5_40b67457b003.slice/crio-b5a61c06cfdba36dc4f73b003b0427cf64b103a3c88e4b7677423200279c557e WatchSource:0}: Error finding container b5a61c06cfdba36dc4f73b003b0427cf64b103a3c88e4b7677423200279c557e: Status 404 returned error can't find the container with id b5a61c06cfdba36dc4f73b003b0427cf64b103a3c88e4b7677423200279c557e
Dec 02 09:37:06 crc kubenswrapper[4781]: I1202 09:37:06.916052 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-75f8895565-ttkrd" event={"ID":"60dee968-291d-4e9d-b2a5-40b67457b003","Type":"ContainerStarted","Data":"b5a61c06cfdba36dc4f73b003b0427cf64b103a3c88e4b7677423200279c557e"}
Dec 02 09:37:06 crc kubenswrapper[4781]: I1202 09:37:06.917732 4781 generic.go:334] "Generic (PLEG): container finished" podID="4ca549c4-0dcd-4b93-874b-f49ba5c9df2d" containerID="cfa091c4cab2c3885b624da9d68b46c05561f2f0bb51697b3abd7ec9ec8de029" exitCode=0
Dec 02 09:37:06 crc kubenswrapper[4781]: I1202 09:37:06.917773 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-89jz4" event={"ID":"4ca549c4-0dcd-4b93-874b-f49ba5c9df2d","Type":"ContainerDied","Data":"cfa091c4cab2c3885b624da9d68b46c05561f2f0bb51697b3abd7ec9ec8de029"}
Dec 02 09:37:06 crc kubenswrapper[4781]: I1202 09:37:06.920521 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6f95b97b7b-7p5dk" event={"ID":"804d6dfe-f063-4edb-b276-9a386bae049a","Type":"ContainerStarted","Data":"7b7a69d159c4ea088b3b7fdf7f4c94b1bdda3bb11dfe04bf42687a91579ee6be"}
Dec 02 09:37:10 crc kubenswrapper[4781]: I1202 09:37:10.951230 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-89jz4" event={"ID":"4ca549c4-0dcd-4b93-874b-f49ba5c9df2d","Type":"ContainerStarted","Data":"ad8bc02b9256cdc3334914d740bb404f4e22654ea265765d10b6fa4efe4b3935"}
Dec 02 09:37:10 crc kubenswrapper[4781]: I1202 09:37:10.972889 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-89jz4" podStartSLOduration=2.637452038 podStartE2EDuration="7.97287085s" podCreationTimestamp="2025-12-02 09:37:03 +0000 UTC" firstStartedPulling="2025-12-02 09:37:04.905846663 +0000 UTC m=+987.729720542" lastFinishedPulling="2025-12-02 09:37:10.241265475 +0000 UTC m=+993.065139354" observedRunningTime="2025-12-02 09:37:10.968501993 +0000 UTC m=+993.792375892" watchObservedRunningTime="2025-12-02 09:37:10.97287085 +0000 UTC m=+993.796744729"
Dec 02 09:37:13 crc kubenswrapper[4781]: I1202 09:37:13.624168 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-89jz4"
Dec 02 09:37:13 crc kubenswrapper[4781]: I1202 09:37:13.624532 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-89jz4"
Dec 02 09:37:13 crc kubenswrapper[4781]: I1202 09:37:13.687993 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-89jz4"
Dec 02 09:37:23 crc kubenswrapper[4781]: I1202 09:37:23.671039 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-89jz4"
Dec 02 09:37:25 crc kubenswrapper[4781]: I1202 09:37:25.233654 4781 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-8nxjr container/olm-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 02 09:37:25 crc kubenswrapper[4781]: I1202 09:37:25.234117 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8nxjr" podUID="c0570ff6-6102-40ae-a68a-b35b77756097" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.16:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 02 09:37:25 crc kubenswrapper[4781]: I1202 09:37:25.971687 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-89jz4"]
Dec 02 09:37:25 crc kubenswrapper[4781]: I1202 09:37:25.972205 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-89jz4" podUID="4ca549c4-0dcd-4b93-874b-f49ba5c9df2d" containerName="registry-server" containerID="cri-o://ad8bc02b9256cdc3334914d740bb404f4e22654ea265765d10b6fa4efe4b3935" gracePeriod=2
Dec 02 09:37:27 crc kubenswrapper[4781]: I1202 09:37:27.035988 4781 generic.go:334] "Generic (PLEG): container finished" podID="4ca549c4-0dcd-4b93-874b-f49ba5c9df2d" containerID="ad8bc02b9256cdc3334914d740bb404f4e22654ea265765d10b6fa4efe4b3935" exitCode=0
Dec 02 09:37:27 crc kubenswrapper[4781]: I1202 09:37:27.036029 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-89jz4" event={"ID":"4ca549c4-0dcd-4b93-874b-f49ba5c9df2d","Type":"ContainerDied","Data":"ad8bc02b9256cdc3334914d740bb404f4e22654ea265765d10b6fa4efe4b3935"}
Dec 02 09:37:27 crc kubenswrapper[4781]: I1202 09:37:27.327566 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-89jz4"
Dec 02 09:37:27 crc kubenswrapper[4781]: I1202 09:37:27.438033 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbtfr\" (UniqueName: \"kubernetes.io/projected/4ca549c4-0dcd-4b93-874b-f49ba5c9df2d-kube-api-access-dbtfr\") pod \"4ca549c4-0dcd-4b93-874b-f49ba5c9df2d\" (UID: \"4ca549c4-0dcd-4b93-874b-f49ba5c9df2d\") "
Dec 02 09:37:27 crc kubenswrapper[4781]: I1202 09:37:27.438102 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ca549c4-0dcd-4b93-874b-f49ba5c9df2d-utilities\") pod \"4ca549c4-0dcd-4b93-874b-f49ba5c9df2d\" (UID: \"4ca549c4-0dcd-4b93-874b-f49ba5c9df2d\") "
Dec 02 09:37:27 crc kubenswrapper[4781]: I1202 09:37:27.438135 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ca549c4-0dcd-4b93-874b-f49ba5c9df2d-catalog-content\") pod \"4ca549c4-0dcd-4b93-874b-f49ba5c9df2d\" (UID: \"4ca549c4-0dcd-4b93-874b-f49ba5c9df2d\") "
Dec 02 09:37:27 crc kubenswrapper[4781]: I1202 09:37:27.438975 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ca549c4-0dcd-4b93-874b-f49ba5c9df2d-utilities" (OuterVolumeSpecName: "utilities") pod "4ca549c4-0dcd-4b93-874b-f49ba5c9df2d" (UID: "4ca549c4-0dcd-4b93-874b-f49ba5c9df2d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 09:37:27 crc kubenswrapper[4781]: I1202 09:37:27.443074 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ca549c4-0dcd-4b93-874b-f49ba5c9df2d-kube-api-access-dbtfr" (OuterVolumeSpecName: "kube-api-access-dbtfr") pod "4ca549c4-0dcd-4b93-874b-f49ba5c9df2d" (UID: "4ca549c4-0dcd-4b93-874b-f49ba5c9df2d"). InnerVolumeSpecName "kube-api-access-dbtfr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 09:37:27 crc kubenswrapper[4781]: I1202 09:37:27.481538 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ca549c4-0dcd-4b93-874b-f49ba5c9df2d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4ca549c4-0dcd-4b93-874b-f49ba5c9df2d" (UID: "4ca549c4-0dcd-4b93-874b-f49ba5c9df2d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 09:37:27 crc kubenswrapper[4781]: I1202 09:37:27.539096 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbtfr\" (UniqueName: \"kubernetes.io/projected/4ca549c4-0dcd-4b93-874b-f49ba5c9df2d-kube-api-access-dbtfr\") on node \"crc\" DevicePath \"\""
Dec 02 09:37:27 crc kubenswrapper[4781]: I1202 09:37:27.539140 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ca549c4-0dcd-4b93-874b-f49ba5c9df2d-utilities\") on node \"crc\" DevicePath \"\""
Dec 02 09:37:27 crc kubenswrapper[4781]: I1202 09:37:27.539173 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ca549c4-0dcd-4b93-874b-f49ba5c9df2d-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 02 09:37:28 crc kubenswrapper[4781]: I1202 09:37:28.044499 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-89jz4" event={"ID":"4ca549c4-0dcd-4b93-874b-f49ba5c9df2d","Type":"ContainerDied","Data":"34e2f1657edda1e61676e32091392be88315b8d6a132bb0e9dff9d88b696ed21"}
Dec 02 09:37:28 crc kubenswrapper[4781]: I1202 09:37:28.044555 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-89jz4"
Dec 02 09:37:28 crc kubenswrapper[4781]: I1202 09:37:28.044567 4781 scope.go:117] "RemoveContainer" containerID="ad8bc02b9256cdc3334914d740bb404f4e22654ea265765d10b6fa4efe4b3935"
Dec 02 09:37:28 crc kubenswrapper[4781]: I1202 09:37:28.062086 4781 scope.go:117] "RemoveContainer" containerID="cfa091c4cab2c3885b624da9d68b46c05561f2f0bb51697b3abd7ec9ec8de029"
Dec 02 09:37:28 crc kubenswrapper[4781]: I1202 09:37:28.068861 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-89jz4"]
Dec 02 09:37:28 crc kubenswrapper[4781]: I1202 09:37:28.073134 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-89jz4"]
Dec 02 09:37:28 crc kubenswrapper[4781]: I1202 09:37:28.075918 4781 scope.go:117] "RemoveContainer" containerID="42a69885bbe011f24183246f8b32305977669b56c24e7cd98c478baf7772316a"
Dec 02 09:37:29 crc kubenswrapper[4781]: I1202 09:37:29.052806 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-75f8895565-ttkrd" event={"ID":"60dee968-291d-4e9d-b2a5-40b67457b003","Type":"ContainerStarted","Data":"d52f28f2cf78f165bd6523a29254e04bbd89b0552478a4d8176e871f6215c502"}
Dec 02 09:37:29 crc kubenswrapper[4781]: I1202 09:37:29.052869 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-75f8895565-ttkrd"
Dec 02 09:37:29 crc kubenswrapper[4781]: I1202 09:37:29.056304 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6f95b97b7b-7p5dk" event={"ID":"804d6dfe-f063-4edb-b276-9a386bae049a","Type":"ContainerStarted","Data":"dfbc50d3b62736ddc61500c1cbbc68d7c8503fc829aab971a0ccccffe05eb8b7"}
Dec 02 09:37:29 crc kubenswrapper[4781]: I1202 09:37:29.056427 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-6f95b97b7b-7p5dk"
Dec 02 09:37:29 crc kubenswrapper[4781]: I1202 09:37:29.074379 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-75f8895565-ttkrd" podStartSLOduration=3.690408606 podStartE2EDuration="24.074361565s" podCreationTimestamp="2025-12-02 09:37:05 +0000 UTC" firstStartedPulling="2025-12-02 09:37:06.712775641 +0000 UTC m=+989.536649520" lastFinishedPulling="2025-12-02 09:37:27.0967286 +0000 UTC m=+1009.920602479" observedRunningTime="2025-12-02 09:37:29.070523203 +0000 UTC m=+1011.894397082" watchObservedRunningTime="2025-12-02 09:37:29.074361565 +0000 UTC m=+1011.898235444"
Dec 02 09:37:29 crc kubenswrapper[4781]: I1202 09:37:29.097572 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-6f95b97b7b-7p5dk" podStartSLOduration=3.6360425320000003 podStartE2EDuration="24.097550026s" podCreationTimestamp="2025-12-02 09:37:05 +0000 UTC" firstStartedPulling="2025-12-02 09:37:06.650443633 +0000 UTC m=+989.474317512" lastFinishedPulling="2025-12-02 09:37:27.111951127 +0000 UTC m=+1009.935825006" observedRunningTime="2025-12-02 09:37:29.094014371 +0000 UTC m=+1011.917888270" watchObservedRunningTime="2025-12-02 09:37:29.097550026 +0000 UTC m=+1011.921423905"
Dec 02 09:37:29 crc kubenswrapper[4781]: I1202 09:37:29.505959 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ca549c4-0dcd-4b93-874b-f49ba5c9df2d" path="/var/lib/kubelet/pods/4ca549c4-0dcd-4b93-874b-f49ba5c9df2d/volumes"
Dec 02 09:37:30 crc kubenswrapper[4781]: I1202 09:37:30.412443 4781 patch_prober.go:28] interesting pod/machine-config-daemon-pzntm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 09:37:30 crc kubenswrapper[4781]: I1202 09:37:30.412518 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 09:37:30 crc kubenswrapper[4781]: I1202 09:37:30.412580 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pzntm"
Dec 02 09:37:30 crc kubenswrapper[4781]: I1202 09:37:30.413353 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4d8ba3ba707695a318e7a4765e850fbeffac37a0664ff39d2592ecb9000863ef"} pod="openshift-machine-config-operator/machine-config-daemon-pzntm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 02 09:37:30 crc kubenswrapper[4781]: I1202 09:37:30.413439 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" containerID="cri-o://4d8ba3ba707695a318e7a4765e850fbeffac37a0664ff39d2592ecb9000863ef" gracePeriod=600
Dec 02 09:37:31 crc kubenswrapper[4781]: I1202 09:37:31.071298 4781 generic.go:334] "Generic (PLEG): container finished" podID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerID="4d8ba3ba707695a318e7a4765e850fbeffac37a0664ff39d2592ecb9000863ef" exitCode=0
Dec 02 09:37:31 crc kubenswrapper[4781]: I1202 09:37:31.071362 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" event={"ID":"e10258da-dad3-4df8-82c2-9d9438493a3d","Type":"ContainerDied","Data":"4d8ba3ba707695a318e7a4765e850fbeffac37a0664ff39d2592ecb9000863ef"}
Dec 02 09:37:31 crc kubenswrapper[4781]: I1202 09:37:31.071574 4781 scope.go:117] "RemoveContainer" containerID="a295986289f2c69b34cac9164fd7741bae073fa0ffe0145fa34c6c835c2b0b11"
Dec 02 09:37:32 crc kubenswrapper[4781]: I1202 09:37:32.079554 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" event={"ID":"e10258da-dad3-4df8-82c2-9d9438493a3d","Type":"ContainerStarted","Data":"065de47b4d4ea37363c4ad12a4f2a35f47f5c31ffb6b9b5188b40ed91f236ce5"}
Dec 02 09:37:46 crc kubenswrapper[4781]: I1202 09:37:46.375523 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-6f95b97b7b-7p5dk"
Dec 02 09:38:06 crc kubenswrapper[4781]: I1202 09:38:06.283023 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-75f8895565-ttkrd"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.165961 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-mfpkp"]
Dec 02 09:38:07 crc kubenswrapper[4781]: E1202 09:38:07.166554 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ca549c4-0dcd-4b93-874b-f49ba5c9df2d" containerName="extract-content"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.166575 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ca549c4-0dcd-4b93-874b-f49ba5c9df2d" containerName="extract-content"
Dec 02 09:38:07 crc kubenswrapper[4781]: E1202 09:38:07.166592 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ca549c4-0dcd-4b93-874b-f49ba5c9df2d" containerName="extract-utilities"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.166618 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ca549c4-0dcd-4b93-874b-f49ba5c9df2d" containerName="extract-utilities"
Dec 02 09:38:07 crc kubenswrapper[4781]: E1202 09:38:07.166635 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ca549c4-0dcd-4b93-874b-f49ba5c9df2d" containerName="registry-server"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.166641 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ca549c4-0dcd-4b93-874b-f49ba5c9df2d" containerName="registry-server"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.166756 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ca549c4-0dcd-4b93-874b-f49ba5c9df2d" containerName="registry-server"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.167218 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-mfpkp"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.169547 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.169563 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-p2rfl"]
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.170731 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-grn7v"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.172856 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-p2rfl"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.173436 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-mfpkp"]
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.175209 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.178172 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.261372 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-w7lhj"]
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.262575 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-w7lhj"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.264398 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.264558 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.264606 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-x7p7c"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.264710 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.295649 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-8fm55"]
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.296657 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-8fm55"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.300021 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.310693 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-8fm55"]
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.312877 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/920e48e1-7f21-462b-82da-70c9a6e589ba-frr-startup\") pod \"frr-k8s-p2rfl\" (UID: \"920e48e1-7f21-462b-82da-70c9a6e589ba\") " pod="metallb-system/frr-k8s-p2rfl"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.312972 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4edb853-0703-424a-9701-bd01ffa5631c-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-mfpkp\" (UID: \"e4edb853-0703-424a-9701-bd01ffa5631c\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-mfpkp"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.313016 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/920e48e1-7f21-462b-82da-70c9a6e589ba-metrics\") pod \"frr-k8s-p2rfl\" (UID: \"920e48e1-7f21-462b-82da-70c9a6e589ba\") " pod="metallb-system/frr-k8s-p2rfl"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.313037 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/920e48e1-7f21-462b-82da-70c9a6e589ba-reloader\") pod \"frr-k8s-p2rfl\" (UID: \"920e48e1-7f21-462b-82da-70c9a6e589ba\") " pod="metallb-system/frr-k8s-p2rfl"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.313067 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/920e48e1-7f21-462b-82da-70c9a6e589ba-metrics-certs\") pod \"frr-k8s-p2rfl\" (UID: \"920e48e1-7f21-462b-82da-70c9a6e589ba\") " pod="metallb-system/frr-k8s-p2rfl"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.313089 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqczs\" (UniqueName: \"kubernetes.io/projected/e4edb853-0703-424a-9701-bd01ffa5631c-kube-api-access-gqczs\") pod \"frr-k8s-webhook-server-7fcb986d4-mfpkp\" (UID: \"e4edb853-0703-424a-9701-bd01ffa5631c\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-mfpkp"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.313119 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/920e48e1-7f21-462b-82da-70c9a6e589ba-frr-conf\") pod \"frr-k8s-p2rfl\" (UID: \"920e48e1-7f21-462b-82da-70c9a6e589ba\") " pod="metallb-system/frr-k8s-p2rfl"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.313153 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/920e48e1-7f21-462b-82da-70c9a6e589ba-frr-sockets\") pod \"frr-k8s-p2rfl\" (UID: \"920e48e1-7f21-462b-82da-70c9a6e589ba\") " pod="metallb-system/frr-k8s-p2rfl"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.313172 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6j4v\" (UniqueName: \"kubernetes.io/projected/920e48e1-7f21-462b-82da-70c9a6e589ba-kube-api-access-p6j4v\") pod \"frr-k8s-p2rfl\" (UID: \"920e48e1-7f21-462b-82da-70c9a6e589ba\") " pod="metallb-system/frr-k8s-p2rfl"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.414604 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p9xt\" (UniqueName: \"kubernetes.io/projected/9033e241-ad62-4fcc-92f1-8499f42f6310-kube-api-access-5p9xt\") pod \"speaker-w7lhj\" (UID: \"9033e241-ad62-4fcc-92f1-8499f42f6310\") " pod="metallb-system/speaker-w7lhj"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.414680 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4edb853-0703-424a-9701-bd01ffa5631c-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-mfpkp\" (UID: \"e4edb853-0703-424a-9701-bd01ffa5631c\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-mfpkp"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.414790 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8v4t\" (UniqueName: \"kubernetes.io/projected/b70a26b7-43cb-4e26-95c0-f67ef15a0c34-kube-api-access-x8v4t\") pod \"controller-f8648f98b-8fm55\" (UID: \"b70a26b7-43cb-4e26-95c0-f67ef15a0c34\") " pod="metallb-system/controller-f8648f98b-8fm55"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.414839 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b70a26b7-43cb-4e26-95c0-f67ef15a0c34-cert\") pod \"controller-f8648f98b-8fm55\" (UID: \"b70a26b7-43cb-4e26-95c0-f67ef15a0c34\") " pod="metallb-system/controller-f8648f98b-8fm55"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.414884 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/9033e241-ad62-4fcc-92f1-8499f42f6310-metallb-excludel2\") pod \"speaker-w7lhj\" (UID: \"9033e241-ad62-4fcc-92f1-8499f42f6310\") " pod="metallb-system/speaker-w7lhj"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.414910 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/920e48e1-7f21-462b-82da-70c9a6e589ba-metrics\") pod \"frr-k8s-p2rfl\" (UID: \"920e48e1-7f21-462b-82da-70c9a6e589ba\") " pod="metallb-system/frr-k8s-p2rfl"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.414943 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/920e48e1-7f21-462b-82da-70c9a6e589ba-reloader\") pod \"frr-k8s-p2rfl\" (UID: \"920e48e1-7f21-462b-82da-70c9a6e589ba\") " pod="metallb-system/frr-k8s-p2rfl"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.414978 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9033e241-ad62-4fcc-92f1-8499f42f6310-metrics-certs\") pod \"speaker-w7lhj\" (UID: \"9033e241-ad62-4fcc-92f1-8499f42f6310\") " pod="metallb-system/speaker-w7lhj"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.415007 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/920e48e1-7f21-462b-82da-70c9a6e589ba-metrics-certs\") pod \"frr-k8s-p2rfl\" (UID: \"920e48e1-7f21-462b-82da-70c9a6e589ba\") " pod="metallb-system/frr-k8s-p2rfl"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.415026 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqczs\" (UniqueName: \"kubernetes.io/projected/e4edb853-0703-424a-9701-bd01ffa5631c-kube-api-access-gqczs\") pod \"frr-k8s-webhook-server-7fcb986d4-mfpkp\" (UID: \"e4edb853-0703-424a-9701-bd01ffa5631c\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-mfpkp"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.415069 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/920e48e1-7f21-462b-82da-70c9a6e589ba-frr-conf\") pod \"frr-k8s-p2rfl\" (UID: \"920e48e1-7f21-462b-82da-70c9a6e589ba\") " pod="metallb-system/frr-k8s-p2rfl"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.415095 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b70a26b7-43cb-4e26-95c0-f67ef15a0c34-metrics-certs\") pod \"controller-f8648f98b-8fm55\" (UID: \"b70a26b7-43cb-4e26-95c0-f67ef15a0c34\") " pod="metallb-system/controller-f8648f98b-8fm55"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.415153 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/920e48e1-7f21-462b-82da-70c9a6e589ba-frr-sockets\") pod \"frr-k8s-p2rfl\" (UID: \"920e48e1-7f21-462b-82da-70c9a6e589ba\") " pod="metallb-system/frr-k8s-p2rfl"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.415177 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6j4v\" (UniqueName: \"kubernetes.io/projected/920e48e1-7f21-462b-82da-70c9a6e589ba-kube-api-access-p6j4v\") pod \"frr-k8s-p2rfl\" (UID: \"920e48e1-7f21-462b-82da-70c9a6e589ba\") " pod="metallb-system/frr-k8s-p2rfl"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.415213 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/920e48e1-7f21-462b-82da-70c9a6e589ba-frr-startup\") pod \"frr-k8s-p2rfl\" (UID: \"920e48e1-7f21-462b-82da-70c9a6e589ba\") " pod="metallb-system/frr-k8s-p2rfl"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.415234 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/9033e241-ad62-4fcc-92f1-8499f42f6310-memberlist\") pod \"speaker-w7lhj\" (UID: \"9033e241-ad62-4fcc-92f1-8499f42f6310\") " pod="metallb-system/speaker-w7lhj"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.415635 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/920e48e1-7f21-462b-82da-70c9a6e589ba-metrics\") pod \"frr-k8s-p2rfl\" (UID: \"920e48e1-7f21-462b-82da-70c9a6e589ba\") " pod="metallb-system/frr-k8s-p2rfl"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.415813 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/920e48e1-7f21-462b-82da-70c9a6e589ba-reloader\") pod \"frr-k8s-p2rfl\" (UID: \"920e48e1-7f21-462b-82da-70c9a6e589ba\") " pod="metallb-system/frr-k8s-p2rfl"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.416697 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/920e48e1-7f21-462b-82da-70c9a6e589ba-frr-conf\") pod \"frr-k8s-p2rfl\" (UID: \"920e48e1-7f21-462b-82da-70c9a6e589ba\") " pod="metallb-system/frr-k8s-p2rfl"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.416860 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/920e48e1-7f21-462b-82da-70c9a6e589ba-frr-sockets\") pod \"frr-k8s-p2rfl\" (UID: \"920e48e1-7f21-462b-82da-70c9a6e589ba\") " pod="metallb-system/frr-k8s-p2rfl"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.417649 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/920e48e1-7f21-462b-82da-70c9a6e589ba-frr-startup\") pod \"frr-k8s-p2rfl\" (UID: \"920e48e1-7f21-462b-82da-70c9a6e589ba\") " pod="metallb-system/frr-k8s-p2rfl"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.421787 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4edb853-0703-424a-9701-bd01ffa5631c-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-mfpkp\" (UID: \"e4edb853-0703-424a-9701-bd01ffa5631c\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-mfpkp"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.424555 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/920e48e1-7f21-462b-82da-70c9a6e589ba-metrics-certs\") pod \"frr-k8s-p2rfl\" (UID: \"920e48e1-7f21-462b-82da-70c9a6e589ba\") " pod="metallb-system/frr-k8s-p2rfl"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.435261 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqczs\" (UniqueName: \"kubernetes.io/projected/e4edb853-0703-424a-9701-bd01ffa5631c-kube-api-access-gqczs\") pod \"frr-k8s-webhook-server-7fcb986d4-mfpkp\" (UID: \"e4edb853-0703-424a-9701-bd01ffa5631c\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-mfpkp"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.435581 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6j4v\" (UniqueName: \"kubernetes.io/projected/920e48e1-7f21-462b-82da-70c9a6e589ba-kube-api-access-p6j4v\") pod \"frr-k8s-p2rfl\" (UID: \"920e48e1-7f21-462b-82da-70c9a6e589ba\") " pod="metallb-system/frr-k8s-p2rfl"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.491989 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-mfpkp"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.501958 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-p2rfl"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.516946 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9033e241-ad62-4fcc-92f1-8499f42f6310-metrics-certs\") pod \"speaker-w7lhj\" (UID: \"9033e241-ad62-4fcc-92f1-8499f42f6310\") " pod="metallb-system/speaker-w7lhj"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.517016 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b70a26b7-43cb-4e26-95c0-f67ef15a0c34-metrics-certs\") pod \"controller-f8648f98b-8fm55\" (UID: \"b70a26b7-43cb-4e26-95c0-f67ef15a0c34\") " pod="metallb-system/controller-f8648f98b-8fm55"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.517069 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/9033e241-ad62-4fcc-92f1-8499f42f6310-memberlist\") pod \"speaker-w7lhj\" (UID: \"9033e241-ad62-4fcc-92f1-8499f42f6310\") " pod="metallb-system/speaker-w7lhj"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.517091 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5p9xt\" (UniqueName: \"kubernetes.io/projected/9033e241-ad62-4fcc-92f1-8499f42f6310-kube-api-access-5p9xt\") pod \"speaker-w7lhj\" (UID: \"9033e241-ad62-4fcc-92f1-8499f42f6310\") " pod="metallb-system/speaker-w7lhj"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.517125 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8v4t\" (UniqueName: \"kubernetes.io/projected/b70a26b7-43cb-4e26-95c0-f67ef15a0c34-kube-api-access-x8v4t\") pod \"controller-f8648f98b-8fm55\" (UID: \"b70a26b7-43cb-4e26-95c0-f67ef15a0c34\") " pod="metallb-system/controller-f8648f98b-8fm55"
Dec 02 09:38:07 crc kubenswrapper[4781]: E1202 09:38:07.517659 4781 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Dec 02 09:38:07 crc kubenswrapper[4781]: E1202 09:38:07.517742 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9033e241-ad62-4fcc-92f1-8499f42f6310-memberlist podName:9033e241-ad62-4fcc-92f1-8499f42f6310 nodeName:}" failed. No retries permitted until 2025-12-02 09:38:08.017715913 +0000 UTC m=+1050.841589802 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/9033e241-ad62-4fcc-92f1-8499f42f6310-memberlist") pod "speaker-w7lhj" (UID: "9033e241-ad62-4fcc-92f1-8499f42f6310") : secret "metallb-memberlist" not found
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.517672 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b70a26b7-43cb-4e26-95c0-f67ef15a0c34-cert\") pod \"controller-f8648f98b-8fm55\" (UID: \"b70a26b7-43cb-4e26-95c0-f67ef15a0c34\") " pod="metallb-system/controller-f8648f98b-8fm55"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.517954 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/9033e241-ad62-4fcc-92f1-8499f42f6310-metallb-excludel2\") pod \"speaker-w7lhj\" (UID: \"9033e241-ad62-4fcc-92f1-8499f42f6310\") " pod="metallb-system/speaker-w7lhj"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.518558 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/9033e241-ad62-4fcc-92f1-8499f42f6310-metallb-excludel2\") pod \"speaker-w7lhj\" (UID: \"9033e241-ad62-4fcc-92f1-8499f42f6310\") " pod="metallb-system/speaker-w7lhj"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.519798 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.528574 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9033e241-ad62-4fcc-92f1-8499f42f6310-metrics-certs\") pod \"speaker-w7lhj\" (UID: \"9033e241-ad62-4fcc-92f1-8499f42f6310\") " pod="metallb-system/speaker-w7lhj"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.530779 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b70a26b7-43cb-4e26-95c0-f67ef15a0c34-metrics-certs\") pod \"controller-f8648f98b-8fm55\" (UID: \"b70a26b7-43cb-4e26-95c0-f67ef15a0c34\") " pod="metallb-system/controller-f8648f98b-8fm55"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.531413 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b70a26b7-43cb-4e26-95c0-f67ef15a0c34-cert\") pod \"controller-f8648f98b-8fm55\" (UID: \"b70a26b7-43cb-4e26-95c0-f67ef15a0c34\") " pod="metallb-system/controller-f8648f98b-8fm55"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.533610 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p9xt\" (UniqueName: \"kubernetes.io/projected/9033e241-ad62-4fcc-92f1-8499f42f6310-kube-api-access-5p9xt\") pod \"speaker-w7lhj\" (UID: \"9033e241-ad62-4fcc-92f1-8499f42f6310\") " pod="metallb-system/speaker-w7lhj"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.538353 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8v4t\" (UniqueName: \"kubernetes.io/projected/b70a26b7-43cb-4e26-95c0-f67ef15a0c34-kube-api-access-x8v4t\") pod \"controller-f8648f98b-8fm55\" (UID: \"b70a26b7-43cb-4e26-95c0-f67ef15a0c34\") " pod="metallb-system/controller-f8648f98b-8fm55"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.610458 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-8fm55"
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.779543 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-8fm55"]
Dec 02 09:38:07 crc kubenswrapper[4781]: W1202 09:38:07.782771 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb70a26b7_43cb_4e26_95c0_f67ef15a0c34.slice/crio-c1887cdcda6834db3a8c72a5de4a23bfe8028aba65828ce5bd2e5586dc95ffd4 WatchSource:0}: Error finding container c1887cdcda6834db3a8c72a5de4a23bfe8028aba65828ce5bd2e5586dc95ffd4: Status 404 returned error can't find the container with id c1887cdcda6834db3a8c72a5de4a23bfe8028aba65828ce5bd2e5586dc95ffd4
Dec 02 09:38:07 crc kubenswrapper[4781]: I1202 09:38:07.951325 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-mfpkp"]
Dec 02 09:38:07 crc kubenswrapper[4781]: W1202 09:38:07.963982 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4edb853_0703_424a_9701_bd01ffa5631c.slice/crio-fc67cfe59c38db3491666bdf91c0737c6321b91383eed8d7329648d0ff0f3e04 WatchSource:0}: Error finding container fc67cfe59c38db3491666bdf91c0737c6321b91383eed8d7329648d0ff0f3e04: Status 404 returned error can't find the container with id fc67cfe59c38db3491666bdf91c0737c6321b91383eed8d7329648d0ff0f3e04
Dec 02 09:38:08 crc kubenswrapper[4781]: I1202 09:38:08.025494 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/9033e241-ad62-4fcc-92f1-8499f42f6310-memberlist\") pod \"speaker-w7lhj\" (UID: \"9033e241-ad62-4fcc-92f1-8499f42f6310\") " pod="metallb-system/speaker-w7lhj"
Dec 02 09:38:08 crc kubenswrapper[4781]: E1202 09:38:08.025678 4781 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Dec 02 09:38:08 crc kubenswrapper[4781]: E1202 09:38:08.025756 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9033e241-ad62-4fcc-92f1-8499f42f6310-memberlist podName:9033e241-ad62-4fcc-92f1-8499f42f6310 nodeName:}" failed. No retries permitted until 2025-12-02 09:38:09.025736237 +0000 UTC m=+1051.849610116 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/9033e241-ad62-4fcc-92f1-8499f42f6310-memberlist") pod "speaker-w7lhj" (UID: "9033e241-ad62-4fcc-92f1-8499f42f6310") : secret "metallb-memberlist" not found Dec 02 09:38:08 crc kubenswrapper[4781]: I1202 09:38:08.276221 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-mfpkp" event={"ID":"e4edb853-0703-424a-9701-bd01ffa5631c","Type":"ContainerStarted","Data":"fc67cfe59c38db3491666bdf91c0737c6321b91383eed8d7329648d0ff0f3e04"} Dec 02 09:38:08 crc kubenswrapper[4781]: I1202 09:38:08.278091 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-8fm55" event={"ID":"b70a26b7-43cb-4e26-95c0-f67ef15a0c34","Type":"ContainerStarted","Data":"736157e38b991e2639377a23246fe3af343bda82e0ce09fb4f3a895c7a4640b0"} Dec 02 09:38:08 crc kubenswrapper[4781]: I1202 09:38:08.278118 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-8fm55" event={"ID":"b70a26b7-43cb-4e26-95c0-f67ef15a0c34","Type":"ContainerStarted","Data":"02e50077c80ccc19eb67ea8824f9a822442d9633dd942e14839108f84c786f10"} Dec 02 09:38:08 crc kubenswrapper[4781]: I1202 09:38:08.278128 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-8fm55" event={"ID":"b70a26b7-43cb-4e26-95c0-f67ef15a0c34","Type":"ContainerStarted","Data":"c1887cdcda6834db3a8c72a5de4a23bfe8028aba65828ce5bd2e5586dc95ffd4"} Dec 02 09:38:08 crc kubenswrapper[4781]: I1202 09:38:08.278229 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-8fm55" Dec 02 09:38:08 crc kubenswrapper[4781]: I1202 09:38:08.279034 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p2rfl" event={"ID":"920e48e1-7f21-462b-82da-70c9a6e589ba","Type":"ContainerStarted","Data":"bb82fec6307d462ee5a3ac9b8ccea44d6fdf47dd048ed550ac01bebc39db7c44"} Dec 02 09:38:08 crc kubenswrapper[4781]: I1202 09:38:08.295198 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-8fm55" podStartSLOduration=1.295177016 podStartE2EDuration="1.295177016s" podCreationTimestamp="2025-12-02 09:38:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:38:08.292267118 +0000 UTC m=+1051.116140987" watchObservedRunningTime="2025-12-02 09:38:08.295177016 +0000 UTC m=+1051.119050895" Dec 02 09:38:09 crc kubenswrapper[4781]: I1202 09:38:09.038667 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/9033e241-ad62-4fcc-92f1-8499f42f6310-memberlist\") pod \"speaker-w7lhj\" (UID: \"9033e241-ad62-4fcc-92f1-8499f42f6310\") " pod="metallb-system/speaker-w7lhj" Dec 02 09:38:09 crc kubenswrapper[4781]: I1202 09:38:09.044608 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/9033e241-ad62-4fcc-92f1-8499f42f6310-memberlist\") pod \"speaker-w7lhj\" (UID: \"9033e241-ad62-4fcc-92f1-8499f42f6310\") " pod="metallb-system/speaker-w7lhj" Dec 02 09:38:09 crc kubenswrapper[4781]: I1202 09:38:09.081795 4781 util.go:30] "No sandbox for pod can be found. 
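The paired secret.go / nestedpendingoperations.go failures above are a single mount operation retrying: metallb-memberlist is generated at runtime rather than shipped with the manifests, so the speaker pod's memberlist volume cannot mount until the secret exists — it shows up here right after the controller pod comes up, and the mount succeeds at 09:38:09.044608. The retry delays, 500ms after the first failure and 1s after the second, fit a per-operation exponential backoff. A minimal sketch of that shape, assuming a doubling factor and a ceiling (neither is taken from this log beyond the two observed delays):

// Sketch only, not kubelet source: reproduces the doubling
// durationBeforeRetry sequence visible in the nestedpendingoperations
// entries above (500ms, then 1s, ...).
package main

import (
	"fmt"
	"time"
)

func main() {
	delay := 500 * time.Millisecond                // first durationBeforeRetry in the log
	const maxDelay = 2*time.Minute + 2*time.Second // assumed ceiling for the sketch
	for attempt := 1; attempt <= 4; attempt++ {
		fmt.Printf("attempt %d failed: no retries permitted for %v\n", attempt, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}

Once the secret exists, the next scheduled attempt mounts the volume normally, which is why only two failures appear before the 09:38:09 success.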
Need to start a new one" pod="metallb-system/speaker-w7lhj" Dec 02 09:38:09 crc kubenswrapper[4781]: W1202 09:38:09.103224 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9033e241_ad62_4fcc_92f1_8499f42f6310.slice/crio-90b295248cf43af99b42cf5fa8afa4db61215789e1ace523c6e3f53c0e8fe1bc WatchSource:0}: Error finding container 90b295248cf43af99b42cf5fa8afa4db61215789e1ace523c6e3f53c0e8fe1bc: Status 404 returned error can't find the container with id 90b295248cf43af99b42cf5fa8afa4db61215789e1ace523c6e3f53c0e8fe1bc Dec 02 09:38:09 crc kubenswrapper[4781]: I1202 09:38:09.287131 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-w7lhj" event={"ID":"9033e241-ad62-4fcc-92f1-8499f42f6310","Type":"ContainerStarted","Data":"90b295248cf43af99b42cf5fa8afa4db61215789e1ace523c6e3f53c0e8fe1bc"} Dec 02 09:38:10 crc kubenswrapper[4781]: I1202 09:38:10.309684 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-w7lhj" event={"ID":"9033e241-ad62-4fcc-92f1-8499f42f6310","Type":"ContainerStarted","Data":"60b92ca786a9f061dffe021118877f3f0eba86dcd0649ecb5cb0bef0a14e96d7"} Dec 02 09:38:10 crc kubenswrapper[4781]: I1202 09:38:10.309727 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-w7lhj" event={"ID":"9033e241-ad62-4fcc-92f1-8499f42f6310","Type":"ContainerStarted","Data":"29601ec7dbf6732f8a0d6fa801a0c35465f65f51dc418b4855c352ad19b18539"} Dec 02 09:38:10 crc kubenswrapper[4781]: I1202 09:38:10.311226 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-w7lhj" Dec 02 09:38:10 crc kubenswrapper[4781]: I1202 09:38:10.345423 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-w7lhj" podStartSLOduration=3.345392104 podStartE2EDuration="3.345392104s" podCreationTimestamp="2025-12-02 09:38:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:38:10.338707265 +0000 UTC m=+1053.162581154" watchObservedRunningTime="2025-12-02 09:38:10.345392104 +0000 UTC m=+1053.169266003" Dec 02 09:38:15 crc kubenswrapper[4781]: I1202 09:38:15.343510 4781 generic.go:334] "Generic (PLEG): container finished" podID="920e48e1-7f21-462b-82da-70c9a6e589ba" containerID="ce34fe34955b3310cca94ee2d8640eefa91ad9d510a99f9502262ba3165802d7" exitCode=0 Dec 02 09:38:15 crc kubenswrapper[4781]: I1202 09:38:15.343556 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p2rfl" event={"ID":"920e48e1-7f21-462b-82da-70c9a6e589ba","Type":"ContainerDied","Data":"ce34fe34955b3310cca94ee2d8640eefa91ad9d510a99f9502262ba3165802d7"} Dec 02 09:38:15 crc kubenswrapper[4781]: I1202 09:38:15.345621 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-mfpkp" event={"ID":"e4edb853-0703-424a-9701-bd01ffa5631c","Type":"ContainerStarted","Data":"5a13a3c369992d906c521b2534f95de18f19bc31dd9488726ee172880d95d08d"} Dec 02 09:38:15 crc kubenswrapper[4781]: I1202 09:38:15.345755 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-mfpkp" Dec 02 09:38:16 crc kubenswrapper[4781]: I1202 09:38:16.353514 4781 generic.go:334] "Generic (PLEG): container finished" podID="920e48e1-7f21-462b-82da-70c9a6e589ba" 
containerID="0e94983127500171a65513128f072cc2e9a9adf5dddf6a6c71e603493f8072bb" exitCode=0 Dec 02 09:38:16 crc kubenswrapper[4781]: I1202 09:38:16.353567 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p2rfl" event={"ID":"920e48e1-7f21-462b-82da-70c9a6e589ba","Type":"ContainerDied","Data":"0e94983127500171a65513128f072cc2e9a9adf5dddf6a6c71e603493f8072bb"} Dec 02 09:38:16 crc kubenswrapper[4781]: I1202 09:38:16.398840 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-mfpkp" podStartSLOduration=2.717562215 podStartE2EDuration="9.398820867s" podCreationTimestamp="2025-12-02 09:38:07 +0000 UTC" firstStartedPulling="2025-12-02 09:38:07.965419293 +0000 UTC m=+1050.789293172" lastFinishedPulling="2025-12-02 09:38:14.646677955 +0000 UTC m=+1057.470551824" observedRunningTime="2025-12-02 09:38:15.402377085 +0000 UTC m=+1058.226250964" watchObservedRunningTime="2025-12-02 09:38:16.398820867 +0000 UTC m=+1059.222694766" Dec 02 09:38:17 crc kubenswrapper[4781]: I1202 09:38:17.362084 4781 generic.go:334] "Generic (PLEG): container finished" podID="920e48e1-7f21-462b-82da-70c9a6e589ba" containerID="cf8285001e6d132839d0f8416a019c6d53ffcb68d26d2e92204c9c8cee995bb1" exitCode=0 Dec 02 09:38:17 crc kubenswrapper[4781]: I1202 09:38:17.362124 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p2rfl" event={"ID":"920e48e1-7f21-462b-82da-70c9a6e589ba","Type":"ContainerDied","Data":"cf8285001e6d132839d0f8416a019c6d53ffcb68d26d2e92204c9c8cee995bb1"} Dec 02 09:38:17 crc kubenswrapper[4781]: I1202 09:38:17.615220 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-8fm55" Dec 02 09:38:19 crc kubenswrapper[4781]: I1202 09:38:19.087319 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-w7lhj" Dec 02 09:38:19 crc kubenswrapper[4781]: I1202 09:38:19.377995 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p2rfl" event={"ID":"920e48e1-7f21-462b-82da-70c9a6e589ba","Type":"ContainerStarted","Data":"d9585780e3ffb4f840141cb7f02ffb0deafb9e3238fbdf657c0c49a382a28bc2"} Dec 02 09:38:19 crc kubenswrapper[4781]: I1202 09:38:19.378038 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p2rfl" event={"ID":"920e48e1-7f21-462b-82da-70c9a6e589ba","Type":"ContainerStarted","Data":"b86c7e5fc7d59b4d60ec5660e31c830188c918f3fbe8c8a2281a5debbec02817"} Dec 02 09:38:20 crc kubenswrapper[4781]: I1202 09:38:20.386767 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p2rfl" event={"ID":"920e48e1-7f21-462b-82da-70c9a6e589ba","Type":"ContainerStarted","Data":"d303bf6cae63217648fbbd8262a6ee079c698fd6abe3de0636b05b866a893283"} Dec 02 09:38:20 crc kubenswrapper[4781]: I1202 09:38:20.387131 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p2rfl" event={"ID":"920e48e1-7f21-462b-82da-70c9a6e589ba","Type":"ContainerStarted","Data":"688c6ca20efccacba8f805f74ee8fd78a36875abfe9f4a11c20e7f6de78b8dd1"} Dec 02 09:38:21 crc kubenswrapper[4781]: I1202 09:38:21.398109 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p2rfl" event={"ID":"920e48e1-7f21-462b-82da-70c9a6e589ba","Type":"ContainerStarted","Data":"044f150820bddfaf05bdcd96c233ba82b785120b43f13115dd72ef868b3ceffa"} Dec 02 09:38:21 crc kubenswrapper[4781]: I1202 09:38:21.398152 4781 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p2rfl" event={"ID":"920e48e1-7f21-462b-82da-70c9a6e589ba","Type":"ContainerStarted","Data":"be628f761fa6136823b324174c2b162ce2d6bb2396b29cf3c6744f6e429b7578"} Dec 02 09:38:21 crc kubenswrapper[4781]: I1202 09:38:21.398258 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-p2rfl" Dec 02 09:38:21 crc kubenswrapper[4781]: I1202 09:38:21.427744 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-p2rfl" podStartSLOduration=7.445210504 podStartE2EDuration="14.427713286s" podCreationTimestamp="2025-12-02 09:38:07 +0000 UTC" firstStartedPulling="2025-12-02 09:38:07.669492614 +0000 UTC m=+1050.493366493" lastFinishedPulling="2025-12-02 09:38:14.651995396 +0000 UTC m=+1057.475869275" observedRunningTime="2025-12-02 09:38:21.420252016 +0000 UTC m=+1064.244125905" watchObservedRunningTime="2025-12-02 09:38:21.427713286 +0000 UTC m=+1064.251587205" Dec 02 09:38:22 crc kubenswrapper[4781]: I1202 09:38:22.130647 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-jnwgz"] Dec 02 09:38:22 crc kubenswrapper[4781]: I1202 09:38:22.132080 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-jnwgz" Dec 02 09:38:22 crc kubenswrapper[4781]: I1202 09:38:22.137972 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 02 09:38:22 crc kubenswrapper[4781]: I1202 09:38:22.148406 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-r4qcw" Dec 02 09:38:22 crc kubenswrapper[4781]: I1202 09:38:22.148507 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 02 09:38:22 crc kubenswrapper[4781]: I1202 09:38:22.151077 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-jnwgz"] Dec 02 09:38:22 crc kubenswrapper[4781]: I1202 09:38:22.205655 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bf75s\" (UniqueName: \"kubernetes.io/projected/637bc30c-b548-44fc-81ec-b19d27bde6a3-kube-api-access-bf75s\") pod \"openstack-operator-index-jnwgz\" (UID: \"637bc30c-b548-44fc-81ec-b19d27bde6a3\") " pod="openstack-operators/openstack-operator-index-jnwgz" Dec 02 09:38:22 crc kubenswrapper[4781]: I1202 09:38:22.306880 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bf75s\" (UniqueName: \"kubernetes.io/projected/637bc30c-b548-44fc-81ec-b19d27bde6a3-kube-api-access-bf75s\") pod \"openstack-operator-index-jnwgz\" (UID: \"637bc30c-b548-44fc-81ec-b19d27bde6a3\") " pod="openstack-operators/openstack-operator-index-jnwgz" Dec 02 09:38:22 crc kubenswrapper[4781]: I1202 09:38:22.331859 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bf75s\" (UniqueName: \"kubernetes.io/projected/637bc30c-b548-44fc-81ec-b19d27bde6a3-kube-api-access-bf75s\") pod \"openstack-operator-index-jnwgz\" (UID: \"637bc30c-b548-44fc-81ec-b19d27bde6a3\") " pod="openstack-operators/openstack-operator-index-jnwgz" Dec 02 09:38:22 crc kubenswrapper[4781]: I1202 09:38:22.462719 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-jnwgz" Dec 02 09:38:22 crc kubenswrapper[4781]: I1202 09:38:22.502462 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-p2rfl" Dec 02 09:38:22 crc kubenswrapper[4781]: I1202 09:38:22.547308 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-p2rfl" Dec 02 09:38:22 crc kubenswrapper[4781]: I1202 09:38:22.710284 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-jnwgz"] Dec 02 09:38:22 crc kubenswrapper[4781]: W1202 09:38:22.713774 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod637bc30c_b548_44fc_81ec_b19d27bde6a3.slice/crio-ce6c708a0077b6ebb1498f24210333e3a303e7fddf7937f9aea1fd1cca79450a WatchSource:0}: Error finding container ce6c708a0077b6ebb1498f24210333e3a303e7fddf7937f9aea1fd1cca79450a: Status 404 returned error can't find the container with id ce6c708a0077b6ebb1498f24210333e3a303e7fddf7937f9aea1fd1cca79450a Dec 02 09:38:23 crc kubenswrapper[4781]: I1202 09:38:23.408775 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jnwgz" event={"ID":"637bc30c-b548-44fc-81ec-b19d27bde6a3","Type":"ContainerStarted","Data":"ce6c708a0077b6ebb1498f24210333e3a303e7fddf7937f9aea1fd1cca79450a"} Dec 02 09:38:25 crc kubenswrapper[4781]: I1202 09:38:25.710538 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-jnwgz"] Dec 02 09:38:26 crc kubenswrapper[4781]: I1202 09:38:26.319707 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-xg78j"] Dec 02 09:38:26 crc kubenswrapper[4781]: I1202 09:38:26.321332 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-xg78j" Dec 02 09:38:26 crc kubenswrapper[4781]: I1202 09:38:26.324054 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-xg78j"] Dec 02 09:38:26 crc kubenswrapper[4781]: I1202 09:38:26.359051 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwfgm\" (UniqueName: \"kubernetes.io/projected/0ac791f1-2459-4266-a082-498b66e549b4-kube-api-access-rwfgm\") pod \"openstack-operator-index-xg78j\" (UID: \"0ac791f1-2459-4266-a082-498b66e549b4\") " pod="openstack-operators/openstack-operator-index-xg78j" Dec 02 09:38:26 crc kubenswrapper[4781]: I1202 09:38:26.459828 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwfgm\" (UniqueName: \"kubernetes.io/projected/0ac791f1-2459-4266-a082-498b66e549b4-kube-api-access-rwfgm\") pod \"openstack-operator-index-xg78j\" (UID: \"0ac791f1-2459-4266-a082-498b66e549b4\") " pod="openstack-operators/openstack-operator-index-xg78j" Dec 02 09:38:26 crc kubenswrapper[4781]: I1202 09:38:26.479709 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwfgm\" (UniqueName: \"kubernetes.io/projected/0ac791f1-2459-4266-a082-498b66e549b4-kube-api-access-rwfgm\") pod \"openstack-operator-index-xg78j\" (UID: \"0ac791f1-2459-4266-a082-498b66e549b4\") " pod="openstack-operators/openstack-operator-index-xg78j" Dec 02 09:38:26 crc kubenswrapper[4781]: I1202 09:38:26.644861 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-xg78j" Dec 02 09:38:27 crc kubenswrapper[4781]: I1202 09:38:27.513702 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-mfpkp" Dec 02 09:38:27 crc kubenswrapper[4781]: I1202 09:38:27.955188 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-xg78j"] Dec 02 09:38:28 crc kubenswrapper[4781]: I1202 09:38:28.442667 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-xg78j" event={"ID":"0ac791f1-2459-4266-a082-498b66e549b4","Type":"ContainerStarted","Data":"4ae4157ec2c4d7936f5646cd58ccfa8bdac8495e427fcfdebd87da35e065f514"} Dec 02 09:38:29 crc kubenswrapper[4781]: I1202 09:38:29.450960 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jnwgz" event={"ID":"637bc30c-b548-44fc-81ec-b19d27bde6a3","Type":"ContainerStarted","Data":"7a1ac0cf808ad8a06ff98b5fc70f70288e95143876b90e1aed4c62039ac4124e"} Dec 02 09:38:29 crc kubenswrapper[4781]: I1202 09:38:29.451138 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-jnwgz" podUID="637bc30c-b548-44fc-81ec-b19d27bde6a3" containerName="registry-server" containerID="cri-o://7a1ac0cf808ad8a06ff98b5fc70f70288e95143876b90e1aed4c62039ac4124e" gracePeriod=2 Dec 02 09:38:29 crc kubenswrapper[4781]: I1202 09:38:29.453250 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-xg78j" event={"ID":"0ac791f1-2459-4266-a082-498b66e549b4","Type":"ContainerStarted","Data":"a60ab4b6692ff5560b87bd22bd3866d05c99ca17bcb5974ff68e4b354506c0a5"} Dec 02 09:38:29 crc kubenswrapper[4781]: I1202 09:38:29.476059 4781 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack-operators/openstack-operator-index-jnwgz" podStartSLOduration=2.414719889 podStartE2EDuration="7.476041177s" podCreationTimestamp="2025-12-02 09:38:22 +0000 UTC" firstStartedPulling="2025-12-02 09:38:22.718155994 +0000 UTC m=+1065.542029863" lastFinishedPulling="2025-12-02 09:38:27.779477272 +0000 UTC m=+1070.603351151" observedRunningTime="2025-12-02 09:38:29.467077208 +0000 UTC m=+1072.290951107" watchObservedRunningTime="2025-12-02 09:38:29.476041177 +0000 UTC m=+1072.299915056" Dec 02 09:38:29 crc kubenswrapper[4781]: I1202 09:38:29.484659 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-xg78j" podStartSLOduration=2.467957253 podStartE2EDuration="3.484646737s" podCreationTimestamp="2025-12-02 09:38:26 +0000 UTC" firstStartedPulling="2025-12-02 09:38:27.961829812 +0000 UTC m=+1070.785703691" lastFinishedPulling="2025-12-02 09:38:28.978519296 +0000 UTC m=+1071.802393175" observedRunningTime="2025-12-02 09:38:29.483153558 +0000 UTC m=+1072.307027437" watchObservedRunningTime="2025-12-02 09:38:29.484646737 +0000 UTC m=+1072.308520626" Dec 02 09:38:29 crc kubenswrapper[4781]: I1202 09:38:29.801364 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-jnwgz" Dec 02 09:38:29 crc kubenswrapper[4781]: I1202 09:38:29.905173 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf75s\" (UniqueName: \"kubernetes.io/projected/637bc30c-b548-44fc-81ec-b19d27bde6a3-kube-api-access-bf75s\") pod \"637bc30c-b548-44fc-81ec-b19d27bde6a3\" (UID: \"637bc30c-b548-44fc-81ec-b19d27bde6a3\") " Dec 02 09:38:29 crc kubenswrapper[4781]: I1202 09:38:29.910218 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/637bc30c-b548-44fc-81ec-b19d27bde6a3-kube-api-access-bf75s" (OuterVolumeSpecName: "kube-api-access-bf75s") pod "637bc30c-b548-44fc-81ec-b19d27bde6a3" (UID: "637bc30c-b548-44fc-81ec-b19d27bde6a3"). InnerVolumeSpecName "kube-api-access-bf75s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:38:30 crc kubenswrapper[4781]: I1202 09:38:30.006248 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf75s\" (UniqueName: \"kubernetes.io/projected/637bc30c-b548-44fc-81ec-b19d27bde6a3-kube-api-access-bf75s\") on node \"crc\" DevicePath \"\"" Dec 02 09:38:30 crc kubenswrapper[4781]: I1202 09:38:30.460690 4781 generic.go:334] "Generic (PLEG): container finished" podID="637bc30c-b548-44fc-81ec-b19d27bde6a3" containerID="7a1ac0cf808ad8a06ff98b5fc70f70288e95143876b90e1aed4c62039ac4124e" exitCode=0 Dec 02 09:38:30 crc kubenswrapper[4781]: I1202 09:38:30.460833 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jnwgz" event={"ID":"637bc30c-b548-44fc-81ec-b19d27bde6a3","Type":"ContainerDied","Data":"7a1ac0cf808ad8a06ff98b5fc70f70288e95143876b90e1aed4c62039ac4124e"} Dec 02 09:38:30 crc kubenswrapper[4781]: I1202 09:38:30.460891 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jnwgz" event={"ID":"637bc30c-b548-44fc-81ec-b19d27bde6a3","Type":"ContainerDied","Data":"ce6c708a0077b6ebb1498f24210333e3a303e7fddf7937f9aea1fd1cca79450a"} Dec 02 09:38:30 crc kubenswrapper[4781]: I1202 09:38:30.460916 4781 scope.go:117] "RemoveContainer" containerID="7a1ac0cf808ad8a06ff98b5fc70f70288e95143876b90e1aed4c62039ac4124e" Dec 02 09:38:30 crc kubenswrapper[4781]: I1202 09:38:30.460962 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-jnwgz" Dec 02 09:38:30 crc kubenswrapper[4781]: I1202 09:38:30.481021 4781 scope.go:117] "RemoveContainer" containerID="7a1ac0cf808ad8a06ff98b5fc70f70288e95143876b90e1aed4c62039ac4124e" Dec 02 09:38:30 crc kubenswrapper[4781]: E1202 09:38:30.481408 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a1ac0cf808ad8a06ff98b5fc70f70288e95143876b90e1aed4c62039ac4124e\": container with ID starting with 7a1ac0cf808ad8a06ff98b5fc70f70288e95143876b90e1aed4c62039ac4124e not found: ID does not exist" containerID="7a1ac0cf808ad8a06ff98b5fc70f70288e95143876b90e1aed4c62039ac4124e" Dec 02 09:38:30 crc kubenswrapper[4781]: I1202 09:38:30.481444 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a1ac0cf808ad8a06ff98b5fc70f70288e95143876b90e1aed4c62039ac4124e"} err="failed to get container status \"7a1ac0cf808ad8a06ff98b5fc70f70288e95143876b90e1aed4c62039ac4124e\": rpc error: code = NotFound desc = could not find container \"7a1ac0cf808ad8a06ff98b5fc70f70288e95143876b90e1aed4c62039ac4124e\": container with ID starting with 7a1ac0cf808ad8a06ff98b5fc70f70288e95143876b90e1aed4c62039ac4124e not found: ID does not exist" Dec 02 09:38:30 crc kubenswrapper[4781]: I1202 09:38:30.495006 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-jnwgz"] Dec 02 09:38:30 crc kubenswrapper[4781]: I1202 09:38:30.499450 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-jnwgz"] Dec 02 09:38:31 crc kubenswrapper[4781]: I1202 09:38:31.507585 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="637bc30c-b548-44fc-81ec-b19d27bde6a3" path="/var/lib/kubelet/pods/637bc30c-b548-44fc-81ec-b19d27bde6a3/volumes" Dec 02 09:38:36 crc kubenswrapper[4781]: I1202 09:38:36.645613 4781 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-xg78j" Dec 02 09:38:36 crc kubenswrapper[4781]: I1202 09:38:36.646143 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-xg78j" Dec 02 09:38:36 crc kubenswrapper[4781]: I1202 09:38:36.670464 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-xg78j" Dec 02 09:38:37 crc kubenswrapper[4781]: I1202 09:38:37.527327 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-p2rfl" Dec 02 09:38:37 crc kubenswrapper[4781]: I1202 09:38:37.545044 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-xg78j" Dec 02 09:38:38 crc kubenswrapper[4781]: I1202 09:38:38.944021 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/fdef1de09061a943cf0eb5556b972c2f56b8ba493ca7b87b57c4549318f7mcl"] Dec 02 09:38:38 crc kubenswrapper[4781]: E1202 09:38:38.944520 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="637bc30c-b548-44fc-81ec-b19d27bde6a3" containerName="registry-server" Dec 02 09:38:38 crc kubenswrapper[4781]: I1202 09:38:38.944534 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="637bc30c-b548-44fc-81ec-b19d27bde6a3" containerName="registry-server" Dec 02 09:38:38 crc kubenswrapper[4781]: I1202 09:38:38.944639 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="637bc30c-b548-44fc-81ec-b19d27bde6a3" containerName="registry-server" Dec 02 09:38:38 crc kubenswrapper[4781]: I1202 09:38:38.945498 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/fdef1de09061a943cf0eb5556b972c2f56b8ba493ca7b87b57c4549318f7mcl" Dec 02 09:38:38 crc kubenswrapper[4781]: I1202 09:38:38.948101 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-588x7" Dec 02 09:38:38 crc kubenswrapper[4781]: I1202 09:38:38.959704 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/fdef1de09061a943cf0eb5556b972c2f56b8ba493ca7b87b57c4549318f7mcl"] Dec 02 09:38:39 crc kubenswrapper[4781]: I1202 09:38:39.040041 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5da922a2-736d-4e49-b3f3-68adcbfc8d0b-util\") pod \"fdef1de09061a943cf0eb5556b972c2f56b8ba493ca7b87b57c4549318f7mcl\" (UID: \"5da922a2-736d-4e49-b3f3-68adcbfc8d0b\") " pod="openstack-operators/fdef1de09061a943cf0eb5556b972c2f56b8ba493ca7b87b57c4549318f7mcl" Dec 02 09:38:39 crc kubenswrapper[4781]: I1202 09:38:39.040089 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5da922a2-736d-4e49-b3f3-68adcbfc8d0b-bundle\") pod \"fdef1de09061a943cf0eb5556b972c2f56b8ba493ca7b87b57c4549318f7mcl\" (UID: \"5da922a2-736d-4e49-b3f3-68adcbfc8d0b\") " pod="openstack-operators/fdef1de09061a943cf0eb5556b972c2f56b8ba493ca7b87b57c4549318f7mcl" Dec 02 09:38:39 crc kubenswrapper[4781]: I1202 09:38:39.040129 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkhc8\" (UniqueName: \"kubernetes.io/projected/5da922a2-736d-4e49-b3f3-68adcbfc8d0b-kube-api-access-pkhc8\") pod \"fdef1de09061a943cf0eb5556b972c2f56b8ba493ca7b87b57c4549318f7mcl\" (UID: \"5da922a2-736d-4e49-b3f3-68adcbfc8d0b\") " pod="openstack-operators/fdef1de09061a943cf0eb5556b972c2f56b8ba493ca7b87b57c4549318f7mcl" Dec 02 09:38:39 crc kubenswrapper[4781]: I1202 09:38:39.141907 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5da922a2-736d-4e49-b3f3-68adcbfc8d0b-util\") pod \"fdef1de09061a943cf0eb5556b972c2f56b8ba493ca7b87b57c4549318f7mcl\" (UID: \"5da922a2-736d-4e49-b3f3-68adcbfc8d0b\") " pod="openstack-operators/fdef1de09061a943cf0eb5556b972c2f56b8ba493ca7b87b57c4549318f7mcl" Dec 02 09:38:39 crc kubenswrapper[4781]: I1202 09:38:39.142035 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5da922a2-736d-4e49-b3f3-68adcbfc8d0b-bundle\") pod \"fdef1de09061a943cf0eb5556b972c2f56b8ba493ca7b87b57c4549318f7mcl\" (UID: \"5da922a2-736d-4e49-b3f3-68adcbfc8d0b\") " pod="openstack-operators/fdef1de09061a943cf0eb5556b972c2f56b8ba493ca7b87b57c4549318f7mcl" Dec 02 09:38:39 crc kubenswrapper[4781]: I1202 09:38:39.142076 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkhc8\" (UniqueName: \"kubernetes.io/projected/5da922a2-736d-4e49-b3f3-68adcbfc8d0b-kube-api-access-pkhc8\") pod \"fdef1de09061a943cf0eb5556b972c2f56b8ba493ca7b87b57c4549318f7mcl\" (UID: \"5da922a2-736d-4e49-b3f3-68adcbfc8d0b\") " pod="openstack-operators/fdef1de09061a943cf0eb5556b972c2f56b8ba493ca7b87b57c4549318f7mcl" Dec 02 09:38:39 crc kubenswrapper[4781]: I1202 09:38:39.142604 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/5da922a2-736d-4e49-b3f3-68adcbfc8d0b-util\") pod \"fdef1de09061a943cf0eb5556b972c2f56b8ba493ca7b87b57c4549318f7mcl\" (UID: \"5da922a2-736d-4e49-b3f3-68adcbfc8d0b\") " pod="openstack-operators/fdef1de09061a943cf0eb5556b972c2f56b8ba493ca7b87b57c4549318f7mcl" Dec 02 09:38:39 crc kubenswrapper[4781]: I1202 09:38:39.142711 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5da922a2-736d-4e49-b3f3-68adcbfc8d0b-bundle\") pod \"fdef1de09061a943cf0eb5556b972c2f56b8ba493ca7b87b57c4549318f7mcl\" (UID: \"5da922a2-736d-4e49-b3f3-68adcbfc8d0b\") " pod="openstack-operators/fdef1de09061a943cf0eb5556b972c2f56b8ba493ca7b87b57c4549318f7mcl" Dec 02 09:38:39 crc kubenswrapper[4781]: I1202 09:38:39.161977 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkhc8\" (UniqueName: \"kubernetes.io/projected/5da922a2-736d-4e49-b3f3-68adcbfc8d0b-kube-api-access-pkhc8\") pod \"fdef1de09061a943cf0eb5556b972c2f56b8ba493ca7b87b57c4549318f7mcl\" (UID: \"5da922a2-736d-4e49-b3f3-68adcbfc8d0b\") " pod="openstack-operators/fdef1de09061a943cf0eb5556b972c2f56b8ba493ca7b87b57c4549318f7mcl" Dec 02 09:38:39 crc kubenswrapper[4781]: I1202 09:38:39.293355 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/fdef1de09061a943cf0eb5556b972c2f56b8ba493ca7b87b57c4549318f7mcl" Dec 02 09:38:39 crc kubenswrapper[4781]: I1202 09:38:39.672184 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/fdef1de09061a943cf0eb5556b972c2f56b8ba493ca7b87b57c4549318f7mcl"] Dec 02 09:38:39 crc kubenswrapper[4781]: W1202 09:38:39.678353 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5da922a2_736d_4e49_b3f3_68adcbfc8d0b.slice/crio-c62310f7381b4b9617cf243183b93de204eff0c60dee1d346ab15afe7ba57414 WatchSource:0}: Error finding container c62310f7381b4b9617cf243183b93de204eff0c60dee1d346ab15afe7ba57414: Status 404 returned error can't find the container with id c62310f7381b4b9617cf243183b93de204eff0c60dee1d346ab15afe7ba57414 Dec 02 09:38:40 crc kubenswrapper[4781]: I1202 09:38:40.523626 4781 generic.go:334] "Generic (PLEG): container finished" podID="5da922a2-736d-4e49-b3f3-68adcbfc8d0b" containerID="6fa2f78b11cdb9940b6359b5765e885f7012711f76937934e0dc845a6037bbb4" exitCode=0 Dec 02 09:38:40 crc kubenswrapper[4781]: I1202 09:38:40.523670 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/fdef1de09061a943cf0eb5556b972c2f56b8ba493ca7b87b57c4549318f7mcl" event={"ID":"5da922a2-736d-4e49-b3f3-68adcbfc8d0b","Type":"ContainerDied","Data":"6fa2f78b11cdb9940b6359b5765e885f7012711f76937934e0dc845a6037bbb4"} Dec 02 09:38:40 crc kubenswrapper[4781]: I1202 09:38:40.523876 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/fdef1de09061a943cf0eb5556b972c2f56b8ba493ca7b87b57c4549318f7mcl" event={"ID":"5da922a2-736d-4e49-b3f3-68adcbfc8d0b","Type":"ContainerStarted","Data":"c62310f7381b4b9617cf243183b93de204eff0c60dee1d346ab15afe7ba57414"} Dec 02 09:38:41 crc kubenswrapper[4781]: I1202 09:38:41.532321 4781 generic.go:334] "Generic (PLEG): container finished" podID="5da922a2-736d-4e49-b3f3-68adcbfc8d0b" containerID="fe63710c41aa1d461b77d4b2521e2c40619ccf5b0c87681c87c1a165586ca63f" exitCode=0 Dec 02 09:38:41 crc kubenswrapper[4781]: I1202 09:38:41.532425 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/fdef1de09061a943cf0eb5556b972c2f56b8ba493ca7b87b57c4549318f7mcl" event={"ID":"5da922a2-736d-4e49-b3f3-68adcbfc8d0b","Type":"ContainerDied","Data":"fe63710c41aa1d461b77d4b2521e2c40619ccf5b0c87681c87c1a165586ca63f"} Dec 02 09:38:42 crc kubenswrapper[4781]: I1202 09:38:42.544255 4781 generic.go:334] "Generic (PLEG): container finished" podID="5da922a2-736d-4e49-b3f3-68adcbfc8d0b" containerID="fbd5ecaf32c3e8f0796ae44efb3423603aa7c82d2c9963b57a48aaa46c04fbb8" exitCode=0 Dec 02 09:38:42 crc kubenswrapper[4781]: I1202 09:38:42.544346 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/fdef1de09061a943cf0eb5556b972c2f56b8ba493ca7b87b57c4549318f7mcl" event={"ID":"5da922a2-736d-4e49-b3f3-68adcbfc8d0b","Type":"ContainerDied","Data":"fbd5ecaf32c3e8f0796ae44efb3423603aa7c82d2c9963b57a48aaa46c04fbb8"} Dec 02 09:38:43 crc kubenswrapper[4781]: I1202 09:38:43.882267 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/fdef1de09061a943cf0eb5556b972c2f56b8ba493ca7b87b57c4549318f7mcl" Dec 02 09:38:44 crc kubenswrapper[4781]: I1202 09:38:44.011542 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5da922a2-736d-4e49-b3f3-68adcbfc8d0b-bundle\") pod \"5da922a2-736d-4e49-b3f3-68adcbfc8d0b\" (UID: \"5da922a2-736d-4e49-b3f3-68adcbfc8d0b\") " Dec 02 09:38:44 crc kubenswrapper[4781]: I1202 09:38:44.011638 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkhc8\" (UniqueName: \"kubernetes.io/projected/5da922a2-736d-4e49-b3f3-68adcbfc8d0b-kube-api-access-pkhc8\") pod \"5da922a2-736d-4e49-b3f3-68adcbfc8d0b\" (UID: \"5da922a2-736d-4e49-b3f3-68adcbfc8d0b\") " Dec 02 09:38:44 crc kubenswrapper[4781]: I1202 09:38:44.012401 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5da922a2-736d-4e49-b3f3-68adcbfc8d0b-bundle" (OuterVolumeSpecName: "bundle") pod "5da922a2-736d-4e49-b3f3-68adcbfc8d0b" (UID: "5da922a2-736d-4e49-b3f3-68adcbfc8d0b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:38:44 crc kubenswrapper[4781]: I1202 09:38:44.012831 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5da922a2-736d-4e49-b3f3-68adcbfc8d0b-util\") pod \"5da922a2-736d-4e49-b3f3-68adcbfc8d0b\" (UID: \"5da922a2-736d-4e49-b3f3-68adcbfc8d0b\") " Dec 02 09:38:44 crc kubenswrapper[4781]: I1202 09:38:44.013284 4781 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5da922a2-736d-4e49-b3f3-68adcbfc8d0b-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:38:44 crc kubenswrapper[4781]: I1202 09:38:44.018870 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5da922a2-736d-4e49-b3f3-68adcbfc8d0b-kube-api-access-pkhc8" (OuterVolumeSpecName: "kube-api-access-pkhc8") pod "5da922a2-736d-4e49-b3f3-68adcbfc8d0b" (UID: "5da922a2-736d-4e49-b3f3-68adcbfc8d0b"). InnerVolumeSpecName "kube-api-access-pkhc8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:38:44 crc kubenswrapper[4781]: I1202 09:38:44.034566 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5da922a2-736d-4e49-b3f3-68adcbfc8d0b-util" (OuterVolumeSpecName: "util") pod "5da922a2-736d-4e49-b3f3-68adcbfc8d0b" (UID: "5da922a2-736d-4e49-b3f3-68adcbfc8d0b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:38:44 crc kubenswrapper[4781]: I1202 09:38:44.114774 4781 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5da922a2-736d-4e49-b3f3-68adcbfc8d0b-util\") on node \"crc\" DevicePath \"\"" Dec 02 09:38:44 crc kubenswrapper[4781]: I1202 09:38:44.114815 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkhc8\" (UniqueName: \"kubernetes.io/projected/5da922a2-736d-4e49-b3f3-68adcbfc8d0b-kube-api-access-pkhc8\") on node \"crc\" DevicePath \"\"" Dec 02 09:38:44 crc kubenswrapper[4781]: I1202 09:38:44.557543 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/fdef1de09061a943cf0eb5556b972c2f56b8ba493ca7b87b57c4549318f7mcl" event={"ID":"5da922a2-736d-4e49-b3f3-68adcbfc8d0b","Type":"ContainerDied","Data":"c62310f7381b4b9617cf243183b93de204eff0c60dee1d346ab15afe7ba57414"} Dec 02 09:38:44 crc kubenswrapper[4781]: I1202 09:38:44.557579 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c62310f7381b4b9617cf243183b93de204eff0c60dee1d346ab15afe7ba57414" Dec 02 09:38:44 crc kubenswrapper[4781]: I1202 09:38:44.557593 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/fdef1de09061a943cf0eb5556b972c2f56b8ba493ca7b87b57c4549318f7mcl" Dec 02 09:38:51 crc kubenswrapper[4781]: I1202 09:38:51.157483 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-65bb796c9b-zrllk"] Dec 02 09:38:51 crc kubenswrapper[4781]: E1202 09:38:51.158261 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5da922a2-736d-4e49-b3f3-68adcbfc8d0b" containerName="pull" Dec 02 09:38:51 crc kubenswrapper[4781]: I1202 09:38:51.158275 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="5da922a2-736d-4e49-b3f3-68adcbfc8d0b" containerName="pull" Dec 02 09:38:51 crc kubenswrapper[4781]: E1202 09:38:51.158305 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5da922a2-736d-4e49-b3f3-68adcbfc8d0b" containerName="util" Dec 02 09:38:51 crc kubenswrapper[4781]: I1202 09:38:51.158312 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="5da922a2-736d-4e49-b3f3-68adcbfc8d0b" containerName="util" Dec 02 09:38:51 crc kubenswrapper[4781]: E1202 09:38:51.158323 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5da922a2-736d-4e49-b3f3-68adcbfc8d0b" containerName="extract" Dec 02 09:38:51 crc kubenswrapper[4781]: I1202 09:38:51.158332 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="5da922a2-736d-4e49-b3f3-68adcbfc8d0b" containerName="extract" Dec 02 09:38:51 crc kubenswrapper[4781]: I1202 09:38:51.158459 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="5da922a2-736d-4e49-b3f3-68adcbfc8d0b" containerName="extract" Dec 02 09:38:51 crc kubenswrapper[4781]: I1202 09:38:51.158976 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-65bb796c9b-zrllk" Dec 02 09:38:51 crc kubenswrapper[4781]: I1202 09:38:51.160891 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-n28cd" Dec 02 09:38:51 crc kubenswrapper[4781]: I1202 09:38:51.177422 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-65bb796c9b-zrllk"] Dec 02 09:38:51 crc kubenswrapper[4781]: I1202 09:38:51.210203 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4ftq\" (UniqueName: \"kubernetes.io/projected/09f22b82-ec27-4398-b843-8be7661ed03a-kube-api-access-k4ftq\") pod \"openstack-operator-controller-operator-65bb796c9b-zrllk\" (UID: \"09f22b82-ec27-4398-b843-8be7661ed03a\") " pod="openstack-operators/openstack-operator-controller-operator-65bb796c9b-zrllk" Dec 02 09:38:51 crc kubenswrapper[4781]: I1202 09:38:51.311964 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4ftq\" (UniqueName: \"kubernetes.io/projected/09f22b82-ec27-4398-b843-8be7661ed03a-kube-api-access-k4ftq\") pod \"openstack-operator-controller-operator-65bb796c9b-zrllk\" (UID: \"09f22b82-ec27-4398-b843-8be7661ed03a\") " pod="openstack-operators/openstack-operator-controller-operator-65bb796c9b-zrllk" Dec 02 09:38:51 crc kubenswrapper[4781]: I1202 09:38:51.334211 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4ftq\" (UniqueName: \"kubernetes.io/projected/09f22b82-ec27-4398-b843-8be7661ed03a-kube-api-access-k4ftq\") pod \"openstack-operator-controller-operator-65bb796c9b-zrllk\" (UID: \"09f22b82-ec27-4398-b843-8be7661ed03a\") " pod="openstack-operators/openstack-operator-controller-operator-65bb796c9b-zrllk" Dec 02 09:38:51 crc kubenswrapper[4781]: I1202 09:38:51.476419 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-65bb796c9b-zrllk" Dec 02 09:38:52 crc kubenswrapper[4781]: I1202 09:38:52.004507 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-65bb796c9b-zrllk"] Dec 02 09:38:52 crc kubenswrapper[4781]: I1202 09:38:52.609034 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-65bb796c9b-zrllk" event={"ID":"09f22b82-ec27-4398-b843-8be7661ed03a","Type":"ContainerStarted","Data":"aeea4ba323d9a49056e2999659f89f4679a5fa3aa3a50aeaf6a4b8ab84c8ba52"} Dec 02 09:38:57 crc kubenswrapper[4781]: I1202 09:38:57.649112 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-65bb796c9b-zrllk" event={"ID":"09f22b82-ec27-4398-b843-8be7661ed03a","Type":"ContainerStarted","Data":"f6eb401955311d8677c6024c0ce26415f063cc0eac44fc93c7bfb3371f572a48"} Dec 02 09:38:57 crc kubenswrapper[4781]: I1202 09:38:57.649623 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-65bb796c9b-zrllk" Dec 02 09:38:57 crc kubenswrapper[4781]: I1202 09:38:57.672210 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-65bb796c9b-zrllk" podStartSLOduration=2.148682664 podStartE2EDuration="6.672191919s" podCreationTimestamp="2025-12-02 09:38:51 +0000 UTC" firstStartedPulling="2025-12-02 09:38:52.015058241 +0000 UTC m=+1094.838932120" lastFinishedPulling="2025-12-02 09:38:56.538567496 +0000 UTC m=+1099.362441375" observedRunningTime="2025-12-02 09:38:57.670608977 +0000 UTC m=+1100.494482866" watchObservedRunningTime="2025-12-02 09:38:57.672191919 +0000 UTC m=+1100.496065798" Dec 02 09:39:11 crc kubenswrapper[4781]: I1202 09:39:11.480011 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-65bb796c9b-zrllk" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.022120 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-8tkh8"] Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.023480 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-8tkh8" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.026162 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-2tgdd" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.034510 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-8tkh8"] Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.057012 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-7q6gr"] Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.057905 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-7q6gr" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.060029 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-hwxdh" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.065489 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-jc6cp"] Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.066375 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-jc6cp" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.071438 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-nft6v" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.077769 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-7q6gr"] Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.084026 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-dz7tw"] Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.085284 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-dz7tw" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.090512 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-jc6cp"] Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.093432 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-xxdpv" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.097351 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-wcvq5"] Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.099015 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-wcvq5" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.102501 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-xj5md" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.114905 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gplp6"] Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.116565 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gplp6" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.119081 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-2h9bq" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.126466 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-lwgs7"] Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.128040 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-lwgs7" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.131876 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-8qlgv" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.132104 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.140848 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-wcvq5"] Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.163749 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-dz7tw"] Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.171433 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-qpdqt"] Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.172697 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-qpdqt" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.175389 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-7wqdw" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.180215 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6khk\" (UniqueName: \"kubernetes.io/projected/0a36fb64-e101-44af-a6f9-91fb68fc1e7a-kube-api-access-t6khk\") pod \"heat-operator-controller-manager-5f64f6f8bb-wcvq5\" (UID: \"0a36fb64-e101-44af-a6f9-91fb68fc1e7a\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-wcvq5" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.180293 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5ttw\" (UniqueName: \"kubernetes.io/projected/15c8fb6a-82b4-4e87-ae9c-6efc2afc5c89-kube-api-access-g5ttw\") pod \"horizon-operator-controller-manager-68c6d99b8f-gplp6\" (UID: \"15c8fb6a-82b4-4e87-ae9c-6efc2afc5c89\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gplp6" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.180333 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpb7x\" (UniqueName: \"kubernetes.io/projected/eb3d207d-118c-42b8-9e9a-103a041a44b3-kube-api-access-wpb7x\") pod \"designate-operator-controller-manager-78b4bc895b-7q6gr\" (UID: \"eb3d207d-118c-42b8-9e9a-103a041a44b3\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-7q6gr" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.180358 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmhwg\" (UniqueName: \"kubernetes.io/projected/de57f174-daf9-483d-bac6-e735d25f9d64-kube-api-access-nmhwg\") pod \"glance-operator-controller-manager-77987cd8cd-dz7tw\" (UID: \"de57f174-daf9-483d-bac6-e735d25f9d64\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-dz7tw" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.180386 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrh69\" (UniqueName: \"kubernetes.io/projected/f72fc870-291d-4800-a316-22de56b2ebbd-kube-api-access-wrh69\") pod \"barbican-operator-controller-manager-7d9dfd778-8tkh8\" (UID: \"f72fc870-291d-4800-a316-22de56b2ebbd\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-8tkh8" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.180426 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcdz4\" (UniqueName: \"kubernetes.io/projected/812edfc0-b0b7-40c7-913d-b176bd6817f3-kube-api-access-bcdz4\") pod \"cinder-operator-controller-manager-859b6ccc6-jc6cp\" (UID: \"812edfc0-b0b7-40c7-913d-b176bd6817f3\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-jc6cp" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.180648 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gplp6"] Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.192596 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-lwgs7"] Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.198197 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-qpdqt"] Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.207028 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-8mm6f"] Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.208319 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-8mm6f" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.213648 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-rj6w2" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.231173 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-8mm6f"] Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.262026 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-5lhcx"] Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.266626 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-5lhcx" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.278892 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-ftwwc" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.286915 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrh69\" (UniqueName: \"kubernetes.io/projected/f72fc870-291d-4800-a316-22de56b2ebbd-kube-api-access-wrh69\") pod \"barbican-operator-controller-manager-7d9dfd778-8tkh8\" (UID: \"f72fc870-291d-4800-a316-22de56b2ebbd\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-8tkh8" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.287190 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6kg9\" (UniqueName: \"kubernetes.io/projected/9d08e6b1-b9b9-4a7e-a859-e98f904e2588-kube-api-access-b6kg9\") pod \"keystone-operator-controller-manager-7765d96ddf-8mm6f\" (UID: \"9d08e6b1-b9b9-4a7e-a859-e98f904e2588\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-8mm6f" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.287303 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf3e832e-6140-4880-9efd-017837fc9990-cert\") pod \"infra-operator-controller-manager-57548d458d-lwgs7\" (UID: \"cf3e832e-6140-4880-9efd-017837fc9990\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-lwgs7" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.287395 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcdz4\" (UniqueName: \"kubernetes.io/projected/812edfc0-b0b7-40c7-913d-b176bd6817f3-kube-api-access-bcdz4\") pod \"cinder-operator-controller-manager-859b6ccc6-jc6cp\" (UID: \"812edfc0-b0b7-40c7-913d-b176bd6817f3\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-jc6cp" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.287689 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6khk\" (UniqueName: \"kubernetes.io/projected/0a36fb64-e101-44af-a6f9-91fb68fc1e7a-kube-api-access-t6khk\") pod \"heat-operator-controller-manager-5f64f6f8bb-wcvq5\" (UID: \"0a36fb64-e101-44af-a6f9-91fb68fc1e7a\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-wcvq5" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.287817 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5ttw\" (UniqueName: \"kubernetes.io/projected/15c8fb6a-82b4-4e87-ae9c-6efc2afc5c89-kube-api-access-g5ttw\") pod \"horizon-operator-controller-manager-68c6d99b8f-gplp6\" (UID: \"15c8fb6a-82b4-4e87-ae9c-6efc2afc5c89\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gplp6" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.287911 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2knr\" (UniqueName: \"kubernetes.io/projected/428e69aa-23f5-4d45-8c18-65ac62c6756c-kube-api-access-r2knr\") pod \"ironic-operator-controller-manager-6c548fd776-qpdqt\" (UID: \"428e69aa-23f5-4d45-8c18-65ac62c6756c\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-qpdqt" Dec 02 
09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.288012 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpb7x\" (UniqueName: \"kubernetes.io/projected/eb3d207d-118c-42b8-9e9a-103a041a44b3-kube-api-access-wpb7x\") pod \"designate-operator-controller-manager-78b4bc895b-7q6gr\" (UID: \"eb3d207d-118c-42b8-9e9a-103a041a44b3\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-7q6gr" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.288089 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmhwg\" (UniqueName: \"kubernetes.io/projected/de57f174-daf9-483d-bac6-e735d25f9d64-kube-api-access-nmhwg\") pod \"glance-operator-controller-manager-77987cd8cd-dz7tw\" (UID: \"de57f174-daf9-483d-bac6-e735d25f9d64\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-dz7tw" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.288173 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqwzz\" (UniqueName: \"kubernetes.io/projected/cf3e832e-6140-4880-9efd-017837fc9990-kube-api-access-bqwzz\") pod \"infra-operator-controller-manager-57548d458d-lwgs7\" (UID: \"cf3e832e-6140-4880-9efd-017837fc9990\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-lwgs7" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.300636 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-5lhcx"] Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.320185 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5ttw\" (UniqueName: \"kubernetes.io/projected/15c8fb6a-82b4-4e87-ae9c-6efc2afc5c89-kube-api-access-g5ttw\") pod \"horizon-operator-controller-manager-68c6d99b8f-gplp6\" (UID: \"15c8fb6a-82b4-4e87-ae9c-6efc2afc5c89\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gplp6" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.322185 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmhwg\" (UniqueName: \"kubernetes.io/projected/de57f174-daf9-483d-bac6-e735d25f9d64-kube-api-access-nmhwg\") pod \"glance-operator-controller-manager-77987cd8cd-dz7tw\" (UID: \"de57f174-daf9-483d-bac6-e735d25f9d64\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-dz7tw" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.323237 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6khk\" (UniqueName: \"kubernetes.io/projected/0a36fb64-e101-44af-a6f9-91fb68fc1e7a-kube-api-access-t6khk\") pod \"heat-operator-controller-manager-5f64f6f8bb-wcvq5\" (UID: \"0a36fb64-e101-44af-a6f9-91fb68fc1e7a\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-wcvq5" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.324190 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcdz4\" (UniqueName: \"kubernetes.io/projected/812edfc0-b0b7-40c7-913d-b176bd6817f3-kube-api-access-bcdz4\") pod \"cinder-operator-controller-manager-859b6ccc6-jc6cp\" (UID: \"812edfc0-b0b7-40c7-913d-b176bd6817f3\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-jc6cp" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.333051 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpb7x\" 
(UniqueName: \"kubernetes.io/projected/eb3d207d-118c-42b8-9e9a-103a041a44b3-kube-api-access-wpb7x\") pod \"designate-operator-controller-manager-78b4bc895b-7q6gr\" (UID: \"eb3d207d-118c-42b8-9e9a-103a041a44b3\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-7q6gr" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.333300 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-8xclz"] Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.334320 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-8xclz" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.340040 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-rzzdr" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.348442 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrh69\" (UniqueName: \"kubernetes.io/projected/f72fc870-291d-4800-a316-22de56b2ebbd-kube-api-access-wrh69\") pod \"barbican-operator-controller-manager-7d9dfd778-8tkh8\" (UID: \"f72fc870-291d-4800-a316-22de56b2ebbd\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-8tkh8" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.349263 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-8tkh8" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.360188 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-rf4bw"] Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.361331 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-rf4bw" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.369234 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-c647v" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.373231 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-8xclz"] Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.384169 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-7q6gr" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.384449 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-rf4bw"] Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.390882 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2knr\" (UniqueName: \"kubernetes.io/projected/428e69aa-23f5-4d45-8c18-65ac62c6756c-kube-api-access-r2knr\") pod \"ironic-operator-controller-manager-6c548fd776-qpdqt\" (UID: \"428e69aa-23f5-4d45-8c18-65ac62c6756c\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-qpdqt" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.390985 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqwzz\" (UniqueName: \"kubernetes.io/projected/cf3e832e-6140-4880-9efd-017837fc9990-kube-api-access-bqwzz\") pod \"infra-operator-controller-manager-57548d458d-lwgs7\" (UID: \"cf3e832e-6140-4880-9efd-017837fc9990\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-lwgs7" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.391014 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxddd\" (UniqueName: \"kubernetes.io/projected/f1be99ff-4068-4454-b75d-770951e9fedd-kube-api-access-bxddd\") pod \"manila-operator-controller-manager-7c79b5df47-5lhcx\" (UID: \"f1be99ff-4068-4454-b75d-770951e9fedd\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-5lhcx" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.391041 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6kg9\" (UniqueName: \"kubernetes.io/projected/9d08e6b1-b9b9-4a7e-a859-e98f904e2588-kube-api-access-b6kg9\") pod \"keystone-operator-controller-manager-7765d96ddf-8mm6f\" (UID: \"9d08e6b1-b9b9-4a7e-a859-e98f904e2588\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-8mm6f" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.391058 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf3e832e-6140-4880-9efd-017837fc9990-cert\") pod \"infra-operator-controller-manager-57548d458d-lwgs7\" (UID: \"cf3e832e-6140-4880-9efd-017837fc9990\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-lwgs7" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.391078 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtbkh\" (UniqueName: \"kubernetes.io/projected/c917a8ec-2bd5-4f7b-8948-a4bed859e01f-kube-api-access-gtbkh\") pod \"mariadb-operator-controller-manager-56bbcc9d85-8xclz\" (UID: \"c917a8ec-2bd5-4f7b-8948-a4bed859e01f\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-8xclz" Dec 02 09:39:31 crc kubenswrapper[4781]: E1202 09:39:31.391674 4781 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 02 09:39:31 crc kubenswrapper[4781]: E1202 09:39:31.391716 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf3e832e-6140-4880-9efd-017837fc9990-cert podName:cf3e832e-6140-4880-9efd-017837fc9990 nodeName:}" failed. 
No retries permitted until 2025-12-02 09:39:31.891699827 +0000 UTC m=+1134.715573706 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cf3e832e-6140-4880-9efd-017837fc9990-cert") pod "infra-operator-controller-manager-57548d458d-lwgs7" (UID: "cf3e832e-6140-4880-9efd-017837fc9990") : secret "infra-operator-webhook-server-cert" not found Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.409830 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-jc6cp" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.418544 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-dz7tw" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.418783 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-chbnm"] Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.420083 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-chbnm" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.423269 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-dzl6z" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.425712 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2knr\" (UniqueName: \"kubernetes.io/projected/428e69aa-23f5-4d45-8c18-65ac62c6756c-kube-api-access-r2knr\") pod \"ironic-operator-controller-manager-6c548fd776-qpdqt\" (UID: \"428e69aa-23f5-4d45-8c18-65ac62c6756c\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-qpdqt" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.426196 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqwzz\" (UniqueName: \"kubernetes.io/projected/cf3e832e-6140-4880-9efd-017837fc9990-kube-api-access-bqwzz\") pod \"infra-operator-controller-manager-57548d458d-lwgs7\" (UID: \"cf3e832e-6140-4880-9efd-017837fc9990\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-lwgs7" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.431493 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6kg9\" (UniqueName: \"kubernetes.io/projected/9d08e6b1-b9b9-4a7e-a859-e98f904e2588-kube-api-access-b6kg9\") pod \"keystone-operator-controller-manager-7765d96ddf-8mm6f\" (UID: \"9d08e6b1-b9b9-4a7e-a859-e98f904e2588\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-8mm6f" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.436974 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-pkkst"] Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.437395 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-wcvq5" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.437984 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-pkkst" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.442336 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-8vb2f" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.454740 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-chbnm"] Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.454945 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gplp6" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.463727 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-pkkst"] Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.481890 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-fq4g5"] Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.482846 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-fq4g5" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.485690 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-2dbxg" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.487187 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-qpdqt" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.495005 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtbkh\" (UniqueName: \"kubernetes.io/projected/c917a8ec-2bd5-4f7b-8948-a4bed859e01f-kube-api-access-gtbkh\") pod \"mariadb-operator-controller-manager-56bbcc9d85-8xclz\" (UID: \"c917a8ec-2bd5-4f7b-8948-a4bed859e01f\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-8xclz" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.495093 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksmf8\" (UniqueName: \"kubernetes.io/projected/bbcf3ac8-e087-4a1c-b9f8-2263e15a73ff-kube-api-access-ksmf8\") pod \"octavia-operator-controller-manager-998648c74-pkkst\" (UID: \"bbcf3ac8-e087-4a1c-b9f8-2263e15a73ff\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-pkkst" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.495156 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdlhr\" (UniqueName: \"kubernetes.io/projected/7b8c261d-133c-4a73-9424-3233e6701fff-kube-api-access-sdlhr\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-rf4bw\" (UID: \"7b8c261d-133c-4a73-9424-3233e6701fff\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-rf4bw" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.495184 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxddd\" (UniqueName: \"kubernetes.io/projected/f1be99ff-4068-4454-b75d-770951e9fedd-kube-api-access-bxddd\") pod \"manila-operator-controller-manager-7c79b5df47-5lhcx\" (UID: \"f1be99ff-4068-4454-b75d-770951e9fedd\") " 
pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-5lhcx" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.495216 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpl62\" (UniqueName: \"kubernetes.io/projected/71cfd08b-278c-4f9c-b0fd-198c662ef00d-kube-api-access-kpl62\") pod \"nova-operator-controller-manager-697bc559fc-chbnm\" (UID: \"71cfd08b-278c-4f9c-b0fd-198c662ef00d\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-chbnm" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.500481 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-fq4g5"] Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.528612 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtbkh\" (UniqueName: \"kubernetes.io/projected/c917a8ec-2bd5-4f7b-8948-a4bed859e01f-kube-api-access-gtbkh\") pod \"mariadb-operator-controller-manager-56bbcc9d85-8xclz\" (UID: \"c917a8ec-2bd5-4f7b-8948-a4bed859e01f\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-8xclz" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.530203 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxddd\" (UniqueName: \"kubernetes.io/projected/f1be99ff-4068-4454-b75d-770951e9fedd-kube-api-access-bxddd\") pod \"manila-operator-controller-manager-7c79b5df47-5lhcx\" (UID: \"f1be99ff-4068-4454-b75d-770951e9fedd\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-5lhcx" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.541143 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-8mm6f" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.541284 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4rgl7t"] Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.551072 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-8rtwl"] Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.551402 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4rgl7t" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.552365 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-68mk9"] Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.553117 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4rgl7t"] Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.553134 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-8rtwl"] Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.553151 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-68mk9"] Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.553218 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-68mk9" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.556454 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-8rtwl" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.568383 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-xtks6"] Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.569463 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-xtks6"] Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.569539 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-xtks6" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.583409 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-sgx5j" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.583656 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-jng2c" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.583770 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.583892 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-nbtqj" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.584119 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-crjp5" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.584212 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-5d9tj"] Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.585251 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-5d9tj" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.589469 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-5d9tj"] Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.595175 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-qbch7" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.595837 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxq96\" (UniqueName: \"kubernetes.io/projected/cda1cc86-51ab-4070-96e9-98adba5d51c3-kube-api-access-pxq96\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4rgl7t\" (UID: \"cda1cc86-51ab-4070-96e9-98adba5d51c3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4rgl7t" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.595871 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdlhr\" (UniqueName: \"kubernetes.io/projected/7b8c261d-133c-4a73-9424-3233e6701fff-kube-api-access-sdlhr\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-rf4bw\" (UID: \"7b8c261d-133c-4a73-9424-3233e6701fff\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-rf4bw" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.595895 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cda1cc86-51ab-4070-96e9-98adba5d51c3-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4rgl7t\" (UID: \"cda1cc86-51ab-4070-96e9-98adba5d51c3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4rgl7t" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.595923 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpl62\" (UniqueName: \"kubernetes.io/projected/71cfd08b-278c-4f9c-b0fd-198c662ef00d-kube-api-access-kpl62\") pod \"nova-operator-controller-manager-697bc559fc-chbnm\" (UID: \"71cfd08b-278c-4f9c-b0fd-198c662ef00d\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-chbnm" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.596013 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksmf8\" (UniqueName: \"kubernetes.io/projected/bbcf3ac8-e087-4a1c-b9f8-2263e15a73ff-kube-api-access-ksmf8\") pod \"octavia-operator-controller-manager-998648c74-pkkst\" (UID: \"bbcf3ac8-e087-4a1c-b9f8-2263e15a73ff\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-pkkst" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.596033 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82dv8\" (UniqueName: \"kubernetes.io/projected/22dc2882-6dfd-4f1c-90ee-06ac4c9e0aa5-kube-api-access-82dv8\") pod \"swift-operator-controller-manager-5f8c65bbfc-68mk9\" (UID: \"22dc2882-6dfd-4f1c-90ee-06ac4c9e0aa5\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-68mk9" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.596078 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vvcf\" (UniqueName: 
\"kubernetes.io/projected/15aa56a1-c9d3-4e48-a0fe-19e593320728-kube-api-access-2vvcf\") pod \"ovn-operator-controller-manager-b6456fdb6-fq4g5\" (UID: \"15aa56a1-c9d3-4e48-a0fe-19e593320728\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-fq4g5" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.596094 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dntk6\" (UniqueName: \"kubernetes.io/projected/ca4903e5-bed6-47c2-82a5-4376b162ec96-kube-api-access-dntk6\") pod \"placement-operator-controller-manager-78f8948974-8rtwl\" (UID: \"ca4903e5-bed6-47c2-82a5-4376b162ec96\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-8rtwl" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.615975 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-5lhcx" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.627833 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksmf8\" (UniqueName: \"kubernetes.io/projected/bbcf3ac8-e087-4a1c-b9f8-2263e15a73ff-kube-api-access-ksmf8\") pod \"octavia-operator-controller-manager-998648c74-pkkst\" (UID: \"bbcf3ac8-e087-4a1c-b9f8-2263e15a73ff\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-pkkst" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.628183 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpl62\" (UniqueName: \"kubernetes.io/projected/71cfd08b-278c-4f9c-b0fd-198c662ef00d-kube-api-access-kpl62\") pod \"nova-operator-controller-manager-697bc559fc-chbnm\" (UID: \"71cfd08b-278c-4f9c-b0fd-198c662ef00d\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-chbnm" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.683642 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdlhr\" (UniqueName: \"kubernetes.io/projected/7b8c261d-133c-4a73-9424-3233e6701fff-kube-api-access-sdlhr\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-rf4bw\" (UID: \"7b8c261d-133c-4a73-9424-3233e6701fff\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-rf4bw" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.703489 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vvcf\" (UniqueName: \"kubernetes.io/projected/15aa56a1-c9d3-4e48-a0fe-19e593320728-kube-api-access-2vvcf\") pod \"ovn-operator-controller-manager-b6456fdb6-fq4g5\" (UID: \"15aa56a1-c9d3-4e48-a0fe-19e593320728\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-fq4g5" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.703544 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dntk6\" (UniqueName: \"kubernetes.io/projected/ca4903e5-bed6-47c2-82a5-4376b162ec96-kube-api-access-dntk6\") pod \"placement-operator-controller-manager-78f8948974-8rtwl\" (UID: \"ca4903e5-bed6-47c2-82a5-4376b162ec96\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-8rtwl" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.703578 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxq96\" (UniqueName: \"kubernetes.io/projected/cda1cc86-51ab-4070-96e9-98adba5d51c3-kube-api-access-pxq96\") pod 
\"openstack-baremetal-operator-controller-manager-64bc77cfd4rgl7t\" (UID: \"cda1cc86-51ab-4070-96e9-98adba5d51c3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4rgl7t" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.703613 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cda1cc86-51ab-4070-96e9-98adba5d51c3-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4rgl7t\" (UID: \"cda1cc86-51ab-4070-96e9-98adba5d51c3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4rgl7t" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.703667 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bksj6\" (UniqueName: \"kubernetes.io/projected/1c5a954b-d61b-4d33-a043-407f8de059a6-kube-api-access-bksj6\") pod \"test-operator-controller-manager-5854674fcc-5d9tj\" (UID: \"1c5a954b-d61b-4d33-a043-407f8de059a6\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-5d9tj" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.703702 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk6l5\" (UniqueName: \"kubernetes.io/projected/cfd47a1e-a773-4479-9656-abb353f87fe9-kube-api-access-zk6l5\") pod \"telemetry-operator-controller-manager-76cc84c6bb-xtks6\" (UID: \"cfd47a1e-a773-4479-9656-abb353f87fe9\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-xtks6" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.703743 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82dv8\" (UniqueName: \"kubernetes.io/projected/22dc2882-6dfd-4f1c-90ee-06ac4c9e0aa5-kube-api-access-82dv8\") pod \"swift-operator-controller-manager-5f8c65bbfc-68mk9\" (UID: \"22dc2882-6dfd-4f1c-90ee-06ac4c9e0aa5\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-68mk9" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.722040 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-4dtmh"] Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.722487 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-chbnm" Dec 02 09:39:31 crc kubenswrapper[4781]: E1202 09:39:31.723049 4781 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 09:39:31 crc kubenswrapper[4781]: E1202 09:39:31.723102 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cda1cc86-51ab-4070-96e9-98adba5d51c3-cert podName:cda1cc86-51ab-4070-96e9-98adba5d51c3 nodeName:}" failed. No retries permitted until 2025-12-02 09:39:32.223084625 +0000 UTC m=+1135.046958504 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cda1cc86-51ab-4070-96e9-98adba5d51c3-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4rgl7t" (UID: "cda1cc86-51ab-4070-96e9-98adba5d51c3") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.729894 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-4dtmh" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.736046 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxq96\" (UniqueName: \"kubernetes.io/projected/cda1cc86-51ab-4070-96e9-98adba5d51c3-kube-api-access-pxq96\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4rgl7t\" (UID: \"cda1cc86-51ab-4070-96e9-98adba5d51c3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4rgl7t" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.744691 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dntk6\" (UniqueName: \"kubernetes.io/projected/ca4903e5-bed6-47c2-82a5-4376b162ec96-kube-api-access-dntk6\") pod \"placement-operator-controller-manager-78f8948974-8rtwl\" (UID: \"ca4903e5-bed6-47c2-82a5-4376b162ec96\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-8rtwl" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.745461 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-m2ftz" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.748853 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-pkkst" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.757218 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vvcf\" (UniqueName: \"kubernetes.io/projected/15aa56a1-c9d3-4e48-a0fe-19e593320728-kube-api-access-2vvcf\") pod \"ovn-operator-controller-manager-b6456fdb6-fq4g5\" (UID: \"15aa56a1-c9d3-4e48-a0fe-19e593320728\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-fq4g5" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.789395 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82dv8\" (UniqueName: \"kubernetes.io/projected/22dc2882-6dfd-4f1c-90ee-06ac4c9e0aa5-kube-api-access-82dv8\") pod \"swift-operator-controller-manager-5f8c65bbfc-68mk9\" (UID: \"22dc2882-6dfd-4f1c-90ee-06ac4c9e0aa5\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-68mk9" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.789622 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-8xclz" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.792270 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-fq4g5" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.796240 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-rf4bw" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.825469 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6lxv\" (UniqueName: \"kubernetes.io/projected/7f59a11b-4502-4c7b-94e7-3fbb6bac2222-kube-api-access-p6lxv\") pod \"watcher-operator-controller-manager-769dc69bc-4dtmh\" (UID: \"7f59a11b-4502-4c7b-94e7-3fbb6bac2222\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-4dtmh" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.825600 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bksj6\" (UniqueName: \"kubernetes.io/projected/1c5a954b-d61b-4d33-a043-407f8de059a6-kube-api-access-bksj6\") pod \"test-operator-controller-manager-5854674fcc-5d9tj\" (UID: \"1c5a954b-d61b-4d33-a043-407f8de059a6\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-5d9tj" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.825642 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk6l5\" (UniqueName: \"kubernetes.io/projected/cfd47a1e-a773-4479-9656-abb353f87fe9-kube-api-access-zk6l5\") pod \"telemetry-operator-controller-manager-76cc84c6bb-xtks6\" (UID: \"cfd47a1e-a773-4479-9656-abb353f87fe9\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-xtks6" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.894213 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-4dtmh"] Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.910363 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bksj6\" (UniqueName: \"kubernetes.io/projected/1c5a954b-d61b-4d33-a043-407f8de059a6-kube-api-access-bksj6\") pod \"test-operator-controller-manager-5854674fcc-5d9tj\" (UID: \"1c5a954b-d61b-4d33-a043-407f8de059a6\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-5d9tj" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.932118 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6lxv\" (UniqueName: \"kubernetes.io/projected/7f59a11b-4502-4c7b-94e7-3fbb6bac2222-kube-api-access-p6lxv\") pod \"watcher-operator-controller-manager-769dc69bc-4dtmh\" (UID: \"7f59a11b-4502-4c7b-94e7-3fbb6bac2222\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-4dtmh" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.932203 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf3e832e-6140-4880-9efd-017837fc9990-cert\") pod \"infra-operator-controller-manager-57548d458d-lwgs7\" (UID: \"cf3e832e-6140-4880-9efd-017837fc9990\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-lwgs7" Dec 02 09:39:31 crc kubenswrapper[4781]: E1202 09:39:31.932337 4781 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 02 09:39:31 crc kubenswrapper[4781]: E1202 09:39:31.932375 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf3e832e-6140-4880-9efd-017837fc9990-cert podName:cf3e832e-6140-4880-9efd-017837fc9990 nodeName:}" failed. 
No retries permitted until 2025-12-02 09:39:32.932361301 +0000 UTC m=+1135.756235180 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cf3e832e-6140-4880-9efd-017837fc9990-cert") pod "infra-operator-controller-manager-57548d458d-lwgs7" (UID: "cf3e832e-6140-4880-9efd-017837fc9990") : secret "infra-operator-webhook-server-cert" not found Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.933731 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk6l5\" (UniqueName: \"kubernetes.io/projected/cfd47a1e-a773-4479-9656-abb353f87fe9-kube-api-access-zk6l5\") pod \"telemetry-operator-controller-manager-76cc84c6bb-xtks6\" (UID: \"cfd47a1e-a773-4479-9656-abb353f87fe9\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-xtks6" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.979713 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-8rtwl" Dec 02 09:39:31 crc kubenswrapper[4781]: I1202 09:39:31.995593 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6lxv\" (UniqueName: \"kubernetes.io/projected/7f59a11b-4502-4c7b-94e7-3fbb6bac2222-kube-api-access-p6lxv\") pod \"watcher-operator-controller-manager-769dc69bc-4dtmh\" (UID: \"7f59a11b-4502-4c7b-94e7-3fbb6bac2222\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-4dtmh" Dec 02 09:39:32 crc kubenswrapper[4781]: I1202 09:39:32.013672 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-68mk9" Dec 02 09:39:32 crc kubenswrapper[4781]: I1202 09:39:32.070719 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-5d9tj" Dec 02 09:39:32 crc kubenswrapper[4781]: I1202 09:39:32.070847 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-4dtmh" Dec 02 09:39:32 crc kubenswrapper[4781]: I1202 09:39:32.076413 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-686764c46-54r7r"] Dec 02 09:39:32 crc kubenswrapper[4781]: I1202 09:39:32.078133 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-686764c46-54r7r" Dec 02 09:39:32 crc kubenswrapper[4781]: I1202 09:39:32.082749 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-686764c46-54r7r"] Dec 02 09:39:32 crc kubenswrapper[4781]: I1202 09:39:32.086251 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 02 09:39:32 crc kubenswrapper[4781]: I1202 09:39:32.086281 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 02 09:39:32 crc kubenswrapper[4781]: I1202 09:39:32.086573 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-p4qqd" Dec 02 09:39:32 crc kubenswrapper[4781]: I1202 09:39:32.090265 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7gsfd"] Dec 02 09:39:32 crc kubenswrapper[4781]: I1202 09:39:32.091157 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7gsfd" Dec 02 09:39:32 crc kubenswrapper[4781]: I1202 09:39:32.092616 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7gsfd"] Dec 02 09:39:32 crc kubenswrapper[4781]: I1202 09:39:32.096275 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-mg4wl" Dec 02 09:39:32 crc kubenswrapper[4781]: I1202 09:39:32.134749 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph4dr\" (UniqueName: \"kubernetes.io/projected/8c93d2b8-28d1-4ea7-86aa-fe3f8ac35e61-kube-api-access-ph4dr\") pod \"rabbitmq-cluster-operator-manager-668c99d594-7gsfd\" (UID: \"8c93d2b8-28d1-4ea7-86aa-fe3f8ac35e61\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7gsfd" Dec 02 09:39:32 crc kubenswrapper[4781]: I1202 09:39:32.135059 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm47n\" (UniqueName: \"kubernetes.io/projected/090bc2d1-e1c5-4721-80ab-e20d4f3942c6-kube-api-access-pm47n\") pod \"openstack-operator-controller-manager-686764c46-54r7r\" (UID: \"090bc2d1-e1c5-4721-80ab-e20d4f3942c6\") " pod="openstack-operators/openstack-operator-controller-manager-686764c46-54r7r" Dec 02 09:39:32 crc kubenswrapper[4781]: I1202 09:39:32.135089 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/090bc2d1-e1c5-4721-80ab-e20d4f3942c6-webhook-certs\") pod \"openstack-operator-controller-manager-686764c46-54r7r\" (UID: \"090bc2d1-e1c5-4721-80ab-e20d4f3942c6\") " pod="openstack-operators/openstack-operator-controller-manager-686764c46-54r7r" Dec 02 09:39:32 crc kubenswrapper[4781]: I1202 09:39:32.135112 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/090bc2d1-e1c5-4721-80ab-e20d4f3942c6-metrics-certs\") pod \"openstack-operator-controller-manager-686764c46-54r7r\" (UID: \"090bc2d1-e1c5-4721-80ab-e20d4f3942c6\") " 
pod="openstack-operators/openstack-operator-controller-manager-686764c46-54r7r" Dec 02 09:39:32 crc kubenswrapper[4781]: I1202 09:39:32.231958 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-xtks6" Dec 02 09:39:32 crc kubenswrapper[4781]: I1202 09:39:32.236050 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm47n\" (UniqueName: \"kubernetes.io/projected/090bc2d1-e1c5-4721-80ab-e20d4f3942c6-kube-api-access-pm47n\") pod \"openstack-operator-controller-manager-686764c46-54r7r\" (UID: \"090bc2d1-e1c5-4721-80ab-e20d4f3942c6\") " pod="openstack-operators/openstack-operator-controller-manager-686764c46-54r7r" Dec 02 09:39:32 crc kubenswrapper[4781]: I1202 09:39:32.236092 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/090bc2d1-e1c5-4721-80ab-e20d4f3942c6-webhook-certs\") pod \"openstack-operator-controller-manager-686764c46-54r7r\" (UID: \"090bc2d1-e1c5-4721-80ab-e20d4f3942c6\") " pod="openstack-operators/openstack-operator-controller-manager-686764c46-54r7r" Dec 02 09:39:32 crc kubenswrapper[4781]: I1202 09:39:32.236119 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/090bc2d1-e1c5-4721-80ab-e20d4f3942c6-metrics-certs\") pod \"openstack-operator-controller-manager-686764c46-54r7r\" (UID: \"090bc2d1-e1c5-4721-80ab-e20d4f3942c6\") " pod="openstack-operators/openstack-operator-controller-manager-686764c46-54r7r" Dec 02 09:39:32 crc kubenswrapper[4781]: I1202 09:39:32.236165 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph4dr\" (UniqueName: \"kubernetes.io/projected/8c93d2b8-28d1-4ea7-86aa-fe3f8ac35e61-kube-api-access-ph4dr\") pod \"rabbitmq-cluster-operator-manager-668c99d594-7gsfd\" (UID: \"8c93d2b8-28d1-4ea7-86aa-fe3f8ac35e61\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7gsfd" Dec 02 09:39:32 crc kubenswrapper[4781]: I1202 09:39:32.236194 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cda1cc86-51ab-4070-96e9-98adba5d51c3-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4rgl7t\" (UID: \"cda1cc86-51ab-4070-96e9-98adba5d51c3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4rgl7t" Dec 02 09:39:32 crc kubenswrapper[4781]: E1202 09:39:32.236305 4781 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 09:39:32 crc kubenswrapper[4781]: E1202 09:39:32.236346 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cda1cc86-51ab-4070-96e9-98adba5d51c3-cert podName:cda1cc86-51ab-4070-96e9-98adba5d51c3 nodeName:}" failed. No retries permitted until 2025-12-02 09:39:33.236333368 +0000 UTC m=+1136.060207247 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cda1cc86-51ab-4070-96e9-98adba5d51c3-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4rgl7t" (UID: "cda1cc86-51ab-4070-96e9-98adba5d51c3") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 09:39:32 crc kubenswrapper[4781]: E1202 09:39:32.236913 4781 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 02 09:39:32 crc kubenswrapper[4781]: E1202 09:39:32.237020 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/090bc2d1-e1c5-4721-80ab-e20d4f3942c6-metrics-certs podName:090bc2d1-e1c5-4721-80ab-e20d4f3942c6 nodeName:}" failed. No retries permitted until 2025-12-02 09:39:32.737006796 +0000 UTC m=+1135.560880675 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/090bc2d1-e1c5-4721-80ab-e20d4f3942c6-metrics-certs") pod "openstack-operator-controller-manager-686764c46-54r7r" (UID: "090bc2d1-e1c5-4721-80ab-e20d4f3942c6") : secret "metrics-server-cert" not found Dec 02 09:39:32 crc kubenswrapper[4781]: E1202 09:39:32.237065 4781 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 02 09:39:32 crc kubenswrapper[4781]: E1202 09:39:32.237092 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/090bc2d1-e1c5-4721-80ab-e20d4f3942c6-webhook-certs podName:090bc2d1-e1c5-4721-80ab-e20d4f3942c6 nodeName:}" failed. No retries permitted until 2025-12-02 09:39:32.737085748 +0000 UTC m=+1135.560959627 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/090bc2d1-e1c5-4721-80ab-e20d4f3942c6-webhook-certs") pod "openstack-operator-controller-manager-686764c46-54r7r" (UID: "090bc2d1-e1c5-4721-80ab-e20d4f3942c6") : secret "webhook-server-cert" not found Dec 02 09:39:32 crc kubenswrapper[4781]: I1202 09:39:32.264676 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph4dr\" (UniqueName: \"kubernetes.io/projected/8c93d2b8-28d1-4ea7-86aa-fe3f8ac35e61-kube-api-access-ph4dr\") pod \"rabbitmq-cluster-operator-manager-668c99d594-7gsfd\" (UID: \"8c93d2b8-28d1-4ea7-86aa-fe3f8ac35e61\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7gsfd" Dec 02 09:39:32 crc kubenswrapper[4781]: I1202 09:39:32.274627 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm47n\" (UniqueName: \"kubernetes.io/projected/090bc2d1-e1c5-4721-80ab-e20d4f3942c6-kube-api-access-pm47n\") pod \"openstack-operator-controller-manager-686764c46-54r7r\" (UID: \"090bc2d1-e1c5-4721-80ab-e20d4f3942c6\") " pod="openstack-operators/openstack-operator-controller-manager-686764c46-54r7r" Dec 02 09:39:32 crc kubenswrapper[4781]: I1202 09:39:32.276533 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-8tkh8"] Dec 02 09:39:32 crc kubenswrapper[4781]: I1202 09:39:32.359610 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 09:39:32 crc kubenswrapper[4781]: I1202 09:39:32.466298 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7gsfd" Dec 02 09:39:32 crc kubenswrapper[4781]: I1202 09:39:32.748057 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/090bc2d1-e1c5-4721-80ab-e20d4f3942c6-webhook-certs\") pod \"openstack-operator-controller-manager-686764c46-54r7r\" (UID: \"090bc2d1-e1c5-4721-80ab-e20d4f3942c6\") " pod="openstack-operators/openstack-operator-controller-manager-686764c46-54r7r" Dec 02 09:39:32 crc kubenswrapper[4781]: I1202 09:39:32.748385 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/090bc2d1-e1c5-4721-80ab-e20d4f3942c6-metrics-certs\") pod \"openstack-operator-controller-manager-686764c46-54r7r\" (UID: \"090bc2d1-e1c5-4721-80ab-e20d4f3942c6\") " pod="openstack-operators/openstack-operator-controller-manager-686764c46-54r7r" Dec 02 09:39:32 crc kubenswrapper[4781]: E1202 09:39:32.748584 4781 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 02 09:39:32 crc kubenswrapper[4781]: E1202 09:39:32.748631 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/090bc2d1-e1c5-4721-80ab-e20d4f3942c6-metrics-certs podName:090bc2d1-e1c5-4721-80ab-e20d4f3942c6 nodeName:}" failed. No retries permitted until 2025-12-02 09:39:33.748617375 +0000 UTC m=+1136.572491254 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/090bc2d1-e1c5-4721-80ab-e20d4f3942c6-metrics-certs") pod "openstack-operator-controller-manager-686764c46-54r7r" (UID: "090bc2d1-e1c5-4721-80ab-e20d4f3942c6") : secret "metrics-server-cert" not found Dec 02 09:39:32 crc kubenswrapper[4781]: E1202 09:39:32.749479 4781 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 02 09:39:32 crc kubenswrapper[4781]: E1202 09:39:32.749516 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/090bc2d1-e1c5-4721-80ab-e20d4f3942c6-webhook-certs podName:090bc2d1-e1c5-4721-80ab-e20d4f3942c6 nodeName:}" failed. No retries permitted until 2025-12-02 09:39:33.749507779 +0000 UTC m=+1136.573381658 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/090bc2d1-e1c5-4721-80ab-e20d4f3942c6-webhook-certs") pod "openstack-operator-controller-manager-686764c46-54r7r" (UID: "090bc2d1-e1c5-4721-80ab-e20d4f3942c6") : secret "webhook-server-cert" not found Dec 02 09:39:32 crc kubenswrapper[4781]: I1202 09:39:32.820695 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-7q6gr"] Dec 02 09:39:32 crc kubenswrapper[4781]: I1202 09:39:32.951260 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf3e832e-6140-4880-9efd-017837fc9990-cert\") pod \"infra-operator-controller-manager-57548d458d-lwgs7\" (UID: \"cf3e832e-6140-4880-9efd-017837fc9990\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-lwgs7" Dec 02 09:39:32 crc kubenswrapper[4781]: E1202 09:39:32.951427 4781 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 02 09:39:32 crc kubenswrapper[4781]: E1202 09:39:32.951513 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf3e832e-6140-4880-9efd-017837fc9990-cert podName:cf3e832e-6140-4880-9efd-017837fc9990 nodeName:}" failed. No retries permitted until 2025-12-02 09:39:34.951496319 +0000 UTC m=+1137.775370198 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cf3e832e-6140-4880-9efd-017837fc9990-cert") pod "infra-operator-controller-manager-57548d458d-lwgs7" (UID: "cf3e832e-6140-4880-9efd-017837fc9990") : secret "infra-operator-webhook-server-cert" not found Dec 02 09:39:32 crc kubenswrapper[4781]: I1202 09:39:32.954210 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-8tkh8" event={"ID":"f72fc870-291d-4800-a316-22de56b2ebbd","Type":"ContainerStarted","Data":"a92c318dd051d042e88fa707f384080cc6ef9f66a6802e718e8a9d5979421d9b"} Dec 02 09:39:32 crc kubenswrapper[4781]: I1202 09:39:32.955296 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-7q6gr" event={"ID":"eb3d207d-118c-42b8-9e9a-103a041a44b3","Type":"ContainerStarted","Data":"7560676826e065b24f8ed079a6ccba801afe1adfa702f83e926581d8a2bcbd79"} Dec 02 09:39:32 crc kubenswrapper[4781]: I1202 09:39:32.982791 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-chbnm"] Dec 02 09:39:32 crc kubenswrapper[4781]: W1202 09:39:32.987190 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71cfd08b_278c_4f9c_b0fd_198c662ef00d.slice/crio-d7b3a94963e38832f6e9b85c9e03a057c71ddec32143ea6d06c76914adc5ba00 WatchSource:0}: Error finding container d7b3a94963e38832f6e9b85c9e03a057c71ddec32143ea6d06c76914adc5ba00: Status 404 returned error can't find the container with id d7b3a94963e38832f6e9b85c9e03a057c71ddec32143ea6d06c76914adc5ba00 Dec 02 09:39:32 crc kubenswrapper[4781]: I1202 09:39:32.995014 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-wcvq5"] Dec 02 09:39:33 crc kubenswrapper[4781]: W1202 09:39:33.005226 4781 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a36fb64_e101_44af_a6f9_91fb68fc1e7a.slice/crio-769b028f352adad7cf47bac2a106bf2af1e5067166004ff42c75b39d0bf72213 WatchSource:0}: Error finding container 769b028f352adad7cf47bac2a106bf2af1e5067166004ff42c75b39d0bf72213: Status 404 returned error can't find the container with id 769b028f352adad7cf47bac2a106bf2af1e5067166004ff42c75b39d0bf72213 Dec 02 09:39:33 crc kubenswrapper[4781]: I1202 09:39:33.006142 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-jc6cp"] Dec 02 09:39:33 crc kubenswrapper[4781]: W1202 09:39:33.013325 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde57f174_daf9_483d_bac6_e735d25f9d64.slice/crio-dd446d6594c7b4ff92f21e28c1ec81d96f7c7c944c84cbb8d5d8a7920bcb8851 WatchSource:0}: Error finding container dd446d6594c7b4ff92f21e28c1ec81d96f7c7c944c84cbb8d5d8a7920bcb8851: Status 404 returned error can't find the container with id dd446d6594c7b4ff92f21e28c1ec81d96f7c7c944c84cbb8d5d8a7920bcb8851 Dec 02 09:39:33 crc kubenswrapper[4781]: I1202 09:39:33.017762 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-dz7tw"] Dec 02 09:39:33 crc kubenswrapper[4781]: I1202 09:39:33.024625 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-qpdqt"] Dec 02 09:39:33 crc kubenswrapper[4781]: W1202 09:39:33.027041 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod428e69aa_23f5_4d45_8c18_65ac62c6756c.slice/crio-a5ddffd3f3f02ecab31ff84cf1e4faaa54b20009d9c377acd27b7deab00868b2 WatchSource:0}: Error finding container a5ddffd3f3f02ecab31ff84cf1e4faaa54b20009d9c377acd27b7deab00868b2: Status 404 returned error can't find the container with id a5ddffd3f3f02ecab31ff84cf1e4faaa54b20009d9c377acd27b7deab00868b2 Dec 02 09:39:33 crc kubenswrapper[4781]: W1202 09:39:33.174111 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d08e6b1_b9b9_4a7e_a859_e98f904e2588.slice/crio-45cd4ccb1458a378ff2f3cb82eb4d53f8a1cc22a6fd2e5a05ab3bbbe87a57cde WatchSource:0}: Error finding container 45cd4ccb1458a378ff2f3cb82eb4d53f8a1cc22a6fd2e5a05ab3bbbe87a57cde: Status 404 returned error can't find the container with id 45cd4ccb1458a378ff2f3cb82eb4d53f8a1cc22a6fd2e5a05ab3bbbe87a57cde Dec 02 09:39:33 crc kubenswrapper[4781]: I1202 09:39:33.174195 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-8mm6f"] Dec 02 09:39:33 crc kubenswrapper[4781]: W1202 09:39:33.251602 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1be99ff_4068_4454_b75d_770951e9fedd.slice/crio-a1946aacffc73fff550cb75aa937c8141eeeaa3e535dd817af9207b251670fd0 WatchSource:0}: Error finding container a1946aacffc73fff550cb75aa937c8141eeeaa3e535dd817af9207b251670fd0: Status 404 returned error can't find the container with id a1946aacffc73fff550cb75aa937c8141eeeaa3e535dd817af9207b251670fd0 Dec 02 09:39:33 crc kubenswrapper[4781]: W1202 09:39:33.252638 4781 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22dc2882_6dfd_4f1c_90ee_06ac4c9e0aa5.slice/crio-8a6f235a57edff78102c5b589abe5a31dfeb50888ff920bf52caf177898c75e1 WatchSource:0}: Error finding container 8a6f235a57edff78102c5b589abe5a31dfeb50888ff920bf52caf177898c75e1: Status 404 returned error can't find the container with id 8a6f235a57edff78102c5b589abe5a31dfeb50888ff920bf52caf177898c75e1 Dec 02 09:39:33 crc kubenswrapper[4781]: I1202 09:39:33.254840 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cda1cc86-51ab-4070-96e9-98adba5d51c3-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4rgl7t\" (UID: \"cda1cc86-51ab-4070-96e9-98adba5d51c3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4rgl7t" Dec 02 09:39:33 crc kubenswrapper[4781]: E1202 09:39:33.255106 4781 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 09:39:33 crc kubenswrapper[4781]: E1202 09:39:33.255168 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cda1cc86-51ab-4070-96e9-98adba5d51c3-cert podName:cda1cc86-51ab-4070-96e9-98adba5d51c3 nodeName:}" failed. No retries permitted until 2025-12-02 09:39:35.255150047 +0000 UTC m=+1138.079023926 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cda1cc86-51ab-4070-96e9-98adba5d51c3-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4rgl7t" (UID: "cda1cc86-51ab-4070-96e9-98adba5d51c3") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 09:39:33 crc kubenswrapper[4781]: I1202 09:39:33.272425 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-5lhcx"] Dec 02 09:39:33 crc kubenswrapper[4781]: W1202 09:39:33.287803 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbcf3ac8_e087_4a1c_b9f8_2263e15a73ff.slice/crio-a99728a28e9a8ef289443257fcec4c55202cd1c032a668063e36a3dc642bc585 WatchSource:0}: Error finding container a99728a28e9a8ef289443257fcec4c55202cd1c032a668063e36a3dc642bc585: Status 404 returned error can't find the container with id a99728a28e9a8ef289443257fcec4c55202cd1c032a668063e36a3dc642bc585 Dec 02 09:39:33 crc kubenswrapper[4781]: W1202 09:39:33.297116 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f59a11b_4502_4c7b_94e7_3fbb6bac2222.slice/crio-375ddd77def83abcdf8ab784c62c1001409e053a426ffdaea53f331f385287dd WatchSource:0}: Error finding container 375ddd77def83abcdf8ab784c62c1001409e053a426ffdaea53f331f385287dd: Status 404 returned error can't find the container with id 375ddd77def83abcdf8ab784c62c1001409e053a426ffdaea53f331f385287dd Dec 02 09:39:33 crc kubenswrapper[4781]: E1202 09:39:33.300018 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gtbkh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-56bbcc9d85-8xclz_openstack-operators(c917a8ec-2bd5-4f7b-8948-a4bed859e01f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 09:39:33 crc kubenswrapper[4781]: E1202 09:39:33.300116 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ksmf8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-pkkst_openstack-operators(bbcf3ac8-e087-4a1c-b9f8-2263e15a73ff): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 09:39:33 crc kubenswrapper[4781]: E1202 09:39:33.300219 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g5ttw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-68c6d99b8f-gplp6_openstack-operators(15c8fb6a-82b4-4e87-ae9c-6efc2afc5c89): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 09:39:33 crc kubenswrapper[4781]: E1202 09:39:33.300292 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p6lxv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-4dtmh_openstack-operators(7f59a11b-4502-4c7b-94e7-3fbb6bac2222): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 09:39:33 crc 
kubenswrapper[4781]: I1202 09:39:33.303550 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-8rtwl"] Dec 02 09:39:33 crc kubenswrapper[4781]: I1202 09:39:33.310344 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-68mk9"] Dec 02 09:39:33 crc kubenswrapper[4781]: E1202 09:39:33.310523 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p6lxv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-4dtmh_openstack-operators(7f59a11b-4502-4c7b-94e7-3fbb6bac2222): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 09:39:33 crc kubenswrapper[4781]: E1202 09:39:33.310603 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gtbkh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-56bbcc9d85-8xclz_openstack-operators(c917a8ec-2bd5-4f7b-8948-a4bed859e01f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 09:39:33 crc kubenswrapper[4781]: E1202 09:39:33.311955 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-8xclz" podUID="c917a8ec-2bd5-4f7b-8948-a4bed859e01f" Dec 02 09:39:33 crc kubenswrapper[4781]: E1202 09:39:33.312004 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-4dtmh" podUID="7f59a11b-4502-4c7b-94e7-3fbb6bac2222" Dec 02 09:39:33 crc kubenswrapper[4781]: I1202 09:39:33.314893 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gplp6"] Dec 02 09:39:33 crc kubenswrapper[4781]: W1202 09:39:33.315115 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfd47a1e_a773_4479_9656_abb353f87fe9.slice/crio-dc870eae60d5682a657a34db6980421f0897866456c7e2fdf213c289c42f916e WatchSource:0}: Error finding container dc870eae60d5682a657a34db6980421f0897866456c7e2fdf213c289c42f916e: Status 404 returned error can't find the container with id dc870eae60d5682a657a34db6980421f0897866456c7e2fdf213c289c42f916e Dec 02 09:39:33 crc kubenswrapper[4781]: E1202 09:39:33.315818 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g5ttw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-68c6d99b8f-gplp6_openstack-operators(15c8fb6a-82b4-4e87-ae9c-6efc2afc5c89): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 09:39:33 crc kubenswrapper[4781]: E1202 09:39:33.316866 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gplp6" podUID="15c8fb6a-82b4-4e87-ae9c-6efc2afc5c89" Dec 02 09:39:33 crc kubenswrapper[4781]: I1202 09:39:33.319818 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-rf4bw"] Dec 02 09:39:33 crc kubenswrapper[4781]: I1202 09:39:33.329785 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-8xclz"] Dec 02 09:39:33 crc kubenswrapper[4781]: E1202 09:39:33.330659 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ksmf8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-pkkst_openstack-operators(bbcf3ac8-e087-4a1c-b9f8-2263e15a73ff): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 09:39:33 crc kubenswrapper[4781]: E1202 09:39:33.331035 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2vvcf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-fq4g5_openstack-operators(15aa56a1-c9d3-4e48-a0fe-19e593320728): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 09:39:33 crc kubenswrapper[4781]: E1202 09:39:33.331953 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ph4dr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-7gsfd_openstack-operators(8c93d2b8-28d1-4ea7-86aa-fe3f8ac35e61): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 09:39:33 crc kubenswrapper[4781]: E1202 09:39:33.332049 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/octavia-operator-controller-manager-998648c74-pkkst" podUID="bbcf3ac8-e087-4a1c-b9f8-2263e15a73ff" Dec 02 09:39:33 crc kubenswrapper[4781]: E1202 09:39:33.333133 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7gsfd" podUID="8c93d2b8-28d1-4ea7-86aa-fe3f8ac35e61" Dec 02 09:39:33 crc kubenswrapper[4781]: E1202 09:39:33.336378 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2vvcf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-fq4g5_openstack-operators(15aa56a1-c9d3-4e48-a0fe-19e593320728): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 09:39:33 crc kubenswrapper[4781]: I1202 09:39:33.336993 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-pkkst"] Dec 02 09:39:33 crc kubenswrapper[4781]: E1202 09:39:33.338530 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zk6l5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-xtks6_openstack-operators(cfd47a1e-a773-4479-9656-abb353f87fe9): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 09:39:33 crc kubenswrapper[4781]: E1202 09:39:33.338343 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-fq4g5" podUID="15aa56a1-c9d3-4e48-a0fe-19e593320728" Dec 02 09:39:33 crc kubenswrapper[4781]: E1202 09:39:33.340494 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zk6l5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-xtks6_openstack-operators(cfd47a1e-a773-4479-9656-abb353f87fe9): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 09:39:33 crc kubenswrapper[4781]: E1202 09:39:33.342982 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-xtks6" 
podUID="cfd47a1e-a773-4479-9656-abb353f87fe9" Dec 02 09:39:33 crc kubenswrapper[4781]: I1202 09:39:33.345240 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-5d9tj"] Dec 02 09:39:33 crc kubenswrapper[4781]: I1202 09:39:33.350663 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-4dtmh"] Dec 02 09:39:33 crc kubenswrapper[4781]: I1202 09:39:33.356009 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-xtks6"] Dec 02 09:39:33 crc kubenswrapper[4781]: I1202 09:39:33.361037 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7gsfd"] Dec 02 09:39:33 crc kubenswrapper[4781]: I1202 09:39:33.365041 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-fq4g5"] Dec 02 09:39:33 crc kubenswrapper[4781]: I1202 09:39:33.765155 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/090bc2d1-e1c5-4721-80ab-e20d4f3942c6-webhook-certs\") pod \"openstack-operator-controller-manager-686764c46-54r7r\" (UID: \"090bc2d1-e1c5-4721-80ab-e20d4f3942c6\") " pod="openstack-operators/openstack-operator-controller-manager-686764c46-54r7r" Dec 02 09:39:33 crc kubenswrapper[4781]: E1202 09:39:33.765446 4781 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 02 09:39:33 crc kubenswrapper[4781]: I1202 09:39:33.765465 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/090bc2d1-e1c5-4721-80ab-e20d4f3942c6-metrics-certs\") pod \"openstack-operator-controller-manager-686764c46-54r7r\" (UID: \"090bc2d1-e1c5-4721-80ab-e20d4f3942c6\") " pod="openstack-operators/openstack-operator-controller-manager-686764c46-54r7r" Dec 02 09:39:33 crc kubenswrapper[4781]: E1202 09:39:33.765547 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/090bc2d1-e1c5-4721-80ab-e20d4f3942c6-webhook-certs podName:090bc2d1-e1c5-4721-80ab-e20d4f3942c6 nodeName:}" failed. No retries permitted until 2025-12-02 09:39:35.765516782 +0000 UTC m=+1138.589390701 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/090bc2d1-e1c5-4721-80ab-e20d4f3942c6-webhook-certs") pod "openstack-operator-controller-manager-686764c46-54r7r" (UID: "090bc2d1-e1c5-4721-80ab-e20d4f3942c6") : secret "webhook-server-cert" not found Dec 02 09:39:33 crc kubenswrapper[4781]: E1202 09:39:33.765598 4781 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 02 09:39:33 crc kubenswrapper[4781]: E1202 09:39:33.765656 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/090bc2d1-e1c5-4721-80ab-e20d4f3942c6-metrics-certs podName:090bc2d1-e1c5-4721-80ab-e20d4f3942c6 nodeName:}" failed. No retries permitted until 2025-12-02 09:39:35.765637015 +0000 UTC m=+1138.589510994 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/090bc2d1-e1c5-4721-80ab-e20d4f3942c6-metrics-certs") pod "openstack-operator-controller-manager-686764c46-54r7r" (UID: "090bc2d1-e1c5-4721-80ab-e20d4f3942c6") : secret "metrics-server-cert" not found Dec 02 09:39:33 crc kubenswrapper[4781]: I1202 09:39:33.961808 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7gsfd" event={"ID":"8c93d2b8-28d1-4ea7-86aa-fe3f8ac35e61","Type":"ContainerStarted","Data":"bcda06ac34dedccd47f958855bdbdf01ba2e132a0443fc79f6fc31d81d1cc194"} Dec 02 09:39:33 crc kubenswrapper[4781]: I1202 09:39:33.962963 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-68mk9" event={"ID":"22dc2882-6dfd-4f1c-90ee-06ac4c9e0aa5","Type":"ContainerStarted","Data":"8a6f235a57edff78102c5b589abe5a31dfeb50888ff920bf52caf177898c75e1"} Dec 02 09:39:33 crc kubenswrapper[4781]: E1202 09:39:33.963286 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7gsfd" podUID="8c93d2b8-28d1-4ea7-86aa-fe3f8ac35e61" Dec 02 09:39:33 crc kubenswrapper[4781]: I1202 09:39:33.965723 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-5d9tj" event={"ID":"1c5a954b-d61b-4d33-a043-407f8de059a6","Type":"ContainerStarted","Data":"415d163b05e9b2d559cffa65c8bf88f976a8a0163524448b0b3d89bd7d229ead"} Dec 02 09:39:33 crc kubenswrapper[4781]: I1202 09:39:33.967352 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gplp6" event={"ID":"15c8fb6a-82b4-4e87-ae9c-6efc2afc5c89","Type":"ContainerStarted","Data":"67427997fb508be9cdab78078c19a3fa21a7b2e99e1fb7bb90929edcdf4ea121"} Dec 02 09:39:33 crc kubenswrapper[4781]: E1202 09:39:33.969328 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gplp6" podUID="15c8fb6a-82b4-4e87-ae9c-6efc2afc5c89" Dec 02 09:39:33 crc kubenswrapper[4781]: I1202 09:39:33.969502 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-jc6cp" event={"ID":"812edfc0-b0b7-40c7-913d-b176bd6817f3","Type":"ContainerStarted","Data":"b9efe8b9cee0dc45102db80596e25f681d6c4a9815c6538009ac3720e5306208"} Dec 02 09:39:33 crc kubenswrapper[4781]: I1202 09:39:33.970663 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-rf4bw" event={"ID":"7b8c261d-133c-4a73-9424-3233e6701fff","Type":"ContainerStarted","Data":"0a84c63537d20173dfb8bfba8e3f2726346d62ecc93e0c9466def54b67213af8"} Dec 02 09:39:33 crc kubenswrapper[4781]: I1202 
09:39:33.972036 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-4dtmh" event={"ID":"7f59a11b-4502-4c7b-94e7-3fbb6bac2222","Type":"ContainerStarted","Data":"375ddd77def83abcdf8ab784c62c1001409e053a426ffdaea53f331f385287dd"} Dec 02 09:39:33 crc kubenswrapper[4781]: I1202 09:39:33.973695 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-dz7tw" event={"ID":"de57f174-daf9-483d-bac6-e735d25f9d64","Type":"ContainerStarted","Data":"dd446d6594c7b4ff92f21e28c1ec81d96f7c7c944c84cbb8d5d8a7920bcb8851"} Dec 02 09:39:33 crc kubenswrapper[4781]: E1202 09:39:33.974807 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-4dtmh" podUID="7f59a11b-4502-4c7b-94e7-3fbb6bac2222" Dec 02 09:39:33 crc kubenswrapper[4781]: I1202 09:39:33.975308 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-8rtwl" event={"ID":"ca4903e5-bed6-47c2-82a5-4376b162ec96","Type":"ContainerStarted","Data":"0657135b5399b96c9bd9b84e6c14d32b5120d65692c743e5f4e1c1d00b56f2c1"} Dec 02 09:39:33 crc kubenswrapper[4781]: I1202 09:39:33.976972 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-wcvq5" event={"ID":"0a36fb64-e101-44af-a6f9-91fb68fc1e7a","Type":"ContainerStarted","Data":"769b028f352adad7cf47bac2a106bf2af1e5067166004ff42c75b39d0bf72213"} Dec 02 09:39:33 crc kubenswrapper[4781]: I1202 09:39:33.981168 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-8mm6f" event={"ID":"9d08e6b1-b9b9-4a7e-a859-e98f904e2588","Type":"ContainerStarted","Data":"45cd4ccb1458a378ff2f3cb82eb4d53f8a1cc22a6fd2e5a05ab3bbbe87a57cde"} Dec 02 09:39:33 crc kubenswrapper[4781]: I1202 09:39:33.982299 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-qpdqt" event={"ID":"428e69aa-23f5-4d45-8c18-65ac62c6756c","Type":"ContainerStarted","Data":"a5ddffd3f3f02ecab31ff84cf1e4faaa54b20009d9c377acd27b7deab00868b2"} Dec 02 09:39:33 crc kubenswrapper[4781]: I1202 09:39:33.983435 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-5lhcx" event={"ID":"f1be99ff-4068-4454-b75d-770951e9fedd","Type":"ContainerStarted","Data":"a1946aacffc73fff550cb75aa937c8141eeeaa3e535dd817af9207b251670fd0"} Dec 02 09:39:33 crc kubenswrapper[4781]: I1202 09:39:33.986132 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-xtks6" event={"ID":"cfd47a1e-a773-4479-9656-abb353f87fe9","Type":"ContainerStarted","Data":"dc870eae60d5682a657a34db6980421f0897866456c7e2fdf213c289c42f916e"} Dec 02 09:39:33 crc kubenswrapper[4781]: I1202 09:39:33.988091 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/octavia-operator-controller-manager-998648c74-pkkst" event={"ID":"bbcf3ac8-e087-4a1c-b9f8-2263e15a73ff","Type":"ContainerStarted","Data":"a99728a28e9a8ef289443257fcec4c55202cd1c032a668063e36a3dc642bc585"} Dec 02 09:39:33 crc kubenswrapper[4781]: E1202 09:39:33.988424 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-xtks6" podUID="cfd47a1e-a773-4479-9656-abb353f87fe9" Dec 02 09:39:33 crc kubenswrapper[4781]: I1202 09:39:33.989464 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-chbnm" event={"ID":"71cfd08b-278c-4f9c-b0fd-198c662ef00d","Type":"ContainerStarted","Data":"d7b3a94963e38832f6e9b85c9e03a057c71ddec32143ea6d06c76914adc5ba00"} Dec 02 09:39:33 crc kubenswrapper[4781]: E1202 09:39:33.989673 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/octavia-operator-controller-manager-998648c74-pkkst" podUID="bbcf3ac8-e087-4a1c-b9f8-2263e15a73ff" Dec 02 09:39:33 crc kubenswrapper[4781]: I1202 09:39:33.990310 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-8xclz" event={"ID":"c917a8ec-2bd5-4f7b-8948-a4bed859e01f","Type":"ContainerStarted","Data":"0ce2fb85fad6a9e59e0dc0ef86c034a4d2e8348182d94648fb118492cfe0c5fd"} Dec 02 09:39:33 crc kubenswrapper[4781]: E1202 09:39:33.992399 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-8xclz" podUID="c917a8ec-2bd5-4f7b-8948-a4bed859e01f" Dec 02 09:39:33 crc kubenswrapper[4781]: I1202 09:39:33.992714 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-fq4g5" event={"ID":"15aa56a1-c9d3-4e48-a0fe-19e593320728","Type":"ContainerStarted","Data":"fba48c30691229d47f6b7f41b60b7b3bc029448080b5656a76e7324bc6688e33"} Dec 02 09:39:33 crc kubenswrapper[4781]: E1202 09:39:33.993916 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", 
failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-fq4g5" podUID="15aa56a1-c9d3-4e48-a0fe-19e593320728" Dec 02 09:39:34 crc kubenswrapper[4781]: I1202 09:39:34.980608 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf3e832e-6140-4880-9efd-017837fc9990-cert\") pod \"infra-operator-controller-manager-57548d458d-lwgs7\" (UID: \"cf3e832e-6140-4880-9efd-017837fc9990\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-lwgs7" Dec 02 09:39:34 crc kubenswrapper[4781]: E1202 09:39:34.980820 4781 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 02 09:39:34 crc kubenswrapper[4781]: E1202 09:39:34.980897 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf3e832e-6140-4880-9efd-017837fc9990-cert podName:cf3e832e-6140-4880-9efd-017837fc9990 nodeName:}" failed. No retries permitted until 2025-12-02 09:39:38.980880164 +0000 UTC m=+1141.804754043 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cf3e832e-6140-4880-9efd-017837fc9990-cert") pod "infra-operator-controller-manager-57548d458d-lwgs7" (UID: "cf3e832e-6140-4880-9efd-017837fc9990") : secret "infra-operator-webhook-server-cert" not found Dec 02 09:39:35 crc kubenswrapper[4781]: E1202 09:39:35.002326 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7gsfd" podUID="8c93d2b8-28d1-4ea7-86aa-fe3f8ac35e61" Dec 02 09:39:35 crc kubenswrapper[4781]: E1202 09:39:35.002487 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gplp6" podUID="15c8fb6a-82b4-4e87-ae9c-6efc2afc5c89" Dec 02 09:39:35 crc kubenswrapper[4781]: E1202 09:39:35.002599 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/octavia-operator-controller-manager-998648c74-pkkst" podUID="bbcf3ac8-e087-4a1c-b9f8-2263e15a73ff" Dec 02 09:39:35 crc kubenswrapper[4781]: E1202 09:39:35.002768 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off 
pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-xtks6" podUID="cfd47a1e-a773-4479-9656-abb353f87fe9" Dec 02 09:39:35 crc kubenswrapper[4781]: E1202 09:39:35.002849 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-8xclz" podUID="c917a8ec-2bd5-4f7b-8948-a4bed859e01f" Dec 02 09:39:35 crc kubenswrapper[4781]: E1202 09:39:35.002968 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-fq4g5" podUID="15aa56a1-c9d3-4e48-a0fe-19e593320728" Dec 02 09:39:35 crc kubenswrapper[4781]: E1202 09:39:35.002986 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-4dtmh" podUID="7f59a11b-4502-4c7b-94e7-3fbb6bac2222" Dec 02 09:39:35 crc kubenswrapper[4781]: I1202 09:39:35.284447 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cda1cc86-51ab-4070-96e9-98adba5d51c3-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4rgl7t\" (UID: \"cda1cc86-51ab-4070-96e9-98adba5d51c3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4rgl7t" Dec 02 09:39:35 crc kubenswrapper[4781]: E1202 09:39:35.284646 4781 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 09:39:35 crc kubenswrapper[4781]: E1202 09:39:35.284720 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cda1cc86-51ab-4070-96e9-98adba5d51c3-cert podName:cda1cc86-51ab-4070-96e9-98adba5d51c3 nodeName:}" failed. No retries permitted until 2025-12-02 09:39:39.284701626 +0000 UTC m=+1142.108575505 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cda1cc86-51ab-4070-96e9-98adba5d51c3-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4rgl7t" (UID: "cda1cc86-51ab-4070-96e9-98adba5d51c3") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 09:39:35 crc kubenswrapper[4781]: I1202 09:39:35.795136 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/090bc2d1-e1c5-4721-80ab-e20d4f3942c6-webhook-certs\") pod \"openstack-operator-controller-manager-686764c46-54r7r\" (UID: \"090bc2d1-e1c5-4721-80ab-e20d4f3942c6\") " pod="openstack-operators/openstack-operator-controller-manager-686764c46-54r7r" Dec 02 09:39:35 crc kubenswrapper[4781]: I1202 09:39:35.795202 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/090bc2d1-e1c5-4721-80ab-e20d4f3942c6-metrics-certs\") pod \"openstack-operator-controller-manager-686764c46-54r7r\" (UID: \"090bc2d1-e1c5-4721-80ab-e20d4f3942c6\") " pod="openstack-operators/openstack-operator-controller-manager-686764c46-54r7r" Dec 02 09:39:35 crc kubenswrapper[4781]: E1202 09:39:35.795333 4781 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 02 09:39:35 crc kubenswrapper[4781]: E1202 09:39:35.795385 4781 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 02 09:39:35 crc kubenswrapper[4781]: E1202 09:39:35.795419 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/090bc2d1-e1c5-4721-80ab-e20d4f3942c6-webhook-certs podName:090bc2d1-e1c5-4721-80ab-e20d4f3942c6 nodeName:}" failed. No retries permitted until 2025-12-02 09:39:39.795398181 +0000 UTC m=+1142.619272120 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/090bc2d1-e1c5-4721-80ab-e20d4f3942c6-webhook-certs") pod "openstack-operator-controller-manager-686764c46-54r7r" (UID: "090bc2d1-e1c5-4721-80ab-e20d4f3942c6") : secret "webhook-server-cert" not found Dec 02 09:39:35 crc kubenswrapper[4781]: E1202 09:39:35.795440 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/090bc2d1-e1c5-4721-80ab-e20d4f3942c6-metrics-certs podName:090bc2d1-e1c5-4721-80ab-e20d4f3942c6 nodeName:}" failed. No retries permitted until 2025-12-02 09:39:39.795434032 +0000 UTC m=+1142.619307911 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/090bc2d1-e1c5-4721-80ab-e20d4f3942c6-metrics-certs") pod "openstack-operator-controller-manager-686764c46-54r7r" (UID: "090bc2d1-e1c5-4721-80ab-e20d4f3942c6") : secret "metrics-server-cert" not found Dec 02 09:39:39 crc kubenswrapper[4781]: I1202 09:39:39.061337 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf3e832e-6140-4880-9efd-017837fc9990-cert\") pod \"infra-operator-controller-manager-57548d458d-lwgs7\" (UID: \"cf3e832e-6140-4880-9efd-017837fc9990\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-lwgs7" Dec 02 09:39:39 crc kubenswrapper[4781]: E1202 09:39:39.061513 4781 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 02 09:39:39 crc kubenswrapper[4781]: E1202 09:39:39.061953 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf3e832e-6140-4880-9efd-017837fc9990-cert podName:cf3e832e-6140-4880-9efd-017837fc9990 nodeName:}" failed. No retries permitted until 2025-12-02 09:39:47.061913656 +0000 UTC m=+1149.885787575 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cf3e832e-6140-4880-9efd-017837fc9990-cert") pod "infra-operator-controller-manager-57548d458d-lwgs7" (UID: "cf3e832e-6140-4880-9efd-017837fc9990") : secret "infra-operator-webhook-server-cert" not found Dec 02 09:39:39 crc kubenswrapper[4781]: I1202 09:39:39.365579 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cda1cc86-51ab-4070-96e9-98adba5d51c3-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4rgl7t\" (UID: \"cda1cc86-51ab-4070-96e9-98adba5d51c3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4rgl7t" Dec 02 09:39:39 crc kubenswrapper[4781]: E1202 09:39:39.365766 4781 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 09:39:39 crc kubenswrapper[4781]: E1202 09:39:39.366054 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cda1cc86-51ab-4070-96e9-98adba5d51c3-cert podName:cda1cc86-51ab-4070-96e9-98adba5d51c3 nodeName:}" failed. No retries permitted until 2025-12-02 09:39:47.366038357 +0000 UTC m=+1150.189912236 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cda1cc86-51ab-4070-96e9-98adba5d51c3-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4rgl7t" (UID: "cda1cc86-51ab-4070-96e9-98adba5d51c3") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 09:39:39 crc kubenswrapper[4781]: I1202 09:39:39.873394 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/090bc2d1-e1c5-4721-80ab-e20d4f3942c6-webhook-certs\") pod \"openstack-operator-controller-manager-686764c46-54r7r\" (UID: \"090bc2d1-e1c5-4721-80ab-e20d4f3942c6\") " pod="openstack-operators/openstack-operator-controller-manager-686764c46-54r7r" Dec 02 09:39:39 crc kubenswrapper[4781]: I1202 09:39:39.873447 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/090bc2d1-e1c5-4721-80ab-e20d4f3942c6-metrics-certs\") pod \"openstack-operator-controller-manager-686764c46-54r7r\" (UID: \"090bc2d1-e1c5-4721-80ab-e20d4f3942c6\") " pod="openstack-operators/openstack-operator-controller-manager-686764c46-54r7r" Dec 02 09:39:39 crc kubenswrapper[4781]: E1202 09:39:39.873568 4781 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 02 09:39:39 crc kubenswrapper[4781]: E1202 09:39:39.873677 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/090bc2d1-e1c5-4721-80ab-e20d4f3942c6-webhook-certs podName:090bc2d1-e1c5-4721-80ab-e20d4f3942c6 nodeName:}" failed. No retries permitted until 2025-12-02 09:39:47.873651098 +0000 UTC m=+1150.697524997 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/090bc2d1-e1c5-4721-80ab-e20d4f3942c6-webhook-certs") pod "openstack-operator-controller-manager-686764c46-54r7r" (UID: "090bc2d1-e1c5-4721-80ab-e20d4f3942c6") : secret "webhook-server-cert" not found Dec 02 09:39:39 crc kubenswrapper[4781]: E1202 09:39:39.873588 4781 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 02 09:39:39 crc kubenswrapper[4781]: E1202 09:39:39.873751 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/090bc2d1-e1c5-4721-80ab-e20d4f3942c6-metrics-certs podName:090bc2d1-e1c5-4721-80ab-e20d4f3942c6 nodeName:}" failed. No retries permitted until 2025-12-02 09:39:47.873734751 +0000 UTC m=+1150.697608690 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/090bc2d1-e1c5-4721-80ab-e20d4f3942c6-metrics-certs") pod "openstack-operator-controller-manager-686764c46-54r7r" (UID: "090bc2d1-e1c5-4721-80ab-e20d4f3942c6") : secret "metrics-server-cert" not found Dec 02 09:39:47 crc kubenswrapper[4781]: I1202 09:39:47.102235 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf3e832e-6140-4880-9efd-017837fc9990-cert\") pod \"infra-operator-controller-manager-57548d458d-lwgs7\" (UID: \"cf3e832e-6140-4880-9efd-017837fc9990\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-lwgs7" Dec 02 09:39:47 crc kubenswrapper[4781]: E1202 09:39:47.102462 4781 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 02 09:39:47 crc kubenswrapper[4781]: E1202 09:39:47.102883 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf3e832e-6140-4880-9efd-017837fc9990-cert podName:cf3e832e-6140-4880-9efd-017837fc9990 nodeName:}" failed. No retries permitted until 2025-12-02 09:40:03.102861577 +0000 UTC m=+1165.926735456 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cf3e832e-6140-4880-9efd-017837fc9990-cert") pod "infra-operator-controller-manager-57548d458d-lwgs7" (UID: "cf3e832e-6140-4880-9efd-017837fc9990") : secret "infra-operator-webhook-server-cert" not found Dec 02 09:39:47 crc kubenswrapper[4781]: I1202 09:39:47.406895 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cda1cc86-51ab-4070-96e9-98adba5d51c3-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4rgl7t\" (UID: \"cda1cc86-51ab-4070-96e9-98adba5d51c3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4rgl7t" Dec 02 09:39:47 crc kubenswrapper[4781]: I1202 09:39:47.415748 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cda1cc86-51ab-4070-96e9-98adba5d51c3-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4rgl7t\" (UID: \"cda1cc86-51ab-4070-96e9-98adba5d51c3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4rgl7t" Dec 02 09:39:47 crc kubenswrapper[4781]: I1202 09:39:47.500134 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-sgx5j" Dec 02 09:39:47 crc kubenswrapper[4781]: I1202 09:39:47.508589 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4rgl7t" Dec 02 09:39:47 crc kubenswrapper[4781]: E1202 09:39:47.812784 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670" Dec 02 09:39:47 crc kubenswrapper[4781]: E1202 09:39:47.813391 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kpl62,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-chbnm_openstack-operators(71cfd08b-278c-4f9c-b0fd-198c662ef00d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 09:39:47 crc kubenswrapper[4781]: I1202 09:39:47.913740 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/090bc2d1-e1c5-4721-80ab-e20d4f3942c6-webhook-certs\") pod \"openstack-operator-controller-manager-686764c46-54r7r\" (UID: \"090bc2d1-e1c5-4721-80ab-e20d4f3942c6\") " pod="openstack-operators/openstack-operator-controller-manager-686764c46-54r7r" Dec 02 09:39:47 crc kubenswrapper[4781]: I1202 09:39:47.913802 4781 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/090bc2d1-e1c5-4721-80ab-e20d4f3942c6-metrics-certs\") pod \"openstack-operator-controller-manager-686764c46-54r7r\" (UID: \"090bc2d1-e1c5-4721-80ab-e20d4f3942c6\") " pod="openstack-operators/openstack-operator-controller-manager-686764c46-54r7r" Dec 02 09:39:47 crc kubenswrapper[4781]: I1202 09:39:47.920906 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/090bc2d1-e1c5-4721-80ab-e20d4f3942c6-metrics-certs\") pod \"openstack-operator-controller-manager-686764c46-54r7r\" (UID: \"090bc2d1-e1c5-4721-80ab-e20d4f3942c6\") " pod="openstack-operators/openstack-operator-controller-manager-686764c46-54r7r" Dec 02 09:39:47 crc kubenswrapper[4781]: I1202 09:39:47.932602 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/090bc2d1-e1c5-4721-80ab-e20d4f3942c6-webhook-certs\") pod \"openstack-operator-controller-manager-686764c46-54r7r\" (UID: \"090bc2d1-e1c5-4721-80ab-e20d4f3942c6\") " pod="openstack-operators/openstack-operator-controller-manager-686764c46-54r7r" Dec 02 09:39:48 crc kubenswrapper[4781]: I1202 09:39:48.059591 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-p4qqd" Dec 02 09:39:48 crc kubenswrapper[4781]: I1202 09:39:48.069522 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-686764c46-54r7r" Dec 02 09:39:48 crc kubenswrapper[4781]: I1202 09:39:48.432400 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4rgl7t"] Dec 02 09:39:48 crc kubenswrapper[4781]: I1202 09:39:48.467293 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-686764c46-54r7r"] Dec 02 09:39:48 crc kubenswrapper[4781]: E1202 09:39:48.541170 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bksj6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-5d9tj_openstack-operators(1c5a954b-d61b-4d33-a043-407f8de059a6): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 09:39:48 crc kubenswrapper[4781]: E1202 09:39:48.541557 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bxddd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-7c79b5df47-5lhcx_openstack-operators(f1be99ff-4068-4454-b75d-770951e9fedd): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 09:39:48 crc kubenswrapper[4781]: E1202 09:39:48.541624 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sdlhr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-5fdfd5b6b5-rf4bw_openstack-operators(7b8c261d-133c-4a73-9424-3233e6701fff): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 09:39:48 crc kubenswrapper[4781]: E1202 09:39:48.541705 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b6kg9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-8mm6f_openstack-operators(9d08e6b1-b9b9-4a7e-a859-e98f904e2588): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 09:39:48 crc kubenswrapper[4781]: E1202 09:39:48.541781 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wrh69,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-7d9dfd778-8tkh8_openstack-operators(f72fc870-291d-4800-a316-22de56b2ebbd): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 09:39:48 crc kubenswrapper[4781]: E1202 09:39:48.545007 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-8tkh8" podUID="f72fc870-291d-4800-a316-22de56b2ebbd" Dec 02 09:39:48 crc kubenswrapper[4781]: E1202 09:39:48.545067 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5854674fcc-5d9tj" podUID="1c5a954b-d61b-4d33-a043-407f8de059a6" Dec 02 09:39:48 crc kubenswrapper[4781]: E1202 09:39:48.545088 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-5lhcx" podUID="f1be99ff-4068-4454-b75d-770951e9fedd" Dec 02 09:39:48 crc kubenswrapper[4781]: E1202 09:39:48.545105 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-rf4bw" podUID="7b8c261d-133c-4a73-9424-3233e6701fff" Dec 02 09:39:48 crc kubenswrapper[4781]: E1202 09:39:48.545121 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-8mm6f" podUID="9d08e6b1-b9b9-4a7e-a859-e98f904e2588" Dec 02 09:39:49 crc kubenswrapper[4781]: I1202 09:39:49.125747 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4rgl7t" event={"ID":"cda1cc86-51ab-4070-96e9-98adba5d51c3","Type":"ContainerStarted","Data":"9ea705afacaf3fc17b69b2a253cfdb7ffecaea0c720f895414c6fb02da2b8fcd"} Dec 02 09:39:49 crc kubenswrapper[4781]: I1202 09:39:49.144510 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-rf4bw" event={"ID":"7b8c261d-133c-4a73-9424-3233e6701fff","Type":"ContainerStarted","Data":"5cf85b5df96a2094915aa129f1fa02425166a1cff8c677670c1fcb8f786e880b"} Dec 02 09:39:49 crc kubenswrapper[4781]: I1202 09:39:49.145536 4781 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-rf4bw" Dec 02 09:39:49 crc kubenswrapper[4781]: E1202 09:39:49.148300 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-rf4bw" podUID="7b8c261d-133c-4a73-9424-3233e6701fff" Dec 02 09:39:49 crc kubenswrapper[4781]: I1202 09:39:49.161468 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-5lhcx" event={"ID":"f1be99ff-4068-4454-b75d-770951e9fedd","Type":"ContainerStarted","Data":"3ea0bd488474e8a6f5b6929186bd0dc054f24d051b0ec0b5c3179cbc6f0ac8c7"} Dec 02 09:39:49 crc kubenswrapper[4781]: I1202 09:39:49.162012 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-5lhcx" Dec 02 09:39:49 crc kubenswrapper[4781]: E1202 09:39:49.163501 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-5lhcx" podUID="f1be99ff-4068-4454-b75d-770951e9fedd" Dec 02 09:39:49 crc kubenswrapper[4781]: I1202 09:39:49.166171 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-dz7tw" event={"ID":"de57f174-daf9-483d-bac6-e735d25f9d64","Type":"ContainerStarted","Data":"b5c681bdbb0341100217fa3b90f0511671a421ac2d84b7e42be1681c80fc1006"} Dec 02 09:39:49 crc kubenswrapper[4781]: I1202 09:39:49.182122 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-qpdqt" event={"ID":"428e69aa-23f5-4d45-8c18-65ac62c6756c","Type":"ContainerStarted","Data":"aa827a3c6a009acaf08681843d95300e80ab71ce0b5c98af8468bda29970c5a5"} Dec 02 09:39:49 crc kubenswrapper[4781]: I1202 09:39:49.192979 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-wcvq5" event={"ID":"0a36fb64-e101-44af-a6f9-91fb68fc1e7a","Type":"ContainerStarted","Data":"805e5cecc7c7a84d08755106cad6880270aa699343e85cfe98fabb193572d4b2"} Dec 02 09:39:49 crc kubenswrapper[4781]: I1202 09:39:49.224168 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-5d9tj" event={"ID":"1c5a954b-d61b-4d33-a043-407f8de059a6","Type":"ContainerStarted","Data":"45fbf90b649391da10425db20ac2da2cc22ab3352ea6699b52d8d6d4761c95bd"} Dec 02 09:39:49 crc kubenswrapper[4781]: I1202 09:39:49.224891 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-5d9tj" Dec 02 09:39:49 crc kubenswrapper[4781]: E1202 09:39:49.231991 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/test-operator-controller-manager-5854674fcc-5d9tj" podUID="1c5a954b-d61b-4d33-a043-407f8de059a6" Dec 02 
09:39:49 crc kubenswrapper[4781]: I1202 09:39:49.237730 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-7q6gr" event={"ID":"eb3d207d-118c-42b8-9e9a-103a041a44b3","Type":"ContainerStarted","Data":"96beb45946aefdf0f772e68f6cc7e69e56dcd589b3ed27d1a6c90f2da688a83d"} Dec 02 09:39:49 crc kubenswrapper[4781]: I1202 09:39:49.253873 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-68mk9" event={"ID":"22dc2882-6dfd-4f1c-90ee-06ac4c9e0aa5","Type":"ContainerStarted","Data":"43587426e7efee20748c2319cf794789390ca40d3839a650f458823d9b4af4cc"} Dec 02 09:39:49 crc kubenswrapper[4781]: I1202 09:39:49.265593 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-jc6cp" event={"ID":"812edfc0-b0b7-40c7-913d-b176bd6817f3","Type":"ContainerStarted","Data":"4cb2d255c0f1501ad1c9aca408b934464c01d84183c82b5aa50de2d017dee607"} Dec 02 09:39:49 crc kubenswrapper[4781]: I1202 09:39:49.290183 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-8mm6f" event={"ID":"9d08e6b1-b9b9-4a7e-a859-e98f904e2588","Type":"ContainerStarted","Data":"7a5747b786dcca618c6d4ee17b582d7c5a893312c59b3c7834cefde4e96ec5af"} Dec 02 09:39:49 crc kubenswrapper[4781]: I1202 09:39:49.290886 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-8mm6f" Dec 02 09:39:49 crc kubenswrapper[4781]: E1202 09:39:49.294325 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-8mm6f" podUID="9d08e6b1-b9b9-4a7e-a859-e98f904e2588" Dec 02 09:39:49 crc kubenswrapper[4781]: I1202 09:39:49.325994 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-8rtwl" event={"ID":"ca4903e5-bed6-47c2-82a5-4376b162ec96","Type":"ContainerStarted","Data":"8a700d87f7c19b68162596c65d1e9fbbdcd81c6fb0037c6b372df74be5add3b9"} Dec 02 09:39:49 crc kubenswrapper[4781]: I1202 09:39:49.374852 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-8tkh8" event={"ID":"f72fc870-291d-4800-a316-22de56b2ebbd","Type":"ContainerStarted","Data":"6573211781fc78cea1cf41676dfd86a1fbf822d069538cdae5688d3217810353"} Dec 02 09:39:49 crc kubenswrapper[4781]: I1202 09:39:49.375802 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-8tkh8" Dec 02 09:39:49 crc kubenswrapper[4781]: E1202 09:39:49.416188 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-8tkh8" podUID="f72fc870-291d-4800-a316-22de56b2ebbd" Dec 02 09:39:49 crc kubenswrapper[4781]: I1202 09:39:49.433117 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-686764c46-54r7r" 
event={"ID":"090bc2d1-e1c5-4721-80ab-e20d4f3942c6","Type":"ContainerStarted","Data":"bf17a65b87ae97a811390427d26e51ab7874de58a5dfa7699aec02933a109150"} Dec 02 09:39:49 crc kubenswrapper[4781]: I1202 09:39:49.433161 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-686764c46-54r7r" event={"ID":"090bc2d1-e1c5-4721-80ab-e20d4f3942c6","Type":"ContainerStarted","Data":"bfe2a9c0ea585afbe68fdac0a59336cb1098869a802845c3c8cb2540e3988070"} Dec 02 09:39:49 crc kubenswrapper[4781]: I1202 09:39:49.433373 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-686764c46-54r7r" Dec 02 09:39:49 crc kubenswrapper[4781]: I1202 09:39:49.564459 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-686764c46-54r7r" podStartSLOduration=18.564440034 podStartE2EDuration="18.564440034s" podCreationTimestamp="2025-12-02 09:39:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:39:49.562367939 +0000 UTC m=+1152.386241838" watchObservedRunningTime="2025-12-02 09:39:49.564440034 +0000 UTC m=+1152.388313913" Dec 02 09:39:50 crc kubenswrapper[4781]: E1202 09:39:50.447244 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-5lhcx" podUID="f1be99ff-4068-4454-b75d-770951e9fedd" Dec 02 09:39:50 crc kubenswrapper[4781]: E1202 09:39:50.447261 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-rf4bw" podUID="7b8c261d-133c-4a73-9424-3233e6701fff" Dec 02 09:39:50 crc kubenswrapper[4781]: E1202 09:39:50.447264 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-8mm6f" podUID="9d08e6b1-b9b9-4a7e-a859-e98f904e2588" Dec 02 09:39:50 crc kubenswrapper[4781]: E1202 09:39:50.447322 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-8tkh8" podUID="f72fc870-291d-4800-a316-22de56b2ebbd" Dec 02 09:39:50 crc kubenswrapper[4781]: E1202 09:39:50.447332 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/test-operator-controller-manager-5854674fcc-5d9tj" podUID="1c5a954b-d61b-4d33-a043-407f8de059a6" Dec 02 09:39:58 crc kubenswrapper[4781]: I1202 09:39:58.079765 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-operator-controller-manager-686764c46-54r7r" Dec 02 09:40:00 crc kubenswrapper[4781]: I1202 09:40:00.411638 4781 patch_prober.go:28] interesting pod/machine-config-daemon-pzntm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 09:40:00 crc kubenswrapper[4781]: I1202 09:40:00.411692 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 09:40:00 crc kubenswrapper[4781]: I1202 09:40:00.501190 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-5d9tj" Dec 02 09:40:01 crc kubenswrapper[4781]: I1202 09:40:01.351271 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-8tkh8" Dec 02 09:40:01 crc kubenswrapper[4781]: I1202 09:40:01.511079 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-8mm6f" Dec 02 09:40:01 crc kubenswrapper[4781]: I1202 09:40:01.620083 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-5lhcx" Dec 02 09:40:01 crc kubenswrapper[4781]: I1202 09:40:01.799096 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-rf4bw" Dec 02 09:40:03 crc kubenswrapper[4781]: I1202 09:40:03.174739 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf3e832e-6140-4880-9efd-017837fc9990-cert\") pod \"infra-operator-controller-manager-57548d458d-lwgs7\" (UID: \"cf3e832e-6140-4880-9efd-017837fc9990\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-lwgs7" Dec 02 09:40:03 crc kubenswrapper[4781]: I1202 09:40:03.181233 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf3e832e-6140-4880-9efd-017837fc9990-cert\") pod \"infra-operator-controller-manager-57548d458d-lwgs7\" (UID: \"cf3e832e-6140-4880-9efd-017837fc9990\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-lwgs7" Dec 02 09:40:03 crc kubenswrapper[4781]: I1202 09:40:03.276078 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-8qlgv" Dec 02 09:40:03 crc kubenswrapper[4781]: I1202 09:40:03.283572 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-lwgs7" Dec 02 09:40:16 crc kubenswrapper[4781]: E1202 09:40:16.135201 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:14cfad6ea2e7f7ecc4cb2aafceb9c61514b3d04b66668832d1e4ac3b19f1ab81" Dec 02 09:40:16 crc kubenswrapper[4781]: E1202 09:40:16.136656 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:14cfad6ea2e7f7ecc4cb2aafceb9c61514b3d04b66668832d1e4ac3b19f1ab81,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IM
AGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLOUDKITTY_API_IMAGE_URL_DEFAULT,Value:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLOUDKITTY_PROC_IMAGE_URL_DEFAULT,Value:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-processor:current,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheu
s/node-exporter:v1.5.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DE
FAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IM
AGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pxq96,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-64bc77cfd4rgl7t_openstack-operators(cda1cc86-51ab-4070-96e9-98adba5d51c3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 09:40:16 crc kubenswrapper[4781]: E1202 09:40:16.677319 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7" Dec 02 09:40:16 crc kubenswrapper[4781]: E1202 09:40:16.677556 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gtbkh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-56bbcc9d85-8xclz_openstack-operators(c917a8ec-2bd5-4f7b-8948-a4bed859e01f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 09:40:18 crc kubenswrapper[4781]: E1202 09:40:18.485663 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Dec 02 09:40:18 crc kubenswrapper[4781]: E1202 09:40:18.486100 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ph4dr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-7gsfd_openstack-operators(8c93d2b8-28d1-4ea7-86aa-fe3f8ac35e61): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 09:40:18 crc kubenswrapper[4781]: E1202 09:40:18.487313 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7gsfd" podUID="8c93d2b8-28d1-4ea7-86aa-fe3f8ac35e61" Dec 02 09:40:19 crc kubenswrapper[4781]: E1202 09:40:19.068110 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 02 09:40:19 crc kubenswrapper[4781]: E1202 09:40:19.068279 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-t6khk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-5f64f6f8bb-wcvq5_openstack-operators(0a36fb64-e101-44af-a6f9-91fb68fc1e7a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 09:40:19 crc kubenswrapper[4781]: E1202 09:40:19.069496 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-wcvq5" podUID="0a36fb64-e101-44af-a6f9-91fb68fc1e7a" Dec 02 09:40:19 crc kubenswrapper[4781]: E1202 09:40:19.112448 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 02 09:40:19 crc kubenswrapper[4781]: E1202 09:40:19.112630 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bcdz4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-859b6ccc6-jc6cp_openstack-operators(812edfc0-b0b7-40c7-913d-b176bd6817f3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 09:40:19 crc kubenswrapper[4781]: 
E1202 09:40:19.113856 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-jc6cp" podUID="812edfc0-b0b7-40c7-913d-b176bd6817f3" Dec 02 09:40:19 crc kubenswrapper[4781]: E1202 09:40:19.129961 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 02 09:40:19 crc kubenswrapper[4781]: E1202 09:40:19.130127 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dntk6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-8rtwl_openstack-operators(ca4903e5-bed6-47c2-82a5-4376b162ec96): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 09:40:19 crc kubenswrapper[4781]: E1202 09:40:19.131446 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-78f8948974-8rtwl" podUID="ca4903e5-bed6-47c2-82a5-4376b162ec96" Dec 02 09:40:19 crc kubenswrapper[4781]: E1202 09:40:19.274586 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 02 09:40:19 crc kubenswrapper[4781]: E1202 09:40:19.274748 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 
-3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-82dv8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-68mk9_openstack-operators(22dc2882-6dfd-4f1c-90ee-06ac4c9e0aa5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 09:40:19 crc kubenswrapper[4781]: E1202 09:40:19.276954 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-68mk9" podUID="22dc2882-6dfd-4f1c-90ee-06ac4c9e0aa5" Dec 02 09:40:19 crc kubenswrapper[4781]: E1202 09:40:19.431100 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 02 09:40:19 crc kubenswrapper[4781]: E1202 09:40:19.431575 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nmhwg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-77987cd8cd-dz7tw_openstack-operators(de57f174-daf9-483d-bac6-e735d25f9d64): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 09:40:19 crc kubenswrapper[4781]: E1202 09:40:19.432856 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-dz7tw" podUID="de57f174-daf9-483d-bac6-e735d25f9d64" Dec 02 09:40:19 crc kubenswrapper[4781]: I1202 09:40:19.449005 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-lwgs7"] Dec 02 09:40:19 crc kubenswrapper[4781]: E1202 09:40:19.508682 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 02 09:40:19 crc kubenswrapper[4781]: E1202 09:40:19.509342 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wpb7x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-78b4bc895b-7q6gr_openstack-operators(eb3d207d-118c-42b8-9e9a-103a041a44b3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 09:40:19 crc kubenswrapper[4781]: E1202 09:40:19.511519 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-7q6gr" podUID="eb3d207d-118c-42b8-9e9a-103a041a44b3" Dec 02 09:40:19 crc kubenswrapper[4781]: W1202 09:40:19.513587 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf3e832e_6140_4880_9efd_017837fc9990.slice/crio-13b362b62bd689b1abb4ee82373a6ee480f44c97cfef3322a4b2dfc8ffd92ecf WatchSource:0}: Error finding container 13b362b62bd689b1abb4ee82373a6ee480f44c97cfef3322a4b2dfc8ffd92ecf: Status 404 returned error 
can't find the container with id 13b362b62bd689b1abb4ee82373a6ee480f44c97cfef3322a4b2dfc8ffd92ecf Dec 02 09:40:19 crc kubenswrapper[4781]: E1202 09:40:19.614626 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 02 09:40:19 crc kubenswrapper[4781]: E1202 09:40:19.614972 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-r2knr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-6c548fd776-qpdqt_openstack-operators(428e69aa-23f5-4d45-8c18-65ac62c6756c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 09:40:19 crc kubenswrapper[4781]: E1202 09:40:19.616218 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-qpdqt" podUID="428e69aa-23f5-4d45-8c18-65ac62c6756c" Dec 02 09:40:19 crc kubenswrapper[4781]: I1202 09:40:19.660520 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-lwgs7" event={"ID":"cf3e832e-6140-4880-9efd-017837fc9990","Type":"ContainerStarted","Data":"13b362b62bd689b1abb4ee82373a6ee480f44c97cfef3322a4b2dfc8ffd92ecf"} Dec 02 09:40:19 crc kubenswrapper[4781]: I1202 09:40:19.671678 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gplp6" event={"ID":"15c8fb6a-82b4-4e87-ae9c-6efc2afc5c89","Type":"ContainerStarted","Data":"073fa156132ed90362853c4c75c7dad8dd6601082a3b64eafa8606fcf14cd310"} Dec 02 09:40:19 crc kubenswrapper[4781]: I1202 09:40:19.676263 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-4dtmh" 
event={"ID":"7f59a11b-4502-4c7b-94e7-3fbb6bac2222","Type":"ContainerStarted","Data":"75eceb99ef48d89c97c80d98b3df7cd2c9bf0730890a3b8fdc007a3c661990be"} Dec 02 09:40:19 crc kubenswrapper[4781]: I1202 09:40:19.677473 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-8rtwl" Dec 02 09:40:19 crc kubenswrapper[4781]: I1202 09:40:19.677544 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-7q6gr" Dec 02 09:40:19 crc kubenswrapper[4781]: I1202 09:40:19.679150 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-qpdqt" Dec 02 09:40:19 crc kubenswrapper[4781]: I1202 09:40:19.680988 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-qpdqt" Dec 02 09:40:19 crc kubenswrapper[4781]: I1202 09:40:19.682365 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-8rtwl" Dec 02 09:40:19 crc kubenswrapper[4781]: I1202 09:40:19.682504 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-7q6gr" Dec 02 09:40:19 crc kubenswrapper[4781]: E1202 09:40:19.929310 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-8xclz" podUID="c917a8ec-2bd5-4f7b-8948-a4bed859e01f" Dec 02 09:40:20 crc kubenswrapper[4781]: E1202 09:40:20.017527 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4rgl7t" podUID="cda1cc86-51ab-4070-96e9-98adba5d51c3" Dec 02 09:40:20 crc kubenswrapper[4781]: E1202 09:40:20.044046 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-chbnm" podUID="71cfd08b-278c-4f9c-b0fd-198c662ef00d" Dec 02 09:40:20 crc kubenswrapper[4781]: I1202 09:40:20.690265 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-8rtwl" event={"ID":"ca4903e5-bed6-47c2-82a5-4376b162ec96","Type":"ContainerStarted","Data":"863e137904b33859d03af01daef8b9f5976651548cc1ac62cf380a5c77ff8172"} Dec 02 09:40:20 crc kubenswrapper[4781]: I1202 09:40:20.695087 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-xtks6" event={"ID":"cfd47a1e-a773-4479-9656-abb353f87fe9","Type":"ContainerStarted","Data":"20bb45da14941afa09f22f361362e5e3d93698b8f48211ef4a4e407152fb96c0"} Dec 02 09:40:20 crc kubenswrapper[4781]: I1202 09:40:20.695414 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-xtks6" 
event={"ID":"cfd47a1e-a773-4479-9656-abb353f87fe9","Type":"ContainerStarted","Data":"ddc8374c0e67c776e118241fa7e3f08f3f7c207ccbde6fec5378a7aa9a8f2c33"} Dec 02 09:40:20 crc kubenswrapper[4781]: I1202 09:40:20.696099 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-xtks6" Dec 02 09:40:20 crc kubenswrapper[4781]: I1202 09:40:20.698135 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-5d9tj" event={"ID":"1c5a954b-d61b-4d33-a043-407f8de059a6","Type":"ContainerStarted","Data":"53328bbe0a6a66f0550f197792d7638e8cbe43a07d819adfdbdd520d16f46d51"} Dec 02 09:40:20 crc kubenswrapper[4781]: I1202 09:40:20.701962 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-jc6cp" event={"ID":"812edfc0-b0b7-40c7-913d-b176bd6817f3","Type":"ContainerStarted","Data":"dd5d8260319945e6b2aa4693d4a1501bd13e132b292752150ffa16f3fbd24f88"} Dec 02 09:40:20 crc kubenswrapper[4781]: I1202 09:40:20.702276 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-jc6cp" Dec 02 09:40:20 crc kubenswrapper[4781]: I1202 09:40:20.708020 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4rgl7t" event={"ID":"cda1cc86-51ab-4070-96e9-98adba5d51c3","Type":"ContainerStarted","Data":"b33b68b8e39927c81e1d8fd12cf31c758380cee9c59df1ae5545b2cbf5249260"} Dec 02 09:40:20 crc kubenswrapper[4781]: I1202 09:40:20.708645 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-jc6cp" Dec 02 09:40:20 crc kubenswrapper[4781]: E1202 09:40:20.711721 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:14cfad6ea2e7f7ecc4cb2aafceb9c61514b3d04b66668832d1e4ac3b19f1ab81\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4rgl7t" podUID="cda1cc86-51ab-4070-96e9-98adba5d51c3" Dec 02 09:40:20 crc kubenswrapper[4781]: I1202 09:40:20.714901 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-8rtwl" podStartSLOduration=35.186002793 podStartE2EDuration="49.714881925s" podCreationTimestamp="2025-12-02 09:39:31 +0000 UTC" firstStartedPulling="2025-12-02 09:39:33.25007998 +0000 UTC m=+1136.073953859" lastFinishedPulling="2025-12-02 09:39:47.778959112 +0000 UTC m=+1150.602832991" observedRunningTime="2025-12-02 09:40:20.713063127 +0000 UTC m=+1183.536937006" watchObservedRunningTime="2025-12-02 09:40:20.714881925 +0000 UTC m=+1183.538755804" Dec 02 09:40:20 crc kubenswrapper[4781]: I1202 09:40:20.716027 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-4dtmh" event={"ID":"7f59a11b-4502-4c7b-94e7-3fbb6bac2222","Type":"ContainerStarted","Data":"9df439e0351497b005dc9373a89528ebb7ce18bbb454d43198527e4c12027aa6"} Dec 02 09:40:20 crc kubenswrapper[4781]: I1202 09:40:20.716946 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-4dtmh" Dec 02 09:40:20 crc kubenswrapper[4781]: I1202 09:40:20.734812 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-qpdqt" event={"ID":"428e69aa-23f5-4d45-8c18-65ac62c6756c","Type":"ContainerStarted","Data":"a7214cdc47a5b98d0db68f0e92e1372094d51afbdafccd4b19bc3f9996fb47a3"} Dec 02 09:40:20 crc kubenswrapper[4781]: I1202 09:40:20.745677 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-8mm6f" event={"ID":"9d08e6b1-b9b9-4a7e-a859-e98f904e2588","Type":"ContainerStarted","Data":"29da397414cf575c0e09664b67665101103a0c823f5b6e85e963e93130e61b0e"} Dec 02 09:40:20 crc kubenswrapper[4781]: I1202 09:40:20.762133 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-wcvq5" event={"ID":"0a36fb64-e101-44af-a6f9-91fb68fc1e7a","Type":"ContainerStarted","Data":"26573efd56e066fadfcda0bc1d0b863447c5757a72270a23d04bf09f77a8d932"} Dec 02 09:40:20 crc kubenswrapper[4781]: I1202 09:40:20.764252 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-wcvq5" Dec 02 09:40:20 crc kubenswrapper[4781]: I1202 09:40:20.776351 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-jc6cp" podStartSLOduration=34.813221186 podStartE2EDuration="49.776327296s" podCreationTimestamp="2025-12-02 09:39:31 +0000 UTC" firstStartedPulling="2025-12-02 09:39:33.012104757 +0000 UTC m=+1135.835978636" lastFinishedPulling="2025-12-02 09:39:47.975210867 +0000 UTC m=+1150.799084746" observedRunningTime="2025-12-02 09:40:20.775113473 +0000 UTC m=+1183.598987342" watchObservedRunningTime="2025-12-02 09:40:20.776327296 +0000 UTC m=+1183.600201175" Dec 02 09:40:20 crc kubenswrapper[4781]: I1202 09:40:20.779022 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-wcvq5" Dec 02 09:40:20 crc kubenswrapper[4781]: I1202 09:40:20.787941 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-pkkst" event={"ID":"bbcf3ac8-e087-4a1c-b9f8-2263e15a73ff","Type":"ContainerStarted","Data":"843fd3ca8045832105c7e46ba2681efdc882043bb4b893e94d6dfe6654d9d1ae"} Dec 02 09:40:20 crc kubenswrapper[4781]: I1202 09:40:20.788287 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-pkkst" event={"ID":"bbcf3ac8-e087-4a1c-b9f8-2263e15a73ff","Type":"ContainerStarted","Data":"5cb0c1fd0f3b4659578eb4694aed2c41e691d81bafa2f8f61c20a219355f27b5"} Dec 02 09:40:20 crc kubenswrapper[4781]: I1202 09:40:20.788948 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-pkkst" Dec 02 09:40:20 crc kubenswrapper[4781]: I1202 09:40:20.801910 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-68mk9" event={"ID":"22dc2882-6dfd-4f1c-90ee-06ac4c9e0aa5","Type":"ContainerStarted","Data":"dbb7a9d93eef1926d5690d3b97b6ca81b43e3f9694443f41e45f524f66aadb6b"} Dec 02 09:40:20 crc kubenswrapper[4781]: I1202 09:40:20.802689 4781 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-68mk9" Dec 02 09:40:20 crc kubenswrapper[4781]: I1202 09:40:20.816415 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-7q6gr" event={"ID":"eb3d207d-118c-42b8-9e9a-103a041a44b3","Type":"ContainerStarted","Data":"d28b8c9d3d38fd1b95891c1e79f34d43bca9f50032125fce8cec6980705b7e32"} Dec 02 09:40:20 crc kubenswrapper[4781]: I1202 09:40:20.819507 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-68mk9" Dec 02 09:40:20 crc kubenswrapper[4781]: I1202 09:40:20.839216 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gplp6" event={"ID":"15c8fb6a-82b4-4e87-ae9c-6efc2afc5c89","Type":"ContainerStarted","Data":"4078c43efc7f2fffe7f1e2d4456bb97a96283479c7ba998edc14872511f065eb"} Dec 02 09:40:20 crc kubenswrapper[4781]: I1202 09:40:20.839992 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gplp6" Dec 02 09:40:20 crc kubenswrapper[4781]: I1202 09:40:20.855531 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-rf4bw" event={"ID":"7b8c261d-133c-4a73-9424-3233e6701fff","Type":"ContainerStarted","Data":"ca7678fe7352fc8ed927b72525fe43ce5d7de1bc61e681868fec6a73281ff41b"} Dec 02 09:40:20 crc kubenswrapper[4781]: I1202 09:40:20.874544 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-5lhcx" event={"ID":"f1be99ff-4068-4454-b75d-770951e9fedd","Type":"ContainerStarted","Data":"2450b7678d881c86058728b02482b19e8a074b0b3ffafcbcc7a1ccd59c337e71"} Dec 02 09:40:20 crc kubenswrapper[4781]: I1202 09:40:20.904475 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-5d9tj" podStartSLOduration=3.793650063 podStartE2EDuration="49.904448629s" podCreationTimestamp="2025-12-02 09:39:31 +0000 UTC" firstStartedPulling="2025-12-02 09:39:33.299619909 +0000 UTC m=+1136.123493788" lastFinishedPulling="2025-12-02 09:40:19.410418475 +0000 UTC m=+1182.234292354" observedRunningTime="2025-12-02 09:40:20.874220332 +0000 UTC m=+1183.698094211" watchObservedRunningTime="2025-12-02 09:40:20.904448629 +0000 UTC m=+1183.728322518" Dec 02 09:40:20 crc kubenswrapper[4781]: I1202 09:40:20.916543 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-8tkh8" event={"ID":"f72fc870-291d-4800-a316-22de56b2ebbd","Type":"ContainerStarted","Data":"60670f8d6749f1990de9c58bb15d63025596b7a700ff352a0f4ef9e7d4a46115"} Dec 02 09:40:20 crc kubenswrapper[4781]: I1202 09:40:20.925368 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-xtks6" podStartSLOduration=5.626786693 podStartE2EDuration="49.925342624s" podCreationTimestamp="2025-12-02 09:39:31 +0000 UTC" firstStartedPulling="2025-12-02 09:39:33.338432438 +0000 UTC m=+1136.162306317" lastFinishedPulling="2025-12-02 09:40:17.636988369 +0000 UTC m=+1180.460862248" observedRunningTime="2025-12-02 09:40:20.913254648 +0000 UTC m=+1183.737128527" 
watchObservedRunningTime="2025-12-02 09:40:20.925342624 +0000 UTC m=+1183.749216503" Dec 02 09:40:20 crc kubenswrapper[4781]: I1202 09:40:20.927618 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-chbnm" event={"ID":"71cfd08b-278c-4f9c-b0fd-198c662ef00d","Type":"ContainerStarted","Data":"c5de6789a1586051a3fdbd1dba9b522a748b095ee54928c9173ca4d82ba04b34"} Dec 02 09:40:20 crc kubenswrapper[4781]: I1202 09:40:20.959545 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-dz7tw" event={"ID":"de57f174-daf9-483d-bac6-e735d25f9d64","Type":"ContainerStarted","Data":"823fc4b02b69e87713d8a73989d06bc55d93042dc949173afa2c2eaee945bfb2"} Dec 02 09:40:20 crc kubenswrapper[4781]: I1202 09:40:20.961133 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-dz7tw" Dec 02 09:40:20 crc kubenswrapper[4781]: I1202 09:40:20.989876 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-dz7tw" Dec 02 09:40:21 crc kubenswrapper[4781]: I1202 09:40:21.016103 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gplp6" podStartSLOduration=5.286295091 podStartE2EDuration="50.016073837s" podCreationTimestamp="2025-12-02 09:39:31 +0000 UTC" firstStartedPulling="2025-12-02 09:39:33.300085782 +0000 UTC m=+1136.123959661" lastFinishedPulling="2025-12-02 09:40:18.029864528 +0000 UTC m=+1180.853738407" observedRunningTime="2025-12-02 09:40:20.950121964 +0000 UTC m=+1183.773995853" watchObservedRunningTime="2025-12-02 09:40:21.016073837 +0000 UTC m=+1183.839947716" Dec 02 09:40:21 crc kubenswrapper[4781]: I1202 09:40:21.018656 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-8xclz" event={"ID":"c917a8ec-2bd5-4f7b-8948-a4bed859e01f","Type":"ContainerStarted","Data":"26ba6461f9adf05c6ccb028601e95c5e22b3b0ee865ff2d312359c7c3fb50a0c"} Dec 02 09:40:21 crc kubenswrapper[4781]: E1202 09:40:21.046267 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-8xclz" podUID="c917a8ec-2bd5-4f7b-8948-a4bed859e01f" Dec 02 09:40:21 crc kubenswrapper[4781]: I1202 09:40:21.054579 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-fq4g5" event={"ID":"15aa56a1-c9d3-4e48-a0fe-19e593320728","Type":"ContainerStarted","Data":"f805517a0413f3e2046c86d0417f1424c05bf15c9741f17e2cd77a90b8e36598"} Dec 02 09:40:21 crc kubenswrapper[4781]: I1202 09:40:21.054620 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-fq4g5" event={"ID":"15aa56a1-c9d3-4e48-a0fe-19e593320728","Type":"ContainerStarted","Data":"5c22c71c789901620697eafd62646854202148f23ce2bd546c05a21b8f2584df"} Dec 02 09:40:21 crc kubenswrapper[4781]: I1202 09:40:21.055332 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-fq4g5" Dec 02 09:40:21 crc kubenswrapper[4781]: I1202 09:40:21.059738 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-wcvq5" podStartSLOduration=35.108336662 podStartE2EDuration="50.059715106s" podCreationTimestamp="2025-12-02 09:39:31 +0000 UTC" firstStartedPulling="2025-12-02 09:39:33.007896343 +0000 UTC m=+1135.831770212" lastFinishedPulling="2025-12-02 09:39:47.959274777 +0000 UTC m=+1150.783148656" observedRunningTime="2025-12-02 09:40:20.987795872 +0000 UTC m=+1183.811669751" watchObservedRunningTime="2025-12-02 09:40:21.059715106 +0000 UTC m=+1183.883588995" Dec 02 09:40:21 crc kubenswrapper[4781]: I1202 09:40:21.133910 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-qpdqt" podStartSLOduration=35.204483122 podStartE2EDuration="50.133890011s" podCreationTimestamp="2025-12-02 09:39:31 +0000 UTC" firstStartedPulling="2025-12-02 09:39:33.028950012 +0000 UTC m=+1135.852823891" lastFinishedPulling="2025-12-02 09:39:47.958356901 +0000 UTC m=+1150.782230780" observedRunningTime="2025-12-02 09:40:21.024469833 +0000 UTC m=+1183.848343712" watchObservedRunningTime="2025-12-02 09:40:21.133890011 +0000 UTC m=+1183.957763890" Dec 02 09:40:21 crc kubenswrapper[4781]: I1202 09:40:21.146309 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-pkkst" podStartSLOduration=5.418990776 podStartE2EDuration="50.146284586s" podCreationTimestamp="2025-12-02 09:39:31 +0000 UTC" firstStartedPulling="2025-12-02 09:39:33.299975319 +0000 UTC m=+1136.123849198" lastFinishedPulling="2025-12-02 09:40:18.027269129 +0000 UTC m=+1180.851143008" observedRunningTime="2025-12-02 09:40:21.064005672 +0000 UTC m=+1183.887879551" watchObservedRunningTime="2025-12-02 09:40:21.146284586 +0000 UTC m=+1183.970158465" Dec 02 09:40:21 crc kubenswrapper[4781]: I1202 09:40:21.148965 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-5lhcx" podStartSLOduration=4.014732059 podStartE2EDuration="50.148956019s" podCreationTimestamp="2025-12-02 09:39:31 +0000 UTC" firstStartedPulling="2025-12-02 09:39:33.272614879 +0000 UTC m=+1136.096488758" lastFinishedPulling="2025-12-02 09:40:19.406838839 +0000 UTC m=+1182.230712718" observedRunningTime="2025-12-02 09:40:21.095495214 +0000 UTC m=+1183.919369093" watchObservedRunningTime="2025-12-02 09:40:21.148956019 +0000 UTC m=+1183.972829898" Dec 02 09:40:21 crc kubenswrapper[4781]: I1202 09:40:21.161808 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-8mm6f" podStartSLOduration=3.94893618 podStartE2EDuration="50.161787945s" podCreationTimestamp="2025-12-02 09:39:31 +0000 UTC" firstStartedPulling="2025-12-02 09:39:33.177761245 +0000 UTC m=+1136.001635124" lastFinishedPulling="2025-12-02 09:40:19.39061301 +0000 UTC m=+1182.214486889" observedRunningTime="2025-12-02 09:40:21.134549589 +0000 UTC m=+1183.958423468" watchObservedRunningTime="2025-12-02 09:40:21.161787945 +0000 UTC m=+1183.985661824" Dec 02 09:40:21 crc kubenswrapper[4781]: I1202 09:40:21.186176 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-7q6gr" podStartSLOduration=35.054592491 podStartE2EDuration="50.186154654s" podCreationTimestamp="2025-12-02 09:39:31 +0000 UTC" firstStartedPulling="2025-12-02 09:39:32.825600786 +0000 UTC m=+1135.649474665" lastFinishedPulling="2025-12-02 09:39:47.957162949 +0000 UTC m=+1150.781036828" observedRunningTime="2025-12-02 09:40:21.164142579 +0000 UTC m=+1183.988016458" watchObservedRunningTime="2025-12-02 09:40:21.186154654 +0000 UTC m=+1184.010028543" Dec 02 09:40:21 crc kubenswrapper[4781]: I1202 09:40:21.250118 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-4dtmh" podStartSLOduration=5.521337612 podStartE2EDuration="50.250092972s" podCreationTimestamp="2025-12-02 09:39:31 +0000 UTC" firstStartedPulling="2025-12-02 09:39:33.300207865 +0000 UTC m=+1136.124081754" lastFinishedPulling="2025-12-02 09:40:18.028963215 +0000 UTC m=+1180.852837114" observedRunningTime="2025-12-02 09:40:21.202263879 +0000 UTC m=+1184.026137768" watchObservedRunningTime="2025-12-02 09:40:21.250092972 +0000 UTC m=+1184.073966851" Dec 02 09:40:21 crc kubenswrapper[4781]: I1202 09:40:21.275683 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-rf4bw" podStartSLOduration=4.175168246 podStartE2EDuration="50.275662904s" podCreationTimestamp="2025-12-02 09:39:31 +0000 UTC" firstStartedPulling="2025-12-02 09:39:33.299810754 +0000 UTC m=+1136.123684643" lastFinishedPulling="2025-12-02 09:40:19.400305422 +0000 UTC m=+1182.224179301" observedRunningTime="2025-12-02 09:40:21.243822793 +0000 UTC m=+1184.067696672" watchObservedRunningTime="2025-12-02 09:40:21.275662904 +0000 UTC m=+1184.099536783" Dec 02 09:40:21 crc kubenswrapper[4781]: I1202 09:40:21.307237 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-68mk9" podStartSLOduration=35.620218119 podStartE2EDuration="50.307217306s" podCreationTimestamp="2025-12-02 09:39:31 +0000 UTC" firstStartedPulling="2025-12-02 09:39:33.272416253 +0000 UTC m=+1136.096290132" lastFinishedPulling="2025-12-02 09:39:47.95941544 +0000 UTC m=+1150.783289319" observedRunningTime="2025-12-02 09:40:21.27664381 +0000 UTC m=+1184.100517689" watchObservedRunningTime="2025-12-02 09:40:21.307217306 +0000 UTC m=+1184.131091185" Dec 02 09:40:21 crc kubenswrapper[4781]: I1202 09:40:21.317510 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-dz7tw" podStartSLOduration=35.374620071 podStartE2EDuration="50.317492284s" podCreationTimestamp="2025-12-02 09:39:31 +0000 UTC" firstStartedPulling="2025-12-02 09:39:33.015490999 +0000 UTC m=+1135.839364878" lastFinishedPulling="2025-12-02 09:39:47.958363212 +0000 UTC m=+1150.782237091" observedRunningTime="2025-12-02 09:40:21.306389895 +0000 UTC m=+1184.130263774" watchObservedRunningTime="2025-12-02 09:40:21.317492284 +0000 UTC m=+1184.141366163" Dec 02 09:40:21 crc kubenswrapper[4781]: I1202 09:40:21.357241 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-8tkh8" podStartSLOduration=3.30922934 podStartE2EDuration="50.357195398s" podCreationTimestamp="2025-12-02 09:39:31 +0000 UTC" firstStartedPulling="2025-12-02 09:39:32.359340193 
+0000 UTC m=+1135.183214072" lastFinishedPulling="2025-12-02 09:40:19.407306251 +0000 UTC m=+1182.231180130" observedRunningTime="2025-12-02 09:40:21.33175346 +0000 UTC m=+1184.155627339" watchObservedRunningTime="2025-12-02 09:40:21.357195398 +0000 UTC m=+1184.181069277" Dec 02 09:40:21 crc kubenswrapper[4781]: I1202 09:40:21.364065 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-fq4g5" podStartSLOduration=5.238891749 podStartE2EDuration="50.364045293s" podCreationTimestamp="2025-12-02 09:39:31 +0000 UTC" firstStartedPulling="2025-12-02 09:39:33.330906615 +0000 UTC m=+1136.154780494" lastFinishedPulling="2025-12-02 09:40:18.456060149 +0000 UTC m=+1181.279934038" observedRunningTime="2025-12-02 09:40:21.353442286 +0000 UTC m=+1184.177316165" watchObservedRunningTime="2025-12-02 09:40:21.364045293 +0000 UTC m=+1184.187919172" Dec 02 09:40:22 crc kubenswrapper[4781]: I1202 09:40:22.065908 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-chbnm" event={"ID":"71cfd08b-278c-4f9c-b0fd-198c662ef00d","Type":"ContainerStarted","Data":"0043223d9870c9fb339290572d60dc6acdaae3110d3228e9c63a5ddcbdc3806b"} Dec 02 09:40:22 crc kubenswrapper[4781]: I1202 09:40:22.069233 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-chbnm" Dec 02 09:40:22 crc kubenswrapper[4781]: E1202 09:40:22.069832 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:14cfad6ea2e7f7ecc4cb2aafceb9c61514b3d04b66668832d1e4ac3b19f1ab81\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4rgl7t" podUID="cda1cc86-51ab-4070-96e9-98adba5d51c3" Dec 02 09:40:22 crc kubenswrapper[4781]: I1202 09:40:22.086284 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-chbnm" podStartSLOduration=2.560010588 podStartE2EDuration="51.086260485s" podCreationTimestamp="2025-12-02 09:39:31 +0000 UTC" firstStartedPulling="2025-12-02 09:39:32.988645003 +0000 UTC m=+1135.812518882" lastFinishedPulling="2025-12-02 09:40:21.5148949 +0000 UTC m=+1184.338768779" observedRunningTime="2025-12-02 09:40:22.080835658 +0000 UTC m=+1184.904709537" watchObservedRunningTime="2025-12-02 09:40:22.086260485 +0000 UTC m=+1184.910134384" Dec 02 09:40:23 crc kubenswrapper[4781]: I1202 09:40:23.086267 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-lwgs7" event={"ID":"cf3e832e-6140-4880-9efd-017837fc9990","Type":"ContainerStarted","Data":"5640a57019e5f6ee61765c67454d602fbef2986a401ccf37c1e2fbe45d841f5e"} Dec 02 09:40:23 crc kubenswrapper[4781]: I1202 09:40:23.086320 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-lwgs7" event={"ID":"cf3e832e-6140-4880-9efd-017837fc9990","Type":"ContainerStarted","Data":"9c35c9b887b0ed150a23e41a33dd3baa7e947248c6a9914116d42c7e31ccfa2c"} Dec 02 09:40:23 crc kubenswrapper[4781]: I1202 09:40:23.113036 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-57548d458d-lwgs7" 
Dec 02 09:40:23 crc kubenswrapper[4781]: I1202 09:40:23.284420 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-lwgs7"
Dec 02 09:40:30 crc kubenswrapper[4781]: I1202 09:40:30.412115 4781 patch_prober.go:28] interesting pod/machine-config-daemon-pzntm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 09:40:30 crc kubenswrapper[4781]: I1202 09:40:30.412875 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 09:40:31 crc kubenswrapper[4781]: I1202 09:40:31.459453 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gplp6"
Dec 02 09:40:31 crc kubenswrapper[4781]: E1202 09:40:31.503030 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7gsfd" podUID="8c93d2b8-28d1-4ea7-86aa-fe3f8ac35e61"
Dec 02 09:40:31 crc kubenswrapper[4781]: I1202 09:40:31.725562 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-chbnm"
Dec 02 09:40:31 crc kubenswrapper[4781]: I1202 09:40:31.751553 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-pkkst"
Dec 02 09:40:31 crc kubenswrapper[4781]: I1202 09:40:31.797293 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-fq4g5"
Dec 02 09:40:32 crc kubenswrapper[4781]: I1202 09:40:32.076487 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-4dtmh"
Dec 02 09:40:32 crc kubenswrapper[4781]: I1202 09:40:32.235186 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-xtks6"
Dec 02 09:40:33 crc kubenswrapper[4781]: I1202 09:40:33.290556 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-lwgs7"
Dec 02 09:40:35 crc kubenswrapper[4781]: E1202 09:40:35.500695 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-8xclz" podUID="c917a8ec-2bd5-4f7b-8948-a4bed859e01f"
Dec 02 09:40:39 crc kubenswrapper[4781]: I1202 09:40:39.192603 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4rgl7t" event={"ID":"cda1cc86-51ab-4070-96e9-98adba5d51c3","Type":"ContainerStarted","Data":"6214b1b9bda9b697e02bb884b8fe365e5dfc8d8a1fdf4f59a30b771234ff5364"}
Dec 02 09:40:39 crc kubenswrapper[4781]: I1202 09:40:39.193433 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4rgl7t"
Dec 02 09:40:39 crc kubenswrapper[4781]: I1202 09:40:39.218543 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4rgl7t" podStartSLOduration=18.342331831 podStartE2EDuration="1m8.218527446s" podCreationTimestamp="2025-12-02 09:39:31 +0000 UTC" firstStartedPulling="2025-12-02 09:39:48.432534719 +0000 UTC m=+1151.256408598" lastFinishedPulling="2025-12-02 09:40:38.308730334 +0000 UTC m=+1201.132604213" observedRunningTime="2025-12-02 09:40:39.215937086 +0000 UTC m=+1202.039810975" watchObservedRunningTime="2025-12-02 09:40:39.218527446 +0000 UTC m=+1202.042401325"
Dec 02 09:40:47 crc kubenswrapper[4781]: I1202 09:40:47.514892 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4rgl7t"
Dec 02 09:40:49 crc kubenswrapper[4781]: I1202 09:40:49.255572 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7gsfd" event={"ID":"8c93d2b8-28d1-4ea7-86aa-fe3f8ac35e61","Type":"ContainerStarted","Data":"8baed2b7ee48cdebb53614baca2b1013d3f4641bc5a599dfed9aebfba1f86657"}
Dec 02 09:40:49 crc kubenswrapper[4781]: I1202 09:40:49.270500 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7gsfd" podStartSLOduration=1.6106103360000001 podStartE2EDuration="1m17.270482256s" podCreationTimestamp="2025-12-02 09:39:32 +0000 UTC" firstStartedPulling="2025-12-02 09:39:33.33181567 +0000 UTC m=+1136.155689549" lastFinishedPulling="2025-12-02 09:40:48.9916876 +0000 UTC m=+1211.815561469" observedRunningTime="2025-12-02 09:40:49.268369748 +0000 UTC m=+1212.092243637" watchObservedRunningTime="2025-12-02 09:40:49.270482256 +0000 UTC m=+1212.094356135"
Dec 02 09:40:56 crc kubenswrapper[4781]: I1202 09:40:56.335034 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-8xclz" event={"ID":"c917a8ec-2bd5-4f7b-8948-a4bed859e01f","Type":"ContainerStarted","Data":"a698751e0d2e0af31ead58d64183dccc1b2d7df26d530ac83948e3fae797af8c"}
Dec 02 09:40:56 crc kubenswrapper[4781]: I1202 09:40:56.335655 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-8xclz"
Dec 02 09:40:56 crc kubenswrapper[4781]: I1202 09:40:56.353757 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-8xclz" podStartSLOduration=3.362568021 podStartE2EDuration="1m25.353742399s" podCreationTimestamp="2025-12-02 09:39:31 +0000 UTC" firstStartedPulling="2025-12-02 09:39:33.299875946 +0000 UTC m=+1136.123749825" lastFinishedPulling="2025-12-02 09:40:55.291050314 +0000 UTC m=+1218.114924203" observedRunningTime="2025-12-02 09:40:56.35159203 +0000 UTC m=+1219.175465919" watchObservedRunningTime="2025-12-02 09:40:56.353742399 +0000 UTC m=+1219.177616288"
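
Every timestamp above also carries an m=+... suffix, which is Go's monotonic clock reading in seconds, effectively counted from kubelet process start. Subtracting it from the wall-clock part recovers the same reference instant for every record in this log (about 09:20:37.176 UTC), which is a quick way to spot wall-clock jumps. A small sketch, using the values from the mariadb-operator observedRunningTime above; illustrative only:

package main

import (
	"fmt"
	"time"
)

func main() {
	// Wall-clock part and monotonic reading ("m=+...") copied from the record above.
	wall, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST",
		"2025-12-02 09:40:56.35159203 +0000 UTC")
	if err != nil {
		panic(err)
	}
	mono := 1219.175465919 // seconds since process start, from "m=+1219.175465919"
	start := wall.Add(-time.Duration(mono * float64(time.Second)))
	fmt.Println(start) // ≈ 2025-12-02 09:20:37.176 +0000 UTC
}
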
pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-8xclz" podStartSLOduration=3.362568021 podStartE2EDuration="1m25.353742399s" podCreationTimestamp="2025-12-02 09:39:31 +0000 UTC" firstStartedPulling="2025-12-02 09:39:33.299875946 +0000 UTC m=+1136.123749825" lastFinishedPulling="2025-12-02 09:40:55.291050314 +0000 UTC m=+1218.114924203" observedRunningTime="2025-12-02 09:40:56.35159203 +0000 UTC m=+1219.175465919" watchObservedRunningTime="2025-12-02 09:40:56.353742399 +0000 UTC m=+1219.177616288" Dec 02 09:41:00 crc kubenswrapper[4781]: I1202 09:41:00.412065 4781 patch_prober.go:28] interesting pod/machine-config-daemon-pzntm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 09:41:00 crc kubenswrapper[4781]: I1202 09:41:00.412360 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 09:41:00 crc kubenswrapper[4781]: I1202 09:41:00.412406 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" Dec 02 09:41:00 crc kubenswrapper[4781]: I1202 09:41:00.413155 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"065de47b4d4ea37363c4ad12a4f2a35f47f5c31ffb6b9b5188b40ed91f236ce5"} pod="openshift-machine-config-operator/machine-config-daemon-pzntm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 09:41:00 crc kubenswrapper[4781]: I1202 09:41:00.413225 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" containerID="cri-o://065de47b4d4ea37363c4ad12a4f2a35f47f5c31ffb6b9b5188b40ed91f236ce5" gracePeriod=600 Dec 02 09:41:01 crc kubenswrapper[4781]: I1202 09:41:01.795122 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-8xclz" Dec 02 09:41:02 crc kubenswrapper[4781]: I1202 09:41:02.377724 4781 generic.go:334] "Generic (PLEG): container finished" podID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerID="065de47b4d4ea37363c4ad12a4f2a35f47f5c31ffb6b9b5188b40ed91f236ce5" exitCode=0 Dec 02 09:41:02 crc kubenswrapper[4781]: I1202 09:41:02.377772 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" event={"ID":"e10258da-dad3-4df8-82c2-9d9438493a3d","Type":"ContainerDied","Data":"065de47b4d4ea37363c4ad12a4f2a35f47f5c31ffb6b9b5188b40ed91f236ce5"} Dec 02 09:41:02 crc kubenswrapper[4781]: I1202 09:41:02.377814 4781 scope.go:117] "RemoveContainer" containerID="4d8ba3ba707695a318e7a4765e850fbeffac37a0664ff39d2592ecb9000863ef" Dec 02 09:41:04 crc kubenswrapper[4781]: I1202 09:41:04.397804 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" 
event={"ID":"e10258da-dad3-4df8-82c2-9d9438493a3d","Type":"ContainerStarted","Data":"da048919704efdd3fa5ef85ad71184cb8e182089a953151e1c406db7fb59ab6e"} Dec 02 09:41:15 crc kubenswrapper[4781]: I1202 09:41:15.570668 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-hcnql"] Dec 02 09:41:15 crc kubenswrapper[4781]: I1202 09:41:15.573836 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-hcnql" Dec 02 09:41:15 crc kubenswrapper[4781]: I1202 09:41:15.576805 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 02 09:41:15 crc kubenswrapper[4781]: I1202 09:41:15.576889 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 02 09:41:15 crc kubenswrapper[4781]: I1202 09:41:15.577257 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 02 09:41:15 crc kubenswrapper[4781]: I1202 09:41:15.577694 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-jnpvn" Dec 02 09:41:15 crc kubenswrapper[4781]: I1202 09:41:15.583250 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-hcnql"] Dec 02 09:41:15 crc kubenswrapper[4781]: I1202 09:41:15.659806 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-rll7p"] Dec 02 09:41:15 crc kubenswrapper[4781]: I1202 09:41:15.661424 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-rll7p" Dec 02 09:41:15 crc kubenswrapper[4781]: I1202 09:41:15.663725 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 02 09:41:15 crc kubenswrapper[4781]: I1202 09:41:15.678046 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-rll7p"] Dec 02 09:41:15 crc kubenswrapper[4781]: I1202 09:41:15.726137 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f188f85e-b1c0-4a87-a057-16324e42a64a-config\") pod \"dnsmasq-dns-675f4bcbfc-hcnql\" (UID: \"f188f85e-b1c0-4a87-a057-16324e42a64a\") " pod="openstack/dnsmasq-dns-675f4bcbfc-hcnql" Dec 02 09:41:15 crc kubenswrapper[4781]: I1202 09:41:15.726222 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qts5m\" (UniqueName: \"kubernetes.io/projected/6dac8464-d82e-420d-93f4-d9c73ca4b209-kube-api-access-qts5m\") pod \"dnsmasq-dns-78dd6ddcc-rll7p\" (UID: \"6dac8464-d82e-420d-93f4-d9c73ca4b209\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rll7p" Dec 02 09:41:15 crc kubenswrapper[4781]: I1202 09:41:15.726250 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dac8464-d82e-420d-93f4-d9c73ca4b209-config\") pod \"dnsmasq-dns-78dd6ddcc-rll7p\" (UID: \"6dac8464-d82e-420d-93f4-d9c73ca4b209\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rll7p" Dec 02 09:41:15 crc kubenswrapper[4781]: I1202 09:41:15.726394 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grhhp\" (UniqueName: \"kubernetes.io/projected/f188f85e-b1c0-4a87-a057-16324e42a64a-kube-api-access-grhhp\") pod \"dnsmasq-dns-675f4bcbfc-hcnql\" (UID: 
\"f188f85e-b1c0-4a87-a057-16324e42a64a\") " pod="openstack/dnsmasq-dns-675f4bcbfc-hcnql" Dec 02 09:41:15 crc kubenswrapper[4781]: I1202 09:41:15.726468 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6dac8464-d82e-420d-93f4-d9c73ca4b209-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-rll7p\" (UID: \"6dac8464-d82e-420d-93f4-d9c73ca4b209\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rll7p" Dec 02 09:41:15 crc kubenswrapper[4781]: I1202 09:41:15.828233 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qts5m\" (UniqueName: \"kubernetes.io/projected/6dac8464-d82e-420d-93f4-d9c73ca4b209-kube-api-access-qts5m\") pod \"dnsmasq-dns-78dd6ddcc-rll7p\" (UID: \"6dac8464-d82e-420d-93f4-d9c73ca4b209\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rll7p" Dec 02 09:41:15 crc kubenswrapper[4781]: I1202 09:41:15.828289 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dac8464-d82e-420d-93f4-d9c73ca4b209-config\") pod \"dnsmasq-dns-78dd6ddcc-rll7p\" (UID: \"6dac8464-d82e-420d-93f4-d9c73ca4b209\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rll7p" Dec 02 09:41:15 crc kubenswrapper[4781]: I1202 09:41:15.828362 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grhhp\" (UniqueName: \"kubernetes.io/projected/f188f85e-b1c0-4a87-a057-16324e42a64a-kube-api-access-grhhp\") pod \"dnsmasq-dns-675f4bcbfc-hcnql\" (UID: \"f188f85e-b1c0-4a87-a057-16324e42a64a\") " pod="openstack/dnsmasq-dns-675f4bcbfc-hcnql" Dec 02 09:41:15 crc kubenswrapper[4781]: I1202 09:41:15.828404 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6dac8464-d82e-420d-93f4-d9c73ca4b209-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-rll7p\" (UID: \"6dac8464-d82e-420d-93f4-d9c73ca4b209\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rll7p" Dec 02 09:41:15 crc kubenswrapper[4781]: I1202 09:41:15.828448 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f188f85e-b1c0-4a87-a057-16324e42a64a-config\") pod \"dnsmasq-dns-675f4bcbfc-hcnql\" (UID: \"f188f85e-b1c0-4a87-a057-16324e42a64a\") " pod="openstack/dnsmasq-dns-675f4bcbfc-hcnql" Dec 02 09:41:15 crc kubenswrapper[4781]: I1202 09:41:15.829398 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dac8464-d82e-420d-93f4-d9c73ca4b209-config\") pod \"dnsmasq-dns-78dd6ddcc-rll7p\" (UID: \"6dac8464-d82e-420d-93f4-d9c73ca4b209\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rll7p" Dec 02 09:41:15 crc kubenswrapper[4781]: I1202 09:41:15.829443 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6dac8464-d82e-420d-93f4-d9c73ca4b209-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-rll7p\" (UID: \"6dac8464-d82e-420d-93f4-d9c73ca4b209\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rll7p" Dec 02 09:41:15 crc kubenswrapper[4781]: I1202 09:41:15.829581 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f188f85e-b1c0-4a87-a057-16324e42a64a-config\") pod \"dnsmasq-dns-675f4bcbfc-hcnql\" (UID: \"f188f85e-b1c0-4a87-a057-16324e42a64a\") " pod="openstack/dnsmasq-dns-675f4bcbfc-hcnql" Dec 02 09:41:15 crc kubenswrapper[4781]: I1202 
09:41:15.849343 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grhhp\" (UniqueName: \"kubernetes.io/projected/f188f85e-b1c0-4a87-a057-16324e42a64a-kube-api-access-grhhp\") pod \"dnsmasq-dns-675f4bcbfc-hcnql\" (UID: \"f188f85e-b1c0-4a87-a057-16324e42a64a\") " pod="openstack/dnsmasq-dns-675f4bcbfc-hcnql" Dec 02 09:41:15 crc kubenswrapper[4781]: I1202 09:41:15.849365 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qts5m\" (UniqueName: \"kubernetes.io/projected/6dac8464-d82e-420d-93f4-d9c73ca4b209-kube-api-access-qts5m\") pod \"dnsmasq-dns-78dd6ddcc-rll7p\" (UID: \"6dac8464-d82e-420d-93f4-d9c73ca4b209\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rll7p" Dec 02 09:41:15 crc kubenswrapper[4781]: I1202 09:41:15.894328 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-hcnql" Dec 02 09:41:15 crc kubenswrapper[4781]: I1202 09:41:15.975546 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-rll7p" Dec 02 09:41:16 crc kubenswrapper[4781]: I1202 09:41:16.337010 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-hcnql"] Dec 02 09:41:16 crc kubenswrapper[4781]: W1202 09:41:16.337136 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf188f85e_b1c0_4a87_a057_16324e42a64a.slice/crio-ffe7b06b5dc77b25c98d82a64525e929bca33ba0eb5c162b56beac22b52c6121 WatchSource:0}: Error finding container ffe7b06b5dc77b25c98d82a64525e929bca33ba0eb5c162b56beac22b52c6121: Status 404 returned error can't find the container with id ffe7b06b5dc77b25c98d82a64525e929bca33ba0eb5c162b56beac22b52c6121 Dec 02 09:41:16 crc kubenswrapper[4781]: I1202 09:41:16.434590 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-rll7p"] Dec 02 09:41:16 crc kubenswrapper[4781]: I1202 09:41:16.468741 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-rll7p" event={"ID":"6dac8464-d82e-420d-93f4-d9c73ca4b209","Type":"ContainerStarted","Data":"35b8cbeb611b4e57f45d9fe5796803b015a02b054200c3fe595802c8f5b8fca2"} Dec 02 09:41:16 crc kubenswrapper[4781]: I1202 09:41:16.469682 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-hcnql" event={"ID":"f188f85e-b1c0-4a87-a057-16324e42a64a","Type":"ContainerStarted","Data":"ffe7b06b5dc77b25c98d82a64525e929bca33ba0eb5c162b56beac22b52c6121"} Dec 02 09:41:18 crc kubenswrapper[4781]: I1202 09:41:18.821380 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-hcnql"] Dec 02 09:41:18 crc kubenswrapper[4781]: I1202 09:41:18.852076 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-bk67p"] Dec 02 09:41:18 crc kubenswrapper[4781]: I1202 09:41:18.853580 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-bk67p" Dec 02 09:41:18 crc kubenswrapper[4781]: I1202 09:41:18.912492 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-bk67p"] Dec 02 09:41:18 crc kubenswrapper[4781]: I1202 09:41:18.968397 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/88e0f2e3-8550-4e01-bbfa-d48d1b5add07-dns-svc\") pod \"dnsmasq-dns-666b6646f7-bk67p\" (UID: \"88e0f2e3-8550-4e01-bbfa-d48d1b5add07\") " pod="openstack/dnsmasq-dns-666b6646f7-bk67p" Dec 02 09:41:18 crc kubenswrapper[4781]: I1202 09:41:18.968465 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wl9cd\" (UniqueName: \"kubernetes.io/projected/88e0f2e3-8550-4e01-bbfa-d48d1b5add07-kube-api-access-wl9cd\") pod \"dnsmasq-dns-666b6646f7-bk67p\" (UID: \"88e0f2e3-8550-4e01-bbfa-d48d1b5add07\") " pod="openstack/dnsmasq-dns-666b6646f7-bk67p" Dec 02 09:41:18 crc kubenswrapper[4781]: I1202 09:41:18.968502 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88e0f2e3-8550-4e01-bbfa-d48d1b5add07-config\") pod \"dnsmasq-dns-666b6646f7-bk67p\" (UID: \"88e0f2e3-8550-4e01-bbfa-d48d1b5add07\") " pod="openstack/dnsmasq-dns-666b6646f7-bk67p" Dec 02 09:41:19 crc kubenswrapper[4781]: I1202 09:41:19.070614 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/88e0f2e3-8550-4e01-bbfa-d48d1b5add07-dns-svc\") pod \"dnsmasq-dns-666b6646f7-bk67p\" (UID: \"88e0f2e3-8550-4e01-bbfa-d48d1b5add07\") " pod="openstack/dnsmasq-dns-666b6646f7-bk67p" Dec 02 09:41:19 crc kubenswrapper[4781]: I1202 09:41:19.070688 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wl9cd\" (UniqueName: \"kubernetes.io/projected/88e0f2e3-8550-4e01-bbfa-d48d1b5add07-kube-api-access-wl9cd\") pod \"dnsmasq-dns-666b6646f7-bk67p\" (UID: \"88e0f2e3-8550-4e01-bbfa-d48d1b5add07\") " pod="openstack/dnsmasq-dns-666b6646f7-bk67p" Dec 02 09:41:19 crc kubenswrapper[4781]: I1202 09:41:19.070718 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88e0f2e3-8550-4e01-bbfa-d48d1b5add07-config\") pod \"dnsmasq-dns-666b6646f7-bk67p\" (UID: \"88e0f2e3-8550-4e01-bbfa-d48d1b5add07\") " pod="openstack/dnsmasq-dns-666b6646f7-bk67p" Dec 02 09:41:19 crc kubenswrapper[4781]: I1202 09:41:19.071681 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/88e0f2e3-8550-4e01-bbfa-d48d1b5add07-dns-svc\") pod \"dnsmasq-dns-666b6646f7-bk67p\" (UID: \"88e0f2e3-8550-4e01-bbfa-d48d1b5add07\") " pod="openstack/dnsmasq-dns-666b6646f7-bk67p" Dec 02 09:41:19 crc kubenswrapper[4781]: I1202 09:41:19.071685 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88e0f2e3-8550-4e01-bbfa-d48d1b5add07-config\") pod \"dnsmasq-dns-666b6646f7-bk67p\" (UID: \"88e0f2e3-8550-4e01-bbfa-d48d1b5add07\") " pod="openstack/dnsmasq-dns-666b6646f7-bk67p" Dec 02 09:41:19 crc kubenswrapper[4781]: I1202 09:41:19.105992 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-rll7p"] Dec 02 09:41:19 crc kubenswrapper[4781]: I1202 09:41:19.106032 
4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wl9cd\" (UniqueName: \"kubernetes.io/projected/88e0f2e3-8550-4e01-bbfa-d48d1b5add07-kube-api-access-wl9cd\") pod \"dnsmasq-dns-666b6646f7-bk67p\" (UID: \"88e0f2e3-8550-4e01-bbfa-d48d1b5add07\") " pod="openstack/dnsmasq-dns-666b6646f7-bk67p" Dec 02 09:41:19 crc kubenswrapper[4781]: I1202 09:41:19.151471 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-sz5wn"] Dec 02 09:41:19 crc kubenswrapper[4781]: I1202 09:41:19.152952 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-sz5wn" Dec 02 09:41:19 crc kubenswrapper[4781]: I1202 09:41:19.167970 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-sz5wn"] Dec 02 09:41:19 crc kubenswrapper[4781]: I1202 09:41:19.172871 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f80e657e-6f81-4250-90e9-c63f77fb03b2-config\") pod \"dnsmasq-dns-57d769cc4f-sz5wn\" (UID: \"f80e657e-6f81-4250-90e9-c63f77fb03b2\") " pod="openstack/dnsmasq-dns-57d769cc4f-sz5wn" Dec 02 09:41:19 crc kubenswrapper[4781]: I1202 09:41:19.173005 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr4jt\" (UniqueName: \"kubernetes.io/projected/f80e657e-6f81-4250-90e9-c63f77fb03b2-kube-api-access-vr4jt\") pod \"dnsmasq-dns-57d769cc4f-sz5wn\" (UID: \"f80e657e-6f81-4250-90e9-c63f77fb03b2\") " pod="openstack/dnsmasq-dns-57d769cc4f-sz5wn" Dec 02 09:41:19 crc kubenswrapper[4781]: I1202 09:41:19.173048 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f80e657e-6f81-4250-90e9-c63f77fb03b2-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-sz5wn\" (UID: \"f80e657e-6f81-4250-90e9-c63f77fb03b2\") " pod="openstack/dnsmasq-dns-57d769cc4f-sz5wn" Dec 02 09:41:19 crc kubenswrapper[4781]: I1202 09:41:19.201475 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-bk67p" Dec 02 09:41:19 crc kubenswrapper[4781]: I1202 09:41:19.274797 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vr4jt\" (UniqueName: \"kubernetes.io/projected/f80e657e-6f81-4250-90e9-c63f77fb03b2-kube-api-access-vr4jt\") pod \"dnsmasq-dns-57d769cc4f-sz5wn\" (UID: \"f80e657e-6f81-4250-90e9-c63f77fb03b2\") " pod="openstack/dnsmasq-dns-57d769cc4f-sz5wn" Dec 02 09:41:19 crc kubenswrapper[4781]: I1202 09:41:19.275147 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f80e657e-6f81-4250-90e9-c63f77fb03b2-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-sz5wn\" (UID: \"f80e657e-6f81-4250-90e9-c63f77fb03b2\") " pod="openstack/dnsmasq-dns-57d769cc4f-sz5wn" Dec 02 09:41:19 crc kubenswrapper[4781]: I1202 09:41:19.275203 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f80e657e-6f81-4250-90e9-c63f77fb03b2-config\") pod \"dnsmasq-dns-57d769cc4f-sz5wn\" (UID: \"f80e657e-6f81-4250-90e9-c63f77fb03b2\") " pod="openstack/dnsmasq-dns-57d769cc4f-sz5wn" Dec 02 09:41:19 crc kubenswrapper[4781]: I1202 09:41:19.276073 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f80e657e-6f81-4250-90e9-c63f77fb03b2-config\") pod \"dnsmasq-dns-57d769cc4f-sz5wn\" (UID: \"f80e657e-6f81-4250-90e9-c63f77fb03b2\") " pod="openstack/dnsmasq-dns-57d769cc4f-sz5wn" Dec 02 09:41:19 crc kubenswrapper[4781]: I1202 09:41:19.276601 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f80e657e-6f81-4250-90e9-c63f77fb03b2-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-sz5wn\" (UID: \"f80e657e-6f81-4250-90e9-c63f77fb03b2\") " pod="openstack/dnsmasq-dns-57d769cc4f-sz5wn" Dec 02 09:41:19 crc kubenswrapper[4781]: I1202 09:41:19.296710 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr4jt\" (UniqueName: \"kubernetes.io/projected/f80e657e-6f81-4250-90e9-c63f77fb03b2-kube-api-access-vr4jt\") pod \"dnsmasq-dns-57d769cc4f-sz5wn\" (UID: \"f80e657e-6f81-4250-90e9-c63f77fb03b2\") " pod="openstack/dnsmasq-dns-57d769cc4f-sz5wn" Dec 02 09:41:19 crc kubenswrapper[4781]: I1202 09:41:19.473522 4781 util.go:30] "No sandbox for pod can be found. 
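
The volume records above all share one klog shape: volume name, a UniqueName of the form plugin/podUID-volumeName, and the owning pod. An illustrative Go snippet (not kubelet code) that pulls those fields out of one MountVolume record copied verbatim, escaped quotes and all, from this log:

package main

import (
	"fmt"
	"regexp"
)

func main() {
	// One MountVolume.SetUp record from the dnsmasq-dns-57d769cc4f-sz5wn pod above.
	line := `operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f80e657e-6f81-4250-90e9-c63f77fb03b2-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-sz5wn\" (UID: \"f80e657e-6f81-4250-90e9-c63f77fb03b2\") `
	re := regexp.MustCompile(`for volume \\"([^"\\]+)\\" \(UniqueName: \\"([^"\\]+)\\"\) pod \\"([^"\\]+)\\"`)
	if m := re.FindStringSubmatch(line); m != nil {
		fmt.Printf("volume=%s\nuniqueName=%s\npod=%s\n", m[1], m[2], m[3])
	}
}
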
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-sz5wn" Dec 02 09:41:19 crc kubenswrapper[4781]: I1202 09:41:19.704987 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-bk67p"] Dec 02 09:41:19 crc kubenswrapper[4781]: I1202 09:41:19.963320 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-sz5wn"] Dec 02 09:41:19 crc kubenswrapper[4781]: W1202 09:41:19.967193 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf80e657e_6f81_4250_90e9_c63f77fb03b2.slice/crio-d48faf89f7c0c42ae72a1a4ac7132b2d5ec6ddf6c6cf7a94dfc12e33a50c4b8e WatchSource:0}: Error finding container d48faf89f7c0c42ae72a1a4ac7132b2d5ec6ddf6c6cf7a94dfc12e33a50c4b8e: Status 404 returned error can't find the container with id d48faf89f7c0c42ae72a1a4ac7132b2d5ec6ddf6c6cf7a94dfc12e33a50c4b8e Dec 02 09:41:19 crc kubenswrapper[4781]: I1202 09:41:19.997860 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 09:41:19 crc kubenswrapper[4781]: I1202 09:41:19.999156 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.001220 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.001437 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.001734 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.001764 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-6mkz6" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.001905 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.001950 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.008330 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.020485 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.187553 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9ef9f823-7456-4494-85c7-d29fcf35e7b5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\") " pod="openstack/rabbitmq-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.187594 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9ef9f823-7456-4494-85c7-d29fcf35e7b5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\") " pod="openstack/rabbitmq-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.187620 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/9ef9f823-7456-4494-85c7-d29fcf35e7b5-config-data\") pod \"rabbitmq-server-0\" (UID: \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\") " pod="openstack/rabbitmq-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.187644 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9ef9f823-7456-4494-85c7-d29fcf35e7b5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\") " pod="openstack/rabbitmq-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.187677 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9ef9f823-7456-4494-85c7-d29fcf35e7b5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\") " pod="openstack/rabbitmq-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.187707 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9ef9f823-7456-4494-85c7-d29fcf35e7b5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\") " pod="openstack/rabbitmq-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.187764 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\") " pod="openstack/rabbitmq-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.187795 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9ef9f823-7456-4494-85c7-d29fcf35e7b5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\") " pod="openstack/rabbitmq-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.187812 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkktl\" (UniqueName: \"kubernetes.io/projected/9ef9f823-7456-4494-85c7-d29fcf35e7b5-kube-api-access-qkktl\") pod \"rabbitmq-server-0\" (UID: \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\") " pod="openstack/rabbitmq-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.187827 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9ef9f823-7456-4494-85c7-d29fcf35e7b5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\") " pod="openstack/rabbitmq-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.187854 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9ef9f823-7456-4494-85c7-d29fcf35e7b5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\") " pod="openstack/rabbitmq-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.288812 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9ef9f823-7456-4494-85c7-d29fcf35e7b5-rabbitmq-confd\") pod 
\"rabbitmq-server-0\" (UID: \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\") " pod="openstack/rabbitmq-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.288880 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkktl\" (UniqueName: \"kubernetes.io/projected/9ef9f823-7456-4494-85c7-d29fcf35e7b5-kube-api-access-qkktl\") pod \"rabbitmq-server-0\" (UID: \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\") " pod="openstack/rabbitmq-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.289005 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9ef9f823-7456-4494-85c7-d29fcf35e7b5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\") " pod="openstack/rabbitmq-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.289878 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9ef9f823-7456-4494-85c7-d29fcf35e7b5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\") " pod="openstack/rabbitmq-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.290336 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9ef9f823-7456-4494-85c7-d29fcf35e7b5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\") " pod="openstack/rabbitmq-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.290385 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9ef9f823-7456-4494-85c7-d29fcf35e7b5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\") " pod="openstack/rabbitmq-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.290439 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9ef9f823-7456-4494-85c7-d29fcf35e7b5-config-data\") pod \"rabbitmq-server-0\" (UID: \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\") " pod="openstack/rabbitmq-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.290501 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9ef9f823-7456-4494-85c7-d29fcf35e7b5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\") " pod="openstack/rabbitmq-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.290593 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9ef9f823-7456-4494-85c7-d29fcf35e7b5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\") " pod="openstack/rabbitmq-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.290674 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9ef9f823-7456-4494-85c7-d29fcf35e7b5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\") " pod="openstack/rabbitmq-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.290882 4781 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\") " pod="openstack/rabbitmq-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.291157 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9ef9f823-7456-4494-85c7-d29fcf35e7b5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\") " pod="openstack/rabbitmq-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.291424 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.291950 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9ef9f823-7456-4494-85c7-d29fcf35e7b5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\") " pod="openstack/rabbitmq-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.291969 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9ef9f823-7456-4494-85c7-d29fcf35e7b5-config-data\") pod \"rabbitmq-server-0\" (UID: \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\") " pod="openstack/rabbitmq-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.292142 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9ef9f823-7456-4494-85c7-d29fcf35e7b5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\") " pod="openstack/rabbitmq-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.293414 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9ef9f823-7456-4494-85c7-d29fcf35e7b5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\") " pod="openstack/rabbitmq-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.295842 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9ef9f823-7456-4494-85c7-d29fcf35e7b5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\") " pod="openstack/rabbitmq-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.297120 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9ef9f823-7456-4494-85c7-d29fcf35e7b5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\") " pod="openstack/rabbitmq-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.299536 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9ef9f823-7456-4494-85c7-d29fcf35e7b5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\") " pod="openstack/rabbitmq-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.300052 4781 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9ef9f823-7456-4494-85c7-d29fcf35e7b5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\") " pod="openstack/rabbitmq-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.302543 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.305831 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.309184 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.309450 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-pk9hs" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.309683 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.309988 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.310456 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.310736 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.311041 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkktl\" (UniqueName: \"kubernetes.io/projected/9ef9f823-7456-4494-85c7-d29fcf35e7b5-kube-api-access-qkktl\") pod \"rabbitmq-server-0\" (UID: \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\") " pod="openstack/rabbitmq-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.311581 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.322292 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.344137 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\") " pod="openstack/rabbitmq-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.493523 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4d040259-d968-45a1-832a-45586a9fe0d1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4d040259-d968-45a1-832a-45586a9fe0d1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.493598 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4d040259-d968-45a1-832a-45586a9fe0d1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4d040259-d968-45a1-832a-45586a9fe0d1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 
09:41:20.493627 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4d040259-d968-45a1-832a-45586a9fe0d1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.493679 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4d040259-d968-45a1-832a-45586a9fe0d1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4d040259-d968-45a1-832a-45586a9fe0d1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.493810 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ndjp\" (UniqueName: \"kubernetes.io/projected/4d040259-d968-45a1-832a-45586a9fe0d1-kube-api-access-7ndjp\") pod \"rabbitmq-cell1-server-0\" (UID: \"4d040259-d968-45a1-832a-45586a9fe0d1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.493865 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4d040259-d968-45a1-832a-45586a9fe0d1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4d040259-d968-45a1-832a-45586a9fe0d1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.493990 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4d040259-d968-45a1-832a-45586a9fe0d1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4d040259-d968-45a1-832a-45586a9fe0d1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.494030 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4d040259-d968-45a1-832a-45586a9fe0d1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4d040259-d968-45a1-832a-45586a9fe0d1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.494061 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4d040259-d968-45a1-832a-45586a9fe0d1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4d040259-d968-45a1-832a-45586a9fe0d1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.494129 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4d040259-d968-45a1-832a-45586a9fe0d1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4d040259-d968-45a1-832a-45586a9fe0d1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.494167 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4d040259-d968-45a1-832a-45586a9fe0d1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4d040259-d968-45a1-832a-45586a9fe0d1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:41:20 crc 
kubenswrapper[4781]: I1202 09:41:20.497688 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-sz5wn" event={"ID":"f80e657e-6f81-4250-90e9-c63f77fb03b2","Type":"ContainerStarted","Data":"d48faf89f7c0c42ae72a1a4ac7132b2d5ec6ddf6c6cf7a94dfc12e33a50c4b8e"} Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.498658 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-bk67p" event={"ID":"88e0f2e3-8550-4e01-bbfa-d48d1b5add07","Type":"ContainerStarted","Data":"73f2dbda8c45276c80ba198242c11f5058e6e7ed7bb45a486d0b1630bb056240"} Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.597481 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4d040259-d968-45a1-832a-45586a9fe0d1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4d040259-d968-45a1-832a-45586a9fe0d1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.597539 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4d040259-d968-45a1-832a-45586a9fe0d1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4d040259-d968-45a1-832a-45586a9fe0d1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.597566 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4d040259-d968-45a1-832a-45586a9fe0d1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4d040259-d968-45a1-832a-45586a9fe0d1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.597583 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4d040259-d968-45a1-832a-45586a9fe0d1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.597601 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4d040259-d968-45a1-832a-45586a9fe0d1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4d040259-d968-45a1-832a-45586a9fe0d1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.597620 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4d040259-d968-45a1-832a-45586a9fe0d1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4d040259-d968-45a1-832a-45586a9fe0d1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.597645 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ndjp\" (UniqueName: \"kubernetes.io/projected/4d040259-d968-45a1-832a-45586a9fe0d1-kube-api-access-7ndjp\") pod \"rabbitmq-cell1-server-0\" (UID: \"4d040259-d968-45a1-832a-45586a9fe0d1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.597663 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4d040259-d968-45a1-832a-45586a9fe0d1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"4d040259-d968-45a1-832a-45586a9fe0d1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.597684 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4d040259-d968-45a1-832a-45586a9fe0d1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4d040259-d968-45a1-832a-45586a9fe0d1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.597709 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4d040259-d968-45a1-832a-45586a9fe0d1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4d040259-d968-45a1-832a-45586a9fe0d1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.597732 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4d040259-d968-45a1-832a-45586a9fe0d1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4d040259-d968-45a1-832a-45586a9fe0d1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.597887 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4d040259-d968-45a1-832a-45586a9fe0d1\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.598733 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4d040259-d968-45a1-832a-45586a9fe0d1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4d040259-d968-45a1-832a-45586a9fe0d1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.598770 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4d040259-d968-45a1-832a-45586a9fe0d1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4d040259-d968-45a1-832a-45586a9fe0d1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.598887 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4d040259-d968-45a1-832a-45586a9fe0d1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4d040259-d968-45a1-832a-45586a9fe0d1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.599013 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4d040259-d968-45a1-832a-45586a9fe0d1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4d040259-d968-45a1-832a-45586a9fe0d1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.599895 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4d040259-d968-45a1-832a-45586a9fe0d1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4d040259-d968-45a1-832a-45586a9fe0d1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.601188 4781 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4d040259-d968-45a1-832a-45586a9fe0d1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4d040259-d968-45a1-832a-45586a9fe0d1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.601457 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4d040259-d968-45a1-832a-45586a9fe0d1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4d040259-d968-45a1-832a-45586a9fe0d1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.602950 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4d040259-d968-45a1-832a-45586a9fe0d1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4d040259-d968-45a1-832a-45586a9fe0d1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.604296 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4d040259-d968-45a1-832a-45586a9fe0d1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4d040259-d968-45a1-832a-45586a9fe0d1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.621600 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ndjp\" (UniqueName: \"kubernetes.io/projected/4d040259-d968-45a1-832a-45586a9fe0d1-kube-api-access-7ndjp\") pod \"rabbitmq-cell1-server-0\" (UID: \"4d040259-d968-45a1-832a-45586a9fe0d1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.627135 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.628115 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4d040259-d968-45a1-832a-45586a9fe0d1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.698963 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:41:20 crc kubenswrapper[4781]: I1202 09:41:20.857912 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 09:41:20 crc kubenswrapper[4781]: W1202 09:41:20.872253 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ef9f823_7456_4494_85c7_d29fcf35e7b5.slice/crio-5adf3a424f5a9a7053218f386b18c9130d5bcd2f34ed07f4b5c637298b1a2ab0 WatchSource:0}: Error finding container 5adf3a424f5a9a7053218f386b18c9130d5bcd2f34ed07f4b5c637298b1a2ab0: Status 404 returned error can't find the container with id 5adf3a424f5a9a7053218f386b18c9130d5bcd2f34ed07f4b5c637298b1a2ab0 Dec 02 09:41:21 crc kubenswrapper[4781]: I1202 09:41:21.163320 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 09:41:21 crc kubenswrapper[4781]: W1202 09:41:21.164916 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d040259_d968_45a1_832a_45586a9fe0d1.slice/crio-32e4c1f22b9bb8c4ef6ee1fcd7631f1c7f54de11afbca44eac23fc817341db6d WatchSource:0}: Error finding container 32e4c1f22b9bb8c4ef6ee1fcd7631f1c7f54de11afbca44eac23fc817341db6d: Status 404 returned error can't find the container with id 32e4c1f22b9bb8c4ef6ee1fcd7631f1c7f54de11afbca44eac23fc817341db6d Dec 02 09:41:21 crc kubenswrapper[4781]: I1202 09:41:21.512481 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9ef9f823-7456-4494-85c7-d29fcf35e7b5","Type":"ContainerStarted","Data":"5adf3a424f5a9a7053218f386b18c9130d5bcd2f34ed07f4b5c637298b1a2ab0"} Dec 02 09:41:21 crc kubenswrapper[4781]: I1202 09:41:21.512540 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4d040259-d968-45a1-832a-45586a9fe0d1","Type":"ContainerStarted","Data":"32e4c1f22b9bb8c4ef6ee1fcd7631f1c7f54de11afbca44eac23fc817341db6d"} Dec 02 09:41:21 crc kubenswrapper[4781]: I1202 09:41:21.581837 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 02 09:41:21 crc kubenswrapper[4781]: I1202 09:41:21.584210 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 02 09:41:21 crc kubenswrapper[4781]: I1202 09:41:21.588080 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 02 09:41:21 crc kubenswrapper[4781]: I1202 09:41:21.588491 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-4ps6n" Dec 02 09:41:21 crc kubenswrapper[4781]: I1202 09:41:21.588615 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 02 09:41:21 crc kubenswrapper[4781]: I1202 09:41:21.588855 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 02 09:41:21 crc kubenswrapper[4781]: I1202 09:41:21.597319 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 02 09:41:21 crc kubenswrapper[4781]: I1202 09:41:21.598025 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 02 09:41:21 crc kubenswrapper[4781]: I1202 09:41:21.715008 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"39428d69-6b86-41ea-8c4f-5532a5283a91\") " pod="openstack/openstack-galera-0" Dec 02 09:41:21 crc kubenswrapper[4781]: I1202 09:41:21.715276 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/39428d69-6b86-41ea-8c4f-5532a5283a91-kolla-config\") pod \"openstack-galera-0\" (UID: \"39428d69-6b86-41ea-8c4f-5532a5283a91\") " pod="openstack/openstack-galera-0" Dec 02 09:41:21 crc kubenswrapper[4781]: I1202 09:41:21.716153 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56vn2\" (UniqueName: \"kubernetes.io/projected/39428d69-6b86-41ea-8c4f-5532a5283a91-kube-api-access-56vn2\") pod \"openstack-galera-0\" (UID: \"39428d69-6b86-41ea-8c4f-5532a5283a91\") " pod="openstack/openstack-galera-0" Dec 02 09:41:21 crc kubenswrapper[4781]: I1202 09:41:21.716265 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/39428d69-6b86-41ea-8c4f-5532a5283a91-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"39428d69-6b86-41ea-8c4f-5532a5283a91\") " pod="openstack/openstack-galera-0" Dec 02 09:41:21 crc kubenswrapper[4781]: I1202 09:41:21.716295 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/39428d69-6b86-41ea-8c4f-5532a5283a91-config-data-default\") pod \"openstack-galera-0\" (UID: \"39428d69-6b86-41ea-8c4f-5532a5283a91\") " pod="openstack/openstack-galera-0" Dec 02 09:41:21 crc kubenswrapper[4781]: I1202 09:41:21.716393 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/39428d69-6b86-41ea-8c4f-5532a5283a91-config-data-generated\") pod \"openstack-galera-0\" (UID: \"39428d69-6b86-41ea-8c4f-5532a5283a91\") " pod="openstack/openstack-galera-0" Dec 02 09:41:21 crc kubenswrapper[4781]: I1202 09:41:21.716489 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39428d69-6b86-41ea-8c4f-5532a5283a91-operator-scripts\") pod \"openstack-galera-0\" (UID: \"39428d69-6b86-41ea-8c4f-5532a5283a91\") " pod="openstack/openstack-galera-0" Dec 02 09:41:21 crc kubenswrapper[4781]: I1202 09:41:21.716593 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39428d69-6b86-41ea-8c4f-5532a5283a91-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"39428d69-6b86-41ea-8c4f-5532a5283a91\") " pod="openstack/openstack-galera-0" Dec 02 09:41:21 crc kubenswrapper[4781]: I1202 09:41:21.817418 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39428d69-6b86-41ea-8c4f-5532a5283a91-operator-scripts\") pod \"openstack-galera-0\" (UID: \"39428d69-6b86-41ea-8c4f-5532a5283a91\") " pod="openstack/openstack-galera-0" Dec 02 09:41:21 crc kubenswrapper[4781]: I1202 09:41:21.817478 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39428d69-6b86-41ea-8c4f-5532a5283a91-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"39428d69-6b86-41ea-8c4f-5532a5283a91\") " pod="openstack/openstack-galera-0" Dec 02 09:41:21 crc kubenswrapper[4781]: I1202 09:41:21.817505 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"39428d69-6b86-41ea-8c4f-5532a5283a91\") " pod="openstack/openstack-galera-0" Dec 02 09:41:21 crc kubenswrapper[4781]: I1202 09:41:21.817525 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/39428d69-6b86-41ea-8c4f-5532a5283a91-kolla-config\") pod \"openstack-galera-0\" (UID: \"39428d69-6b86-41ea-8c4f-5532a5283a91\") " pod="openstack/openstack-galera-0" Dec 02 09:41:21 crc kubenswrapper[4781]: I1202 09:41:21.817566 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56vn2\" (UniqueName: \"kubernetes.io/projected/39428d69-6b86-41ea-8c4f-5532a5283a91-kube-api-access-56vn2\") pod \"openstack-galera-0\" (UID: \"39428d69-6b86-41ea-8c4f-5532a5283a91\") " pod="openstack/openstack-galera-0" Dec 02 09:41:21 crc kubenswrapper[4781]: I1202 09:41:21.817615 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/39428d69-6b86-41ea-8c4f-5532a5283a91-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"39428d69-6b86-41ea-8c4f-5532a5283a91\") " pod="openstack/openstack-galera-0" Dec 02 09:41:21 crc kubenswrapper[4781]: I1202 09:41:21.817639 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/39428d69-6b86-41ea-8c4f-5532a5283a91-config-data-default\") pod \"openstack-galera-0\" (UID: \"39428d69-6b86-41ea-8c4f-5532a5283a91\") " pod="openstack/openstack-galera-0" Dec 02 09:41:21 crc kubenswrapper[4781]: I1202 09:41:21.817675 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/39428d69-6b86-41ea-8c4f-5532a5283a91-config-data-generated\") pod \"openstack-galera-0\" (UID: 
\"39428d69-6b86-41ea-8c4f-5532a5283a91\") " pod="openstack/openstack-galera-0" Dec 02 09:41:21 crc kubenswrapper[4781]: I1202 09:41:21.817870 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"39428d69-6b86-41ea-8c4f-5532a5283a91\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-galera-0" Dec 02 09:41:21 crc kubenswrapper[4781]: I1202 09:41:21.818041 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/39428d69-6b86-41ea-8c4f-5532a5283a91-config-data-generated\") pod \"openstack-galera-0\" (UID: \"39428d69-6b86-41ea-8c4f-5532a5283a91\") " pod="openstack/openstack-galera-0" Dec 02 09:41:21 crc kubenswrapper[4781]: I1202 09:41:21.818568 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/39428d69-6b86-41ea-8c4f-5532a5283a91-kolla-config\") pod \"openstack-galera-0\" (UID: \"39428d69-6b86-41ea-8c4f-5532a5283a91\") " pod="openstack/openstack-galera-0" Dec 02 09:41:21 crc kubenswrapper[4781]: I1202 09:41:21.819177 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/39428d69-6b86-41ea-8c4f-5532a5283a91-config-data-default\") pod \"openstack-galera-0\" (UID: \"39428d69-6b86-41ea-8c4f-5532a5283a91\") " pod="openstack/openstack-galera-0" Dec 02 09:41:21 crc kubenswrapper[4781]: I1202 09:41:21.824609 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/39428d69-6b86-41ea-8c4f-5532a5283a91-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"39428d69-6b86-41ea-8c4f-5532a5283a91\") " pod="openstack/openstack-galera-0" Dec 02 09:41:21 crc kubenswrapper[4781]: I1202 09:41:21.837782 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56vn2\" (UniqueName: \"kubernetes.io/projected/39428d69-6b86-41ea-8c4f-5532a5283a91-kube-api-access-56vn2\") pod \"openstack-galera-0\" (UID: \"39428d69-6b86-41ea-8c4f-5532a5283a91\") " pod="openstack/openstack-galera-0" Dec 02 09:41:21 crc kubenswrapper[4781]: I1202 09:41:21.838281 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39428d69-6b86-41ea-8c4f-5532a5283a91-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"39428d69-6b86-41ea-8c4f-5532a5283a91\") " pod="openstack/openstack-galera-0" Dec 02 09:41:21 crc kubenswrapper[4781]: I1202 09:41:21.847084 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"39428d69-6b86-41ea-8c4f-5532a5283a91\") " pod="openstack/openstack-galera-0" Dec 02 09:41:22 crc kubenswrapper[4781]: I1202 09:41:22.113029 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39428d69-6b86-41ea-8c4f-5532a5283a91-operator-scripts\") pod \"openstack-galera-0\" (UID: \"39428d69-6b86-41ea-8c4f-5532a5283a91\") " pod="openstack/openstack-galera-0" Dec 02 09:41:22 crc kubenswrapper[4781]: I1202 09:41:22.218678 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 02 09:41:22 crc kubenswrapper[4781]: I1202 09:41:22.552013 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.075936 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.077251 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.082115 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.082363 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.082419 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.082373 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-62jcw" Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.091885 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.236787 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/05cec06e-b8e4-487d-b4f8-0691aaf1f997-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"05cec06e-b8e4-487d-b4f8-0691aaf1f997\") " pod="openstack/openstack-cell1-galera-0" Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.236849 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/05cec06e-b8e4-487d-b4f8-0691aaf1f997-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"05cec06e-b8e4-487d-b4f8-0691aaf1f997\") " pod="openstack/openstack-cell1-galera-0" Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.236888 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05cec06e-b8e4-487d-b4f8-0691aaf1f997-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"05cec06e-b8e4-487d-b4f8-0691aaf1f997\") " pod="openstack/openstack-cell1-galera-0" Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.236936 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/05cec06e-b8e4-487d-b4f8-0691aaf1f997-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"05cec06e-b8e4-487d-b4f8-0691aaf1f997\") " pod="openstack/openstack-cell1-galera-0" Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.236959 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkm9k\" (UniqueName: \"kubernetes.io/projected/05cec06e-b8e4-487d-b4f8-0691aaf1f997-kube-api-access-wkm9k\") pod \"openstack-cell1-galera-0\" (UID: \"05cec06e-b8e4-487d-b4f8-0691aaf1f997\") " pod="openstack/openstack-cell1-galera-0" Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.237020 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/05cec06e-b8e4-487d-b4f8-0691aaf1f997-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"05cec06e-b8e4-487d-b4f8-0691aaf1f997\") " pod="openstack/openstack-cell1-galera-0" Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.237046 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05cec06e-b8e4-487d-b4f8-0691aaf1f997-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"05cec06e-b8e4-487d-b4f8-0691aaf1f997\") " pod="openstack/openstack-cell1-galera-0" Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.237140 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"05cec06e-b8e4-487d-b4f8-0691aaf1f997\") " pod="openstack/openstack-cell1-galera-0" Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.342599 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/05cec06e-b8e4-487d-b4f8-0691aaf1f997-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"05cec06e-b8e4-487d-b4f8-0691aaf1f997\") " pod="openstack/openstack-cell1-galera-0" Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.342661 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05cec06e-b8e4-487d-b4f8-0691aaf1f997-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"05cec06e-b8e4-487d-b4f8-0691aaf1f997\") " pod="openstack/openstack-cell1-galera-0" Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.342703 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"05cec06e-b8e4-487d-b4f8-0691aaf1f997\") " pod="openstack/openstack-cell1-galera-0" Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.342746 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/05cec06e-b8e4-487d-b4f8-0691aaf1f997-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"05cec06e-b8e4-487d-b4f8-0691aaf1f997\") " pod="openstack/openstack-cell1-galera-0" Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.342767 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/05cec06e-b8e4-487d-b4f8-0691aaf1f997-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"05cec06e-b8e4-487d-b4f8-0691aaf1f997\") " pod="openstack/openstack-cell1-galera-0" Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.342795 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05cec06e-b8e4-487d-b4f8-0691aaf1f997-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"05cec06e-b8e4-487d-b4f8-0691aaf1f997\") " pod="openstack/openstack-cell1-galera-0" Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.342818 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/05cec06e-b8e4-487d-b4f8-0691aaf1f997-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"05cec06e-b8e4-487d-b4f8-0691aaf1f997\") " pod="openstack/openstack-cell1-galera-0" Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.342839 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkm9k\" (UniqueName: \"kubernetes.io/projected/05cec06e-b8e4-487d-b4f8-0691aaf1f997-kube-api-access-wkm9k\") pod \"openstack-cell1-galera-0\" (UID: \"05cec06e-b8e4-487d-b4f8-0691aaf1f997\") " pod="openstack/openstack-cell1-galera-0" Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.343942 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/05cec06e-b8e4-487d-b4f8-0691aaf1f997-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"05cec06e-b8e4-487d-b4f8-0691aaf1f997\") " pod="openstack/openstack-cell1-galera-0" Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.344947 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05cec06e-b8e4-487d-b4f8-0691aaf1f997-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"05cec06e-b8e4-487d-b4f8-0691aaf1f997\") " pod="openstack/openstack-cell1-galera-0" Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.345195 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"05cec06e-b8e4-487d-b4f8-0691aaf1f997\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-cell1-galera-0" Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.345250 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/05cec06e-b8e4-487d-b4f8-0691aaf1f997-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"05cec06e-b8e4-487d-b4f8-0691aaf1f997\") " pod="openstack/openstack-cell1-galera-0" Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.345639 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/05cec06e-b8e4-487d-b4f8-0691aaf1f997-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"05cec06e-b8e4-487d-b4f8-0691aaf1f997\") " pod="openstack/openstack-cell1-galera-0" Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.357735 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/05cec06e-b8e4-487d-b4f8-0691aaf1f997-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"05cec06e-b8e4-487d-b4f8-0691aaf1f997\") " pod="openstack/openstack-cell1-galera-0" Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.357781 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05cec06e-b8e4-487d-b4f8-0691aaf1f997-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"05cec06e-b8e4-487d-b4f8-0691aaf1f997\") " pod="openstack/openstack-cell1-galera-0" Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.361914 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkm9k\" (UniqueName: \"kubernetes.io/projected/05cec06e-b8e4-487d-b4f8-0691aaf1f997-kube-api-access-wkm9k\") pod 
\"openstack-cell1-galera-0\" (UID: \"05cec06e-b8e4-487d-b4f8-0691aaf1f997\") " pod="openstack/openstack-cell1-galera-0" Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.379054 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"05cec06e-b8e4-487d-b4f8-0691aaf1f997\") " pod="openstack/openstack-cell1-galera-0" Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.397711 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.457002 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.458086 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.468548 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.468784 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-knxvm" Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.469039 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.484338 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.546613 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fd68438-412a-4745-9e59-f4c9374f2444-memcached-tls-certs\") pod \"memcached-0\" (UID: \"2fd68438-412a-4745-9e59-f4c9374f2444\") " pod="openstack/memcached-0" Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.546661 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fd68438-412a-4745-9e59-f4c9374f2444-combined-ca-bundle\") pod \"memcached-0\" (UID: \"2fd68438-412a-4745-9e59-f4c9374f2444\") " pod="openstack/memcached-0" Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.546713 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk268\" (UniqueName: \"kubernetes.io/projected/2fd68438-412a-4745-9e59-f4c9374f2444-kube-api-access-kk268\") pod \"memcached-0\" (UID: \"2fd68438-412a-4745-9e59-f4c9374f2444\") " pod="openstack/memcached-0" Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.546757 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2fd68438-412a-4745-9e59-f4c9374f2444-config-data\") pod \"memcached-0\" (UID: \"2fd68438-412a-4745-9e59-f4c9374f2444\") " pod="openstack/memcached-0" Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.546794 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2fd68438-412a-4745-9e59-f4c9374f2444-kolla-config\") pod \"memcached-0\" (UID: \"2fd68438-412a-4745-9e59-f4c9374f2444\") " pod="openstack/memcached-0" Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 
09:41:23.552329 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"39428d69-6b86-41ea-8c4f-5532a5283a91","Type":"ContainerStarted","Data":"4bcffdd741c32ac08851aa8a855c3a412847866e0a4e00eb320beeff2426bf2b"} Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.647749 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2fd68438-412a-4745-9e59-f4c9374f2444-config-data\") pod \"memcached-0\" (UID: \"2fd68438-412a-4745-9e59-f4c9374f2444\") " pod="openstack/memcached-0" Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.647814 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2fd68438-412a-4745-9e59-f4c9374f2444-kolla-config\") pod \"memcached-0\" (UID: \"2fd68438-412a-4745-9e59-f4c9374f2444\") " pod="openstack/memcached-0" Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.647868 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fd68438-412a-4745-9e59-f4c9374f2444-memcached-tls-certs\") pod \"memcached-0\" (UID: \"2fd68438-412a-4745-9e59-f4c9374f2444\") " pod="openstack/memcached-0" Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.647889 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fd68438-412a-4745-9e59-f4c9374f2444-combined-ca-bundle\") pod \"memcached-0\" (UID: \"2fd68438-412a-4745-9e59-f4c9374f2444\") " pod="openstack/memcached-0" Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.647939 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kk268\" (UniqueName: \"kubernetes.io/projected/2fd68438-412a-4745-9e59-f4c9374f2444-kube-api-access-kk268\") pod \"memcached-0\" (UID: \"2fd68438-412a-4745-9e59-f4c9374f2444\") " pod="openstack/memcached-0" Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.654392 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2fd68438-412a-4745-9e59-f4c9374f2444-kolla-config\") pod \"memcached-0\" (UID: \"2fd68438-412a-4745-9e59-f4c9374f2444\") " pod="openstack/memcached-0" Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.654837 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2fd68438-412a-4745-9e59-f4c9374f2444-config-data\") pod \"memcached-0\" (UID: \"2fd68438-412a-4745-9e59-f4c9374f2444\") " pod="openstack/memcached-0" Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.657276 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fd68438-412a-4745-9e59-f4c9374f2444-combined-ca-bundle\") pod \"memcached-0\" (UID: \"2fd68438-412a-4745-9e59-f4c9374f2444\") " pod="openstack/memcached-0" Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.662653 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fd68438-412a-4745-9e59-f4c9374f2444-memcached-tls-certs\") pod \"memcached-0\" (UID: \"2fd68438-412a-4745-9e59-f4c9374f2444\") " pod="openstack/memcached-0" Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.689147 4781 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-kk268\" (UniqueName: \"kubernetes.io/projected/2fd68438-412a-4745-9e59-f4c9374f2444-kube-api-access-kk268\") pod \"memcached-0\" (UID: \"2fd68438-412a-4745-9e59-f4c9374f2444\") " pod="openstack/memcached-0" Dec 02 09:41:23 crc kubenswrapper[4781]: I1202 09:41:23.852588 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 02 09:41:24 crc kubenswrapper[4781]: I1202 09:41:24.290500 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 02 09:41:24 crc kubenswrapper[4781]: W1202 09:41:24.298798 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05cec06e_b8e4_487d_b4f8_0691aaf1f997.slice/crio-dae303f384dad02cab83ec2dccbd118d981ca3b9c70b7e2f89ddb96475d9e76b WatchSource:0}: Error finding container dae303f384dad02cab83ec2dccbd118d981ca3b9c70b7e2f89ddb96475d9e76b: Status 404 returned error can't find the container with id dae303f384dad02cab83ec2dccbd118d981ca3b9c70b7e2f89ddb96475d9e76b Dec 02 09:41:24 crc kubenswrapper[4781]: I1202 09:41:24.583250 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"05cec06e-b8e4-487d-b4f8-0691aaf1f997","Type":"ContainerStarted","Data":"dae303f384dad02cab83ec2dccbd118d981ca3b9c70b7e2f89ddb96475d9e76b"} Dec 02 09:41:24 crc kubenswrapper[4781]: I1202 09:41:24.667137 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 02 09:41:25 crc kubenswrapper[4781]: I1202 09:41:25.481737 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 09:41:25 crc kubenswrapper[4781]: I1202 09:41:25.482893 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 02 09:41:25 crc kubenswrapper[4781]: I1202 09:41:25.487046 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-996x9" Dec 02 09:41:25 crc kubenswrapper[4781]: I1202 09:41:25.500604 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 09:41:25 crc kubenswrapper[4781]: I1202 09:41:25.625099 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"2fd68438-412a-4745-9e59-f4c9374f2444","Type":"ContainerStarted","Data":"d63005cce7a089efe9774dcac42d6337c617caf3627601d577185c2738524022"} Dec 02 09:41:25 crc kubenswrapper[4781]: I1202 09:41:25.634091 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld4pf\" (UniqueName: \"kubernetes.io/projected/284c6c26-76ca-4800-b40f-51528de0c015-kube-api-access-ld4pf\") pod \"kube-state-metrics-0\" (UID: \"284c6c26-76ca-4800-b40f-51528de0c015\") " pod="openstack/kube-state-metrics-0" Dec 02 09:41:25 crc kubenswrapper[4781]: I1202 09:41:25.735208 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ld4pf\" (UniqueName: \"kubernetes.io/projected/284c6c26-76ca-4800-b40f-51528de0c015-kube-api-access-ld4pf\") pod \"kube-state-metrics-0\" (UID: \"284c6c26-76ca-4800-b40f-51528de0c015\") " pod="openstack/kube-state-metrics-0" Dec 02 09:41:25 crc kubenswrapper[4781]: I1202 09:41:25.761439 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld4pf\" (UniqueName: \"kubernetes.io/projected/284c6c26-76ca-4800-b40f-51528de0c015-kube-api-access-ld4pf\") pod \"kube-state-metrics-0\" (UID: \"284c6c26-76ca-4800-b40f-51528de0c015\") " pod="openstack/kube-state-metrics-0" Dec 02 09:41:25 crc kubenswrapper[4781]: I1202 09:41:25.804962 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 02 09:41:27 crc kubenswrapper[4781]: I1202 09:41:27.122733 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 09:41:27 crc kubenswrapper[4781]: I1202 09:41:27.775316 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"284c6c26-76ca-4800-b40f-51528de0c015","Type":"ContainerStarted","Data":"27c6e4ded22a7b7f93cac6065ac88dcb766b42ab9f275e7c2e564b1f0ef2b18d"} Dec 02 09:41:29 crc kubenswrapper[4781]: I1202 09:41:29.879144 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-mmhrv"] Dec 02 09:41:29 crc kubenswrapper[4781]: I1202 09:41:29.885737 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-mmhrv" Dec 02 09:41:29 crc kubenswrapper[4781]: I1202 09:41:29.888272 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 02 09:41:29 crc kubenswrapper[4781]: I1202 09:41:29.888604 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-nmvqv" Dec 02 09:41:29 crc kubenswrapper[4781]: I1202 09:41:29.892949 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 02 09:41:29 crc kubenswrapper[4781]: I1202 09:41:29.893025 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mmhrv"] Dec 02 09:41:29 crc kubenswrapper[4781]: I1202 09:41:29.929313 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-rh5wv"] Dec 02 09:41:29 crc kubenswrapper[4781]: I1202 09:41:29.930755 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-rh5wv" Dec 02 09:41:29 crc kubenswrapper[4781]: I1202 09:41:29.944692 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-rh5wv"] Dec 02 09:41:29 crc kubenswrapper[4781]: I1202 09:41:29.956419 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8w8w\" (UniqueName: \"kubernetes.io/projected/7cf924dd-8243-4263-85a2-68ac01fd5346-kube-api-access-l8w8w\") pod \"ovn-controller-mmhrv\" (UID: \"7cf924dd-8243-4263-85a2-68ac01fd5346\") " pod="openstack/ovn-controller-mmhrv" Dec 02 09:41:29 crc kubenswrapper[4781]: I1202 09:41:29.956464 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/77d1da39-d5b0-4ec8-8196-f4f1025291f8-scripts\") pod \"ovn-controller-ovs-rh5wv\" (UID: \"77d1da39-d5b0-4ec8-8196-f4f1025291f8\") " pod="openstack/ovn-controller-ovs-rh5wv" Dec 02 09:41:29 crc kubenswrapper[4781]: I1202 09:41:29.956486 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cf924dd-8243-4263-85a2-68ac01fd5346-ovn-controller-tls-certs\") pod \"ovn-controller-mmhrv\" (UID: \"7cf924dd-8243-4263-85a2-68ac01fd5346\") " pod="openstack/ovn-controller-mmhrv" Dec 02 09:41:29 crc kubenswrapper[4781]: I1202 09:41:29.956507 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/77d1da39-d5b0-4ec8-8196-f4f1025291f8-var-run\") pod \"ovn-controller-ovs-rh5wv\" (UID: \"77d1da39-d5b0-4ec8-8196-f4f1025291f8\") " pod="openstack/ovn-controller-ovs-rh5wv" Dec 02 09:41:29 crc kubenswrapper[4781]: I1202 09:41:29.956532 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/77d1da39-d5b0-4ec8-8196-f4f1025291f8-var-log\") pod \"ovn-controller-ovs-rh5wv\" (UID: \"77d1da39-d5b0-4ec8-8196-f4f1025291f8\") " pod="openstack/ovn-controller-ovs-rh5wv" Dec 02 09:41:29 crc kubenswrapper[4781]: I1202 09:41:29.956944 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/77d1da39-d5b0-4ec8-8196-f4f1025291f8-var-lib\") pod \"ovn-controller-ovs-rh5wv\" (UID: 
\"77d1da39-d5b0-4ec8-8196-f4f1025291f8\") " pod="openstack/ovn-controller-ovs-rh5wv" Dec 02 09:41:29 crc kubenswrapper[4781]: I1202 09:41:29.957073 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7cf924dd-8243-4263-85a2-68ac01fd5346-var-run-ovn\") pod \"ovn-controller-mmhrv\" (UID: \"7cf924dd-8243-4263-85a2-68ac01fd5346\") " pod="openstack/ovn-controller-mmhrv" Dec 02 09:41:29 crc kubenswrapper[4781]: I1202 09:41:29.957161 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7cf924dd-8243-4263-85a2-68ac01fd5346-var-run\") pod \"ovn-controller-mmhrv\" (UID: \"7cf924dd-8243-4263-85a2-68ac01fd5346\") " pod="openstack/ovn-controller-mmhrv" Dec 02 09:41:29 crc kubenswrapper[4781]: I1202 09:41:29.957189 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cf924dd-8243-4263-85a2-68ac01fd5346-combined-ca-bundle\") pod \"ovn-controller-mmhrv\" (UID: \"7cf924dd-8243-4263-85a2-68ac01fd5346\") " pod="openstack/ovn-controller-mmhrv" Dec 02 09:41:29 crc kubenswrapper[4781]: I1202 09:41:29.957271 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq8x9\" (UniqueName: \"kubernetes.io/projected/77d1da39-d5b0-4ec8-8196-f4f1025291f8-kube-api-access-nq8x9\") pod \"ovn-controller-ovs-rh5wv\" (UID: \"77d1da39-d5b0-4ec8-8196-f4f1025291f8\") " pod="openstack/ovn-controller-ovs-rh5wv" Dec 02 09:41:29 crc kubenswrapper[4781]: I1202 09:41:29.957395 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7cf924dd-8243-4263-85a2-68ac01fd5346-var-log-ovn\") pod \"ovn-controller-mmhrv\" (UID: \"7cf924dd-8243-4263-85a2-68ac01fd5346\") " pod="openstack/ovn-controller-mmhrv" Dec 02 09:41:29 crc kubenswrapper[4781]: I1202 09:41:29.957443 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/77d1da39-d5b0-4ec8-8196-f4f1025291f8-etc-ovs\") pod \"ovn-controller-ovs-rh5wv\" (UID: \"77d1da39-d5b0-4ec8-8196-f4f1025291f8\") " pod="openstack/ovn-controller-ovs-rh5wv" Dec 02 09:41:29 crc kubenswrapper[4781]: I1202 09:41:29.957471 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7cf924dd-8243-4263-85a2-68ac01fd5346-scripts\") pod \"ovn-controller-mmhrv\" (UID: \"7cf924dd-8243-4263-85a2-68ac01fd5346\") " pod="openstack/ovn-controller-mmhrv" Dec 02 09:41:30 crc kubenswrapper[4781]: I1202 09:41:30.062263 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7cf924dd-8243-4263-85a2-68ac01fd5346-var-run\") pod \"ovn-controller-mmhrv\" (UID: \"7cf924dd-8243-4263-85a2-68ac01fd5346\") " pod="openstack/ovn-controller-mmhrv" Dec 02 09:41:30 crc kubenswrapper[4781]: I1202 09:41:30.062322 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cf924dd-8243-4263-85a2-68ac01fd5346-combined-ca-bundle\") pod \"ovn-controller-mmhrv\" (UID: \"7cf924dd-8243-4263-85a2-68ac01fd5346\") " pod="openstack/ovn-controller-mmhrv" Dec 02 
09:41:30 crc kubenswrapper[4781]: I1202 09:41:30.062366 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nq8x9\" (UniqueName: \"kubernetes.io/projected/77d1da39-d5b0-4ec8-8196-f4f1025291f8-kube-api-access-nq8x9\") pod \"ovn-controller-ovs-rh5wv\" (UID: \"77d1da39-d5b0-4ec8-8196-f4f1025291f8\") " pod="openstack/ovn-controller-ovs-rh5wv" Dec 02 09:41:30 crc kubenswrapper[4781]: I1202 09:41:30.062389 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7cf924dd-8243-4263-85a2-68ac01fd5346-var-log-ovn\") pod \"ovn-controller-mmhrv\" (UID: \"7cf924dd-8243-4263-85a2-68ac01fd5346\") " pod="openstack/ovn-controller-mmhrv" Dec 02 09:41:30 crc kubenswrapper[4781]: I1202 09:41:30.062409 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/77d1da39-d5b0-4ec8-8196-f4f1025291f8-etc-ovs\") pod \"ovn-controller-ovs-rh5wv\" (UID: \"77d1da39-d5b0-4ec8-8196-f4f1025291f8\") " pod="openstack/ovn-controller-ovs-rh5wv" Dec 02 09:41:30 crc kubenswrapper[4781]: I1202 09:41:30.062428 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7cf924dd-8243-4263-85a2-68ac01fd5346-scripts\") pod \"ovn-controller-mmhrv\" (UID: \"7cf924dd-8243-4263-85a2-68ac01fd5346\") " pod="openstack/ovn-controller-mmhrv" Dec 02 09:41:30 crc kubenswrapper[4781]: I1202 09:41:30.062450 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8w8w\" (UniqueName: \"kubernetes.io/projected/7cf924dd-8243-4263-85a2-68ac01fd5346-kube-api-access-l8w8w\") pod \"ovn-controller-mmhrv\" (UID: \"7cf924dd-8243-4263-85a2-68ac01fd5346\") " pod="openstack/ovn-controller-mmhrv" Dec 02 09:41:30 crc kubenswrapper[4781]: I1202 09:41:30.062473 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/77d1da39-d5b0-4ec8-8196-f4f1025291f8-scripts\") pod \"ovn-controller-ovs-rh5wv\" (UID: \"77d1da39-d5b0-4ec8-8196-f4f1025291f8\") " pod="openstack/ovn-controller-ovs-rh5wv" Dec 02 09:41:30 crc kubenswrapper[4781]: I1202 09:41:30.062489 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cf924dd-8243-4263-85a2-68ac01fd5346-ovn-controller-tls-certs\") pod \"ovn-controller-mmhrv\" (UID: \"7cf924dd-8243-4263-85a2-68ac01fd5346\") " pod="openstack/ovn-controller-mmhrv" Dec 02 09:41:30 crc kubenswrapper[4781]: I1202 09:41:30.062509 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/77d1da39-d5b0-4ec8-8196-f4f1025291f8-var-run\") pod \"ovn-controller-ovs-rh5wv\" (UID: \"77d1da39-d5b0-4ec8-8196-f4f1025291f8\") " pod="openstack/ovn-controller-ovs-rh5wv" Dec 02 09:41:30 crc kubenswrapper[4781]: I1202 09:41:30.062538 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/77d1da39-d5b0-4ec8-8196-f4f1025291f8-var-log\") pod \"ovn-controller-ovs-rh5wv\" (UID: \"77d1da39-d5b0-4ec8-8196-f4f1025291f8\") " pod="openstack/ovn-controller-ovs-rh5wv" Dec 02 09:41:30 crc kubenswrapper[4781]: I1202 09:41:30.062556 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: 
\"kubernetes.io/host-path/77d1da39-d5b0-4ec8-8196-f4f1025291f8-var-lib\") pod \"ovn-controller-ovs-rh5wv\" (UID: \"77d1da39-d5b0-4ec8-8196-f4f1025291f8\") " pod="openstack/ovn-controller-ovs-rh5wv" Dec 02 09:41:30 crc kubenswrapper[4781]: I1202 09:41:30.062610 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7cf924dd-8243-4263-85a2-68ac01fd5346-var-run-ovn\") pod \"ovn-controller-mmhrv\" (UID: \"7cf924dd-8243-4263-85a2-68ac01fd5346\") " pod="openstack/ovn-controller-mmhrv" Dec 02 09:41:30 crc kubenswrapper[4781]: I1202 09:41:30.062981 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7cf924dd-8243-4263-85a2-68ac01fd5346-var-run-ovn\") pod \"ovn-controller-mmhrv\" (UID: \"7cf924dd-8243-4263-85a2-68ac01fd5346\") " pod="openstack/ovn-controller-mmhrv" Dec 02 09:41:30 crc kubenswrapper[4781]: I1202 09:41:30.063040 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7cf924dd-8243-4263-85a2-68ac01fd5346-var-run\") pod \"ovn-controller-mmhrv\" (UID: \"7cf924dd-8243-4263-85a2-68ac01fd5346\") " pod="openstack/ovn-controller-mmhrv" Dec 02 09:41:30 crc kubenswrapper[4781]: I1202 09:41:30.063991 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/77d1da39-d5b0-4ec8-8196-f4f1025291f8-var-run\") pod \"ovn-controller-ovs-rh5wv\" (UID: \"77d1da39-d5b0-4ec8-8196-f4f1025291f8\") " pod="openstack/ovn-controller-ovs-rh5wv" Dec 02 09:41:30 crc kubenswrapper[4781]: I1202 09:41:30.065269 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7cf924dd-8243-4263-85a2-68ac01fd5346-var-log-ovn\") pod \"ovn-controller-mmhrv\" (UID: \"7cf924dd-8243-4263-85a2-68ac01fd5346\") " pod="openstack/ovn-controller-mmhrv" Dec 02 09:41:30 crc kubenswrapper[4781]: I1202 09:41:30.065455 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/77d1da39-d5b0-4ec8-8196-f4f1025291f8-etc-ovs\") pod \"ovn-controller-ovs-rh5wv\" (UID: \"77d1da39-d5b0-4ec8-8196-f4f1025291f8\") " pod="openstack/ovn-controller-ovs-rh5wv" Dec 02 09:41:30 crc kubenswrapper[4781]: I1202 09:41:30.065535 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/77d1da39-d5b0-4ec8-8196-f4f1025291f8-var-log\") pod \"ovn-controller-ovs-rh5wv\" (UID: \"77d1da39-d5b0-4ec8-8196-f4f1025291f8\") " pod="openstack/ovn-controller-ovs-rh5wv" Dec 02 09:41:30 crc kubenswrapper[4781]: I1202 09:41:30.066216 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/77d1da39-d5b0-4ec8-8196-f4f1025291f8-scripts\") pod \"ovn-controller-ovs-rh5wv\" (UID: \"77d1da39-d5b0-4ec8-8196-f4f1025291f8\") " pod="openstack/ovn-controller-ovs-rh5wv" Dec 02 09:41:30 crc kubenswrapper[4781]: I1202 09:41:30.066395 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/77d1da39-d5b0-4ec8-8196-f4f1025291f8-var-lib\") pod \"ovn-controller-ovs-rh5wv\" (UID: \"77d1da39-d5b0-4ec8-8196-f4f1025291f8\") " pod="openstack/ovn-controller-ovs-rh5wv" Dec 02 09:41:30 crc kubenswrapper[4781]: I1202 09:41:30.070714 4781 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cf924dd-8243-4263-85a2-68ac01fd5346-ovn-controller-tls-certs\") pod \"ovn-controller-mmhrv\" (UID: \"7cf924dd-8243-4263-85a2-68ac01fd5346\") " pod="openstack/ovn-controller-mmhrv" Dec 02 09:41:30 crc kubenswrapper[4781]: I1202 09:41:30.090897 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7cf924dd-8243-4263-85a2-68ac01fd5346-scripts\") pod \"ovn-controller-mmhrv\" (UID: \"7cf924dd-8243-4263-85a2-68ac01fd5346\") " pod="openstack/ovn-controller-mmhrv" Dec 02 09:41:30 crc kubenswrapper[4781]: I1202 09:41:30.117843 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cf924dd-8243-4263-85a2-68ac01fd5346-combined-ca-bundle\") pod \"ovn-controller-mmhrv\" (UID: \"7cf924dd-8243-4263-85a2-68ac01fd5346\") " pod="openstack/ovn-controller-mmhrv" Dec 02 09:41:30 crc kubenswrapper[4781]: I1202 09:41:30.127616 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8w8w\" (UniqueName: \"kubernetes.io/projected/7cf924dd-8243-4263-85a2-68ac01fd5346-kube-api-access-l8w8w\") pod \"ovn-controller-mmhrv\" (UID: \"7cf924dd-8243-4263-85a2-68ac01fd5346\") " pod="openstack/ovn-controller-mmhrv" Dec 02 09:41:30 crc kubenswrapper[4781]: I1202 09:41:30.127695 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq8x9\" (UniqueName: \"kubernetes.io/projected/77d1da39-d5b0-4ec8-8196-f4f1025291f8-kube-api-access-nq8x9\") pod \"ovn-controller-ovs-rh5wv\" (UID: \"77d1da39-d5b0-4ec8-8196-f4f1025291f8\") " pod="openstack/ovn-controller-ovs-rh5wv" Dec 02 09:41:30 crc kubenswrapper[4781]: I1202 09:41:30.229117 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mmhrv" Dec 02 09:41:30 crc kubenswrapper[4781]: I1202 09:41:30.262397 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-rh5wv" Dec 02 09:41:31 crc kubenswrapper[4781]: I1202 09:41:31.102299 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mmhrv"] Dec 02 09:41:31 crc kubenswrapper[4781]: I1202 09:41:31.383713 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-rh5wv"] Dec 02 09:41:31 crc kubenswrapper[4781]: I1202 09:41:31.972578 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-7c2h7"] Dec 02 09:41:31 crc kubenswrapper[4781]: I1202 09:41:31.973861 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-7c2h7" Dec 02 09:41:31 crc kubenswrapper[4781]: I1202 09:41:31.975867 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-7c2h7"] Dec 02 09:41:31 crc kubenswrapper[4781]: I1202 09:41:31.977425 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 02 09:41:31 crc kubenswrapper[4781]: I1202 09:41:31.977830 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 02 09:41:32 crc kubenswrapper[4781]: I1202 09:41:32.111074 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/7bc2040b-e9d5-4a1a-9d46-6b50dbc71061-ovn-rundir\") pod \"ovn-controller-metrics-7c2h7\" (UID: \"7bc2040b-e9d5-4a1a-9d46-6b50dbc71061\") " pod="openstack/ovn-controller-metrics-7c2h7" Dec 02 09:41:32 crc kubenswrapper[4781]: I1202 09:41:32.111143 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bc2040b-e9d5-4a1a-9d46-6b50dbc71061-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-7c2h7\" (UID: \"7bc2040b-e9d5-4a1a-9d46-6b50dbc71061\") " pod="openstack/ovn-controller-metrics-7c2h7" Dec 02 09:41:32 crc kubenswrapper[4781]: I1202 09:41:32.111180 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bc2040b-e9d5-4a1a-9d46-6b50dbc71061-config\") pod \"ovn-controller-metrics-7c2h7\" (UID: \"7bc2040b-e9d5-4a1a-9d46-6b50dbc71061\") " pod="openstack/ovn-controller-metrics-7c2h7" Dec 02 09:41:32 crc kubenswrapper[4781]: I1202 09:41:32.111230 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npfh5\" (UniqueName: \"kubernetes.io/projected/7bc2040b-e9d5-4a1a-9d46-6b50dbc71061-kube-api-access-npfh5\") pod \"ovn-controller-metrics-7c2h7\" (UID: \"7bc2040b-e9d5-4a1a-9d46-6b50dbc71061\") " pod="openstack/ovn-controller-metrics-7c2h7" Dec 02 09:41:32 crc kubenswrapper[4781]: I1202 09:41:32.111256 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bc2040b-e9d5-4a1a-9d46-6b50dbc71061-combined-ca-bundle\") pod \"ovn-controller-metrics-7c2h7\" (UID: \"7bc2040b-e9d5-4a1a-9d46-6b50dbc71061\") " pod="openstack/ovn-controller-metrics-7c2h7" Dec 02 09:41:32 crc kubenswrapper[4781]: I1202 09:41:32.111294 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/7bc2040b-e9d5-4a1a-9d46-6b50dbc71061-ovs-rundir\") pod \"ovn-controller-metrics-7c2h7\" (UID: \"7bc2040b-e9d5-4a1a-9d46-6b50dbc71061\") " pod="openstack/ovn-controller-metrics-7c2h7" Dec 02 09:41:32 crc kubenswrapper[4781]: I1202 09:41:32.212759 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/7bc2040b-e9d5-4a1a-9d46-6b50dbc71061-ovn-rundir\") pod \"ovn-controller-metrics-7c2h7\" (UID: \"7bc2040b-e9d5-4a1a-9d46-6b50dbc71061\") " pod="openstack/ovn-controller-metrics-7c2h7" Dec 02 09:41:32 crc kubenswrapper[4781]: I1202 09:41:32.212817 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bc2040b-e9d5-4a1a-9d46-6b50dbc71061-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-7c2h7\" (UID: \"7bc2040b-e9d5-4a1a-9d46-6b50dbc71061\") " pod="openstack/ovn-controller-metrics-7c2h7" Dec 02 09:41:32 crc kubenswrapper[4781]: I1202 09:41:32.212843 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bc2040b-e9d5-4a1a-9d46-6b50dbc71061-config\") pod \"ovn-controller-metrics-7c2h7\" (UID: \"7bc2040b-e9d5-4a1a-9d46-6b50dbc71061\") " pod="openstack/ovn-controller-metrics-7c2h7" Dec 02 09:41:32 crc kubenswrapper[4781]: I1202 09:41:32.212880 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npfh5\" (UniqueName: \"kubernetes.io/projected/7bc2040b-e9d5-4a1a-9d46-6b50dbc71061-kube-api-access-npfh5\") pod \"ovn-controller-metrics-7c2h7\" (UID: \"7bc2040b-e9d5-4a1a-9d46-6b50dbc71061\") " pod="openstack/ovn-controller-metrics-7c2h7" Dec 02 09:41:32 crc kubenswrapper[4781]: I1202 09:41:32.212897 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bc2040b-e9d5-4a1a-9d46-6b50dbc71061-combined-ca-bundle\") pod \"ovn-controller-metrics-7c2h7\" (UID: \"7bc2040b-e9d5-4a1a-9d46-6b50dbc71061\") " pod="openstack/ovn-controller-metrics-7c2h7" Dec 02 09:41:32 crc kubenswrapper[4781]: I1202 09:41:32.212935 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/7bc2040b-e9d5-4a1a-9d46-6b50dbc71061-ovs-rundir\") pod \"ovn-controller-metrics-7c2h7\" (UID: \"7bc2040b-e9d5-4a1a-9d46-6b50dbc71061\") " pod="openstack/ovn-controller-metrics-7c2h7" Dec 02 09:41:32 crc kubenswrapper[4781]: I1202 09:41:32.213224 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/7bc2040b-e9d5-4a1a-9d46-6b50dbc71061-ovs-rundir\") pod \"ovn-controller-metrics-7c2h7\" (UID: \"7bc2040b-e9d5-4a1a-9d46-6b50dbc71061\") " pod="openstack/ovn-controller-metrics-7c2h7" Dec 02 09:41:32 crc kubenswrapper[4781]: I1202 09:41:32.213275 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/7bc2040b-e9d5-4a1a-9d46-6b50dbc71061-ovn-rundir\") pod \"ovn-controller-metrics-7c2h7\" (UID: \"7bc2040b-e9d5-4a1a-9d46-6b50dbc71061\") " pod="openstack/ovn-controller-metrics-7c2h7" Dec 02 09:41:32 crc kubenswrapper[4781]: I1202 09:41:32.214990 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bc2040b-e9d5-4a1a-9d46-6b50dbc71061-config\") pod \"ovn-controller-metrics-7c2h7\" (UID: \"7bc2040b-e9d5-4a1a-9d46-6b50dbc71061\") " pod="openstack/ovn-controller-metrics-7c2h7" Dec 02 09:41:32 crc kubenswrapper[4781]: I1202 09:41:32.221841 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bc2040b-e9d5-4a1a-9d46-6b50dbc71061-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-7c2h7\" (UID: \"7bc2040b-e9d5-4a1a-9d46-6b50dbc71061\") " pod="openstack/ovn-controller-metrics-7c2h7" Dec 02 09:41:32 crc kubenswrapper[4781]: I1202 09:41:32.233917 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7bc2040b-e9d5-4a1a-9d46-6b50dbc71061-combined-ca-bundle\") pod \"ovn-controller-metrics-7c2h7\" (UID: \"7bc2040b-e9d5-4a1a-9d46-6b50dbc71061\") " pod="openstack/ovn-controller-metrics-7c2h7" Dec 02 09:41:32 crc kubenswrapper[4781]: I1202 09:41:32.236538 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npfh5\" (UniqueName: \"kubernetes.io/projected/7bc2040b-e9d5-4a1a-9d46-6b50dbc71061-kube-api-access-npfh5\") pod \"ovn-controller-metrics-7c2h7\" (UID: \"7bc2040b-e9d5-4a1a-9d46-6b50dbc71061\") " pod="openstack/ovn-controller-metrics-7c2h7" Dec 02 09:41:32 crc kubenswrapper[4781]: I1202 09:41:32.295114 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-7c2h7" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.137841 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.139386 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.143119 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.143404 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.154840 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-6x7h9" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.155200 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.182715 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.241519 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"6dea0cb6-7707-46ba-bd47-89ce579fdad9\") " pod="openstack/ovsdbserver-nb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.241565 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dea0cb6-7707-46ba-bd47-89ce579fdad9-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6dea0cb6-7707-46ba-bd47-89ce579fdad9\") " pod="openstack/ovsdbserver-nb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.241616 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6dea0cb6-7707-46ba-bd47-89ce579fdad9-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"6dea0cb6-7707-46ba-bd47-89ce579fdad9\") " pod="openstack/ovsdbserver-nb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.241633 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dea0cb6-7707-46ba-bd47-89ce579fdad9-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6dea0cb6-7707-46ba-bd47-89ce579fdad9\") " pod="openstack/ovsdbserver-nb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 
09:41:33.241660 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rczjj\" (UniqueName: \"kubernetes.io/projected/6dea0cb6-7707-46ba-bd47-89ce579fdad9-kube-api-access-rczjj\") pod \"ovsdbserver-nb-0\" (UID: \"6dea0cb6-7707-46ba-bd47-89ce579fdad9\") " pod="openstack/ovsdbserver-nb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.241680 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dea0cb6-7707-46ba-bd47-89ce579fdad9-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"6dea0cb6-7707-46ba-bd47-89ce579fdad9\") " pod="openstack/ovsdbserver-nb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.241694 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dea0cb6-7707-46ba-bd47-89ce579fdad9-config\") pod \"ovsdbserver-nb-0\" (UID: \"6dea0cb6-7707-46ba-bd47-89ce579fdad9\") " pod="openstack/ovsdbserver-nb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.241709 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6dea0cb6-7707-46ba-bd47-89ce579fdad9-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"6dea0cb6-7707-46ba-bd47-89ce579fdad9\") " pod="openstack/ovsdbserver-nb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.346307 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6dea0cb6-7707-46ba-bd47-89ce579fdad9-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"6dea0cb6-7707-46ba-bd47-89ce579fdad9\") " pod="openstack/ovsdbserver-nb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.346369 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dea0cb6-7707-46ba-bd47-89ce579fdad9-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6dea0cb6-7707-46ba-bd47-89ce579fdad9\") " pod="openstack/ovsdbserver-nb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.346436 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczjj\" (UniqueName: \"kubernetes.io/projected/6dea0cb6-7707-46ba-bd47-89ce579fdad9-kube-api-access-rczjj\") pod \"ovsdbserver-nb-0\" (UID: \"6dea0cb6-7707-46ba-bd47-89ce579fdad9\") " pod="openstack/ovsdbserver-nb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.346467 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dea0cb6-7707-46ba-bd47-89ce579fdad9-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"6dea0cb6-7707-46ba-bd47-89ce579fdad9\") " pod="openstack/ovsdbserver-nb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.346501 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dea0cb6-7707-46ba-bd47-89ce579fdad9-config\") pod \"ovsdbserver-nb-0\" (UID: \"6dea0cb6-7707-46ba-bd47-89ce579fdad9\") " pod="openstack/ovsdbserver-nb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.346519 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/6dea0cb6-7707-46ba-bd47-89ce579fdad9-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"6dea0cb6-7707-46ba-bd47-89ce579fdad9\") " pod="openstack/ovsdbserver-nb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.346634 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"6dea0cb6-7707-46ba-bd47-89ce579fdad9\") " pod="openstack/ovsdbserver-nb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.346690 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dea0cb6-7707-46ba-bd47-89ce579fdad9-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6dea0cb6-7707-46ba-bd47-89ce579fdad9\") " pod="openstack/ovsdbserver-nb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.348627 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.350851 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.352562 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6dea0cb6-7707-46ba-bd47-89ce579fdad9-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"6dea0cb6-7707-46ba-bd47-89ce579fdad9\") " pod="openstack/ovsdbserver-nb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.354117 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6dea0cb6-7707-46ba-bd47-89ce579fdad9-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"6dea0cb6-7707-46ba-bd47-89ce579fdad9\") " pod="openstack/ovsdbserver-nb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.354390 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"6dea0cb6-7707-46ba-bd47-89ce579fdad9\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-nb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.354397 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dea0cb6-7707-46ba-bd47-89ce579fdad9-config\") pod \"ovsdbserver-nb-0\" (UID: \"6dea0cb6-7707-46ba-bd47-89ce579fdad9\") " pod="openstack/ovsdbserver-nb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.362660 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-mk9zf" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.362978 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.363110 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.363212 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.364372 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6dea0cb6-7707-46ba-bd47-89ce579fdad9-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6dea0cb6-7707-46ba-bd47-89ce579fdad9\") " pod="openstack/ovsdbserver-nb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.366240 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.372077 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dea0cb6-7707-46ba-bd47-89ce579fdad9-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"6dea0cb6-7707-46ba-bd47-89ce579fdad9\") " pod="openstack/ovsdbserver-nb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.377873 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dea0cb6-7707-46ba-bd47-89ce579fdad9-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6dea0cb6-7707-46ba-bd47-89ce579fdad9\") " pod="openstack/ovsdbserver-nb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.395944 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczjj\" (UniqueName: \"kubernetes.io/projected/6dea0cb6-7707-46ba-bd47-89ce579fdad9-kube-api-access-rczjj\") pod \"ovsdbserver-nb-0\" (UID: \"6dea0cb6-7707-46ba-bd47-89ce579fdad9\") " pod="openstack/ovsdbserver-nb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.405479 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"6dea0cb6-7707-46ba-bd47-89ce579fdad9\") " pod="openstack/ovsdbserver-nb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.474993 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.553982 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c135676-b0d9-469f-82b2-59483c9712f1-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8c135676-b0d9-469f-82b2-59483c9712f1\") " pod="openstack/ovsdbserver-sb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.554062 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8c135676-b0d9-469f-82b2-59483c9712f1\") " pod="openstack/ovsdbserver-sb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.554086 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8c135676-b0d9-469f-82b2-59483c9712f1-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8c135676-b0d9-469f-82b2-59483c9712f1\") " pod="openstack/ovsdbserver-sb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.554111 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c135676-b0d9-469f-82b2-59483c9712f1-config\") pod \"ovsdbserver-sb-0\" (UID: \"8c135676-b0d9-469f-82b2-59483c9712f1\") " pod="openstack/ovsdbserver-sb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.554142 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c135676-b0d9-469f-82b2-59483c9712f1-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8c135676-b0d9-469f-82b2-59483c9712f1\") " pod="openstack/ovsdbserver-sb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.554159 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c135676-b0d9-469f-82b2-59483c9712f1-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8c135676-b0d9-469f-82b2-59483c9712f1\") " pod="openstack/ovsdbserver-sb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.554188 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpbjj\" (UniqueName: \"kubernetes.io/projected/8c135676-b0d9-469f-82b2-59483c9712f1-kube-api-access-cpbjj\") pod \"ovsdbserver-sb-0\" (UID: \"8c135676-b0d9-469f-82b2-59483c9712f1\") " pod="openstack/ovsdbserver-sb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.554220 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c135676-b0d9-469f-82b2-59483c9712f1-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8c135676-b0d9-469f-82b2-59483c9712f1\") " pod="openstack/ovsdbserver-sb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.656065 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c135676-b0d9-469f-82b2-59483c9712f1-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8c135676-b0d9-469f-82b2-59483c9712f1\") " pod="openstack/ovsdbserver-sb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.656868 4781 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c135676-b0d9-469f-82b2-59483c9712f1-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8c135676-b0d9-469f-82b2-59483c9712f1\") " pod="openstack/ovsdbserver-sb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.657166 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8c135676-b0d9-469f-82b2-59483c9712f1\") " pod="openstack/ovsdbserver-sb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.657228 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8c135676-b0d9-469f-82b2-59483c9712f1-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8c135676-b0d9-469f-82b2-59483c9712f1\") " pod="openstack/ovsdbserver-sb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.657254 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c135676-b0d9-469f-82b2-59483c9712f1-config\") pod \"ovsdbserver-sb-0\" (UID: \"8c135676-b0d9-469f-82b2-59483c9712f1\") " pod="openstack/ovsdbserver-sb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.657350 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c135676-b0d9-469f-82b2-59483c9712f1-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8c135676-b0d9-469f-82b2-59483c9712f1\") " pod="openstack/ovsdbserver-sb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.657386 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c135676-b0d9-469f-82b2-59483c9712f1-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8c135676-b0d9-469f-82b2-59483c9712f1\") " pod="openstack/ovsdbserver-sb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.657461 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpbjj\" (UniqueName: \"kubernetes.io/projected/8c135676-b0d9-469f-82b2-59483c9712f1-kube-api-access-cpbjj\") pod \"ovsdbserver-sb-0\" (UID: \"8c135676-b0d9-469f-82b2-59483c9712f1\") " pod="openstack/ovsdbserver-sb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.657979 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8c135676-b0d9-469f-82b2-59483c9712f1\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/ovsdbserver-sb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.658272 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8c135676-b0d9-469f-82b2-59483c9712f1-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8c135676-b0d9-469f-82b2-59483c9712f1\") " pod="openstack/ovsdbserver-sb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.658426 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c135676-b0d9-469f-82b2-59483c9712f1-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8c135676-b0d9-469f-82b2-59483c9712f1\") " 
pod="openstack/ovsdbserver-sb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.658435 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c135676-b0d9-469f-82b2-59483c9712f1-config\") pod \"ovsdbserver-sb-0\" (UID: \"8c135676-b0d9-469f-82b2-59483c9712f1\") " pod="openstack/ovsdbserver-sb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.664280 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c135676-b0d9-469f-82b2-59483c9712f1-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8c135676-b0d9-469f-82b2-59483c9712f1\") " pod="openstack/ovsdbserver-sb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.665481 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c135676-b0d9-469f-82b2-59483c9712f1-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8c135676-b0d9-469f-82b2-59483c9712f1\") " pod="openstack/ovsdbserver-sb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.673795 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpbjj\" (UniqueName: \"kubernetes.io/projected/8c135676-b0d9-469f-82b2-59483c9712f1-kube-api-access-cpbjj\") pod \"ovsdbserver-sb-0\" (UID: \"8c135676-b0d9-469f-82b2-59483c9712f1\") " pod="openstack/ovsdbserver-sb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.686360 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c135676-b0d9-469f-82b2-59483c9712f1-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8c135676-b0d9-469f-82b2-59483c9712f1\") " pod="openstack/ovsdbserver-sb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.705814 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8c135676-b0d9-469f-82b2-59483c9712f1\") " pod="openstack/ovsdbserver-sb-0" Dec 02 09:41:33 crc kubenswrapper[4781]: I1202 09:41:33.776202 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 02 09:41:43 crc kubenswrapper[4781]: I1202 09:41:43.026675 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mmhrv" event={"ID":"7cf924dd-8243-4263-85a2-68ac01fd5346","Type":"ContainerStarted","Data":"82daf9e93c6d43edd3f4cec42f4ad53ab1ebdb14fb76a49e092d93654fb5f60f"} Dec 02 09:41:43 crc kubenswrapper[4781]: I1202 09:41:43.028152 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-rh5wv" event={"ID":"77d1da39-d5b0-4ec8-8196-f4f1025291f8","Type":"ContainerStarted","Data":"c1edb824702576470fc23d22fb635585735edd608168aa4aa5124d57fa68f7b1"} Dec 02 09:41:54 crc kubenswrapper[4781]: E1202 09:41:54.903429 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 02 09:41:54 crc kubenswrapper[4781]: E1202 09:41:54.904041 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qts5m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-rll7p_openstack(6dac8464-d82e-420d-93f4-d9c73ca4b209): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 09:41:54 crc kubenswrapper[4781]: E1202 09:41:54.905220 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: 
\"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-rll7p" podUID="6dac8464-d82e-420d-93f4-d9c73ca4b209" Dec 02 09:41:56 crc kubenswrapper[4781]: E1202 09:41:56.707636 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Dec 02 09:41:56 crc kubenswrapper[4781]: E1202 09:41:56.708379 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wkm9k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-cell1-galera-0_openstack(05cec06e-b8e4-487d-b4f8-0691aaf1f997): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 09:41:56 crc kubenswrapper[4781]: E1202 09:41:56.709500 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-cell1-galera-0" podUID="05cec06e-b8e4-487d-b4f8-0691aaf1f997" Dec 02 09:41:57 crc kubenswrapper[4781]: E1202 09:41:57.138015 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-cell1-galera-0" podUID="05cec06e-b8e4-487d-b4f8-0691aaf1f997" Dec 02 
09:42:13 crc kubenswrapper[4781]: E1202 09:42:13.956132 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Dec 02 09:42:13 crc kubenswrapper[4781]: E1202 09:42:13.956880 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-56vn2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(39428d69-6b86-41ea-8c4f-5532a5283a91): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 09:42:13 crc kubenswrapper[4781]: E1202 09:42:13.958328 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="39428d69-6b86-41ea-8c4f-5532a5283a91" Dec 02 09:42:13 crc kubenswrapper[4781]: E1202 09:42:13.964577 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Dec 02 09:42:13 crc kubenswrapper[4781]: E1202 09:42:13.964744 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie 
/var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7ndjp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(4d040259-d968-45a1-832a-45586a9fe0d1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 09:42:13 crc kubenswrapper[4781]: E1202 09:42:13.966251 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="4d040259-d968-45a1-832a-45586a9fe0d1" Dec 02 09:42:13 crc kubenswrapper[4781]: E1202 09:42:13.985634 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 02 09:42:13 crc kubenswrapper[4781]: E1202 09:42:13.985799 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* 
--conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wl9cd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-bk67p_openstack(88e0f2e3-8550-4e01-bbfa-d48d1b5add07): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 09:42:13 crc kubenswrapper[4781]: E1202 09:42:13.987229 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-bk67p" podUID="88e0f2e3-8550-4e01-bbfa-d48d1b5add07" Dec 02 09:42:13 crc kubenswrapper[4781]: E1202 09:42:13.997833 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 02 09:42:13 crc kubenswrapper[4781]: E1202 09:42:13.997973 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vr4jt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-sz5wn_openstack(f80e657e-6f81-4250-90e9-c63f77fb03b2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 09:42:14 crc kubenswrapper[4781]: E1202 09:42:13.999979 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-sz5wn" podUID="f80e657e-6f81-4250-90e9-c63f77fb03b2" Dec 02 09:42:14 crc kubenswrapper[4781]: E1202 09:42:14.262183 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-bk67p" podUID="88e0f2e3-8550-4e01-bbfa-d48d1b5add07" Dec 02 09:42:14 crc kubenswrapper[4781]: E1202 09:42:14.262333 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-galera-0" podUID="39428d69-6b86-41ea-8c4f-5532a5283a91" Dec 02 09:42:14 crc kubenswrapper[4781]: E1202 09:42:14.262424 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="4d040259-d968-45a1-832a-45586a9fe0d1" Dec 02 09:42:14 crc kubenswrapper[4781]: E1202 09:42:14.262727 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-sz5wn" podUID="f80e657e-6f81-4250-90e9-c63f77fb03b2" Dec 02 09:42:17 crc kubenswrapper[4781]: E1202 09:42:17.608197 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Dec 02 09:42:17 crc kubenswrapper[4781]: E1202 09:42:17.608995 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qkktl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(9ef9f823-7456-4494-85c7-d29fcf35e7b5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 09:42:17 crc kubenswrapper[4781]: E1202 09:42:17.614122 4781 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="9ef9f823-7456-4494-85c7-d29fcf35e7b5" Dec 02 09:42:17 crc kubenswrapper[4781]: E1202 09:42:17.640185 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-memcached:current-podified" Dec 02 09:42:17 crc kubenswrapper[4781]: E1202 09:42:17.640349 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,Command:[/usr/bin/dumb-init -- /usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:n66fh557h658h568h5f6h7dhb9h65ch4h5ch694h544h585h569h657h7fhdbh64bh5cch64bh7fh76hbch56ch88h655h67h587h54bh688h55h5b5q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kk268,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_openstack(2fd68438-412a-4745-9e59-f4c9374f2444): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 09:42:17 crc kubenswrapper[4781]: E1202 09:42:17.646008 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/memcached-0" podUID="2fd68438-412a-4745-9e59-f4c9374f2444" Dec 02 09:42:17 crc kubenswrapper[4781]: I1202 09:42:17.754132 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-rll7p" Dec 02 09:42:17 crc kubenswrapper[4781]: I1202 09:42:17.837812 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6dac8464-d82e-420d-93f4-d9c73ca4b209-dns-svc\") pod \"6dac8464-d82e-420d-93f4-d9c73ca4b209\" (UID: \"6dac8464-d82e-420d-93f4-d9c73ca4b209\") " Dec 02 09:42:17 crc kubenswrapper[4781]: I1202 09:42:17.837989 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dac8464-d82e-420d-93f4-d9c73ca4b209-config\") pod \"6dac8464-d82e-420d-93f4-d9c73ca4b209\" (UID: \"6dac8464-d82e-420d-93f4-d9c73ca4b209\") " Dec 02 09:42:17 crc kubenswrapper[4781]: I1202 09:42:17.838260 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qts5m\" (UniqueName: \"kubernetes.io/projected/6dac8464-d82e-420d-93f4-d9c73ca4b209-kube-api-access-qts5m\") pod \"6dac8464-d82e-420d-93f4-d9c73ca4b209\" (UID: \"6dac8464-d82e-420d-93f4-d9c73ca4b209\") " Dec 02 09:42:17 crc kubenswrapper[4781]: I1202 09:42:17.838553 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dac8464-d82e-420d-93f4-d9c73ca4b209-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6dac8464-d82e-420d-93f4-d9c73ca4b209" (UID: "6dac8464-d82e-420d-93f4-d9c73ca4b209"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:42:17 crc kubenswrapper[4781]: I1202 09:42:17.838560 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dac8464-d82e-420d-93f4-d9c73ca4b209-config" (OuterVolumeSpecName: "config") pod "6dac8464-d82e-420d-93f4-d9c73ca4b209" (UID: "6dac8464-d82e-420d-93f4-d9c73ca4b209"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:42:17 crc kubenswrapper[4781]: I1202 09:42:17.838892 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6dac8464-d82e-420d-93f4-d9c73ca4b209-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 09:42:17 crc kubenswrapper[4781]: I1202 09:42:17.838956 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dac8464-d82e-420d-93f4-d9c73ca4b209-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:42:17 crc kubenswrapper[4781]: I1202 09:42:17.858938 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dac8464-d82e-420d-93f4-d9c73ca4b209-kube-api-access-qts5m" (OuterVolumeSpecName: "kube-api-access-qts5m") pod "6dac8464-d82e-420d-93f4-d9c73ca4b209" (UID: "6dac8464-d82e-420d-93f4-d9c73ca4b209"). InnerVolumeSpecName "kube-api-access-qts5m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:42:17 crc kubenswrapper[4781]: I1202 09:42:17.940449 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qts5m\" (UniqueName: \"kubernetes.io/projected/6dac8464-d82e-420d-93f4-d9c73ca4b209-kube-api-access-qts5m\") on node \"crc\" DevicePath \"\"" Dec 02 09:42:17 crc kubenswrapper[4781]: E1202 09:42:17.987212 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 02 09:42:17 crc kubenswrapper[4781]: E1202 09:42:17.988201 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-grhhp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-hcnql_openstack(f188f85e-b1c0-4a87-a057-16324e42a64a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 09:42:17 crc kubenswrapper[4781]: E1202 09:42:17.989424 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-hcnql" podUID="f188f85e-b1c0-4a87-a057-16324e42a64a" Dec 02 09:42:18 crc kubenswrapper[4781]: I1202 09:42:18.291635 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-rll7p" event={"ID":"6dac8464-d82e-420d-93f4-d9c73ca4b209","Type":"ContainerDied","Data":"35b8cbeb611b4e57f45d9fe5796803b015a02b054200c3fe595802c8f5b8fca2"} Dec 02 09:42:18 crc kubenswrapper[4781]: I1202 09:42:18.291727 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-rll7p" Dec 02 09:42:18 crc kubenswrapper[4781]: E1202 09:42:18.293474 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-memcached:current-podified\\\"\"" pod="openstack/memcached-0" podUID="2fd68438-412a-4745-9e59-f4c9374f2444" Dec 02 09:42:18 crc kubenswrapper[4781]: E1202 09:42:18.293654 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="9ef9f823-7456-4494-85c7-d29fcf35e7b5" Dec 02 09:42:18 crc kubenswrapper[4781]: I1202 09:42:18.396801 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-rll7p"] Dec 02 09:42:18 crc kubenswrapper[4781]: I1202 09:42:18.404584 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-rll7p"] Dec 02 09:42:19 crc kubenswrapper[4781]: E1202 09:42:19.412462 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified" Dec 02 09:42:19 crc kubenswrapper[4781]: E1202 09:42:19.412949 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:ovsdb-server-init,Image:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,Command:[/usr/local/bin/container-scripts/init-ovsdb-server.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n569h658h7fhc5h64ch6ch57ch65dh7h56bh4hb8h4hc8h5dch68h5bch87h59h558hch58bh65ch56bh586hc9h5cbh595h68ch55h649h8fq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-ovs,ReadOnly:false,MountPath:/etc/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-run,ReadOnly:false,MountPath:/var/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-log,ReadOnly:false,MountPath:/var/log/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-lib,ReadOnly:false,MountPath:/var/lib/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nq8x9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN SYS_ADMIN 
SYS_NICE],Drop:[],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-controller-ovs-rh5wv_openstack(77d1da39-d5b0-4ec8-8196-f4f1025291f8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 09:42:19 crc kubenswrapper[4781]: E1202 09:42:19.414072 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdb-server-init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovn-controller-ovs-rh5wv" podUID="77d1da39-d5b0-4ec8-8196-f4f1025291f8" Dec 02 09:42:19 crc kubenswrapper[4781]: I1202 09:42:19.524775 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-hcnql" Dec 02 09:42:19 crc kubenswrapper[4781]: I1202 09:42:19.531279 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dac8464-d82e-420d-93f4-d9c73ca4b209" path="/var/lib/kubelet/pods/6dac8464-d82e-420d-93f4-d9c73ca4b209/volumes" Dec 02 09:42:19 crc kubenswrapper[4781]: I1202 09:42:19.569105 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f188f85e-b1c0-4a87-a057-16324e42a64a-config\") pod \"f188f85e-b1c0-4a87-a057-16324e42a64a\" (UID: \"f188f85e-b1c0-4a87-a057-16324e42a64a\") " Dec 02 09:42:19 crc kubenswrapper[4781]: I1202 09:42:19.569168 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grhhp\" (UniqueName: \"kubernetes.io/projected/f188f85e-b1c0-4a87-a057-16324e42a64a-kube-api-access-grhhp\") pod \"f188f85e-b1c0-4a87-a057-16324e42a64a\" (UID: \"f188f85e-b1c0-4a87-a057-16324e42a64a\") " Dec 02 09:42:19 crc kubenswrapper[4781]: I1202 09:42:19.571238 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f188f85e-b1c0-4a87-a057-16324e42a64a-config" (OuterVolumeSpecName: "config") pod "f188f85e-b1c0-4a87-a057-16324e42a64a" (UID: "f188f85e-b1c0-4a87-a057-16324e42a64a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:42:19 crc kubenswrapper[4781]: I1202 09:42:19.575746 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f188f85e-b1c0-4a87-a057-16324e42a64a-kube-api-access-grhhp" (OuterVolumeSpecName: "kube-api-access-grhhp") pod "f188f85e-b1c0-4a87-a057-16324e42a64a" (UID: "f188f85e-b1c0-4a87-a057-16324e42a64a"). InnerVolumeSpecName "kube-api-access-grhhp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:42:19 crc kubenswrapper[4781]: I1202 09:42:19.681912 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f188f85e-b1c0-4a87-a057-16324e42a64a-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:42:19 crc kubenswrapper[4781]: I1202 09:42:19.681967 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grhhp\" (UniqueName: \"kubernetes.io/projected/f188f85e-b1c0-4a87-a057-16324e42a64a-kube-api-access-grhhp\") on node \"crc\" DevicePath \"\"" Dec 02 09:42:19 crc kubenswrapper[4781]: I1202 09:42:19.890551 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-7c2h7"] Dec 02 09:42:20 crc kubenswrapper[4781]: I1202 09:42:20.044469 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 02 09:42:20 crc kubenswrapper[4781]: W1202 09:42:20.133671 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6dea0cb6_7707_46ba_bd47_89ce579fdad9.slice/crio-64f43f05d130183b77041de2fa7613d6c7bc3bee60ba0bfcafa5e3410127db7d WatchSource:0}: Error finding container 64f43f05d130183b77041de2fa7613d6c7bc3bee60ba0bfcafa5e3410127db7d: Status 404 returned error can't find the container with id 64f43f05d130183b77041de2fa7613d6c7bc3bee60ba0bfcafa5e3410127db7d Dec 02 09:42:20 crc kubenswrapper[4781]: E1202 09:42:20.233312 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified" Dec 02 09:42:20 crc kubenswrapper[4781]: E1202 09:42:20.233520 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ovn-controller,Image:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,Command:[ovn-controller --pidfile unix:/run/openvswitch/db.sock --certificate=/etc/pki/tls/certs/ovndb.crt --private-key=/etc/pki/tls/private/ovndb.key 
--ca-cert=/etc/pki/tls/certs/ovndbca.crt],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n569h658h7fhc5h64ch6ch57ch65dh7h56bh4hb8h4hc8h5dch68h5bch87h59h558hch58bh65ch56bh586hc9h5cbh595h68ch55h649h8fq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:var-run,ReadOnly:false,MountPath:/var/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-run-ovn,ReadOnly:false,MountPath:/var/run/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-log-ovn,ReadOnly:false,MountPath:/var/log/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l8w8w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/ovn_controller_liveness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/ovn_controller_readiness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/share/ovn/scripts/ovn-ctl stop_controller],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN SYS_ADMIN SYS_NICE],Drop:[],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-controller-mmhrv_openstack(7cf924dd-8243-4263-85a2-68ac01fd5346): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 09:42:20 crc kubenswrapper[4781]: E1202 09:42:20.234916 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"ovn-controller\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovn-controller-mmhrv" podUID="7cf924dd-8243-4263-85a2-68ac01fd5346" Dec 02 09:42:20 crc kubenswrapper[4781]: I1202 09:42:20.270214 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 02 09:42:20 crc kubenswrapper[4781]: I1202 09:42:20.308957 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-hcnql" event={"ID":"f188f85e-b1c0-4a87-a057-16324e42a64a","Type":"ContainerDied","Data":"ffe7b06b5dc77b25c98d82a64525e929bca33ba0eb5c162b56beac22b52c6121"} Dec 02 09:42:20 crc kubenswrapper[4781]: I1202 09:42:20.308977 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-hcnql" Dec 02 09:42:20 crc kubenswrapper[4781]: I1202 09:42:20.309729 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6dea0cb6-7707-46ba-bd47-89ce579fdad9","Type":"ContainerStarted","Data":"64f43f05d130183b77041de2fa7613d6c7bc3bee60ba0bfcafa5e3410127db7d"} Dec 02 09:42:20 crc kubenswrapper[4781]: E1202 09:42:20.311171 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-controller\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified\\\"\"" pod="openstack/ovn-controller-mmhrv" podUID="7cf924dd-8243-4263-85a2-68ac01fd5346" Dec 02 09:42:20 crc kubenswrapper[4781]: E1202 09:42:20.311517 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdb-server-init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified\\\"\"" pod="openstack/ovn-controller-ovs-rh5wv" podUID="77d1da39-d5b0-4ec8-8196-f4f1025291f8" Dec 02 09:42:20 crc kubenswrapper[4781]: I1202 09:42:20.406969 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-hcnql"] Dec 02 09:42:20 crc kubenswrapper[4781]: I1202 09:42:20.413086 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-hcnql"] Dec 02 09:42:20 crc kubenswrapper[4781]: W1202 09:42:20.511038 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7bc2040b_e9d5_4a1a_9d46_6b50dbc71061.slice/crio-301add836f98bf899c471a3774c720fc21c75a68944f77f29a64be1625b74e25 WatchSource:0}: Error finding container 301add836f98bf899c471a3774c720fc21c75a68944f77f29a64be1625b74e25: Status 404 returned error can't find the container with id 301add836f98bf899c471a3774c720fc21c75a68944f77f29a64be1625b74e25 Dec 02 09:42:20 crc kubenswrapper[4781]: E1202 09:42:20.611389 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying layer: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Dec 02 09:42:20 crc kubenswrapper[4781]: E1202 09:42:20.611437 4781 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying layer: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Dec 02 09:42:20 crc kubenswrapper[4781]: E1202 09:42:20.611555 4781 kuberuntime_manager.go:1274] "Unhandled Error" 
err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods --namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ld4pf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(284c6c26-76ca-4800-b40f-51528de0c015): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying layer: context canceled" logger="UnhandledError" Dec 02 09:42:20 crc kubenswrapper[4781]: E1202 09:42:20.613422 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying layer: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="284c6c26-76ca-4800-b40f-51528de0c015" Dec 02 09:42:21 crc kubenswrapper[4781]: I1202 09:42:21.317931 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-7c2h7" event={"ID":"7bc2040b-e9d5-4a1a-9d46-6b50dbc71061","Type":"ContainerStarted","Data":"301add836f98bf899c471a3774c720fc21c75a68944f77f29a64be1625b74e25"} Dec 02 09:42:21 crc kubenswrapper[4781]: I1202 09:42:21.319895 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"05cec06e-b8e4-487d-b4f8-0691aaf1f997","Type":"ContainerStarted","Data":"996d0e8b32b76a3595e1642d123139ec299feca89a5fade95d4de5401504acdd"} Dec 02 09:42:21 crc kubenswrapper[4781]: I1202 09:42:21.323029 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8c135676-b0d9-469f-82b2-59483c9712f1","Type":"ContainerStarted","Data":"89d91f61ccb7062f9068f0b67915523fb0372c0a79c27d5bd7eb3a29ac9bc839"} Dec 02 09:42:21 crc 
kubenswrapper[4781]: E1202 09:42:21.325053 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="284c6c26-76ca-4800-b40f-51528de0c015" Dec 02 09:42:21 crc kubenswrapper[4781]: I1202 09:42:21.512043 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f188f85e-b1c0-4a87-a057-16324e42a64a" path="/var/lib/kubelet/pods/f188f85e-b1c0-4a87-a057-16324e42a64a/volumes" Dec 02 09:42:22 crc kubenswrapper[4781]: I1202 09:42:22.335036 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6dea0cb6-7707-46ba-bd47-89ce579fdad9","Type":"ContainerStarted","Data":"6754652ec8e50c39ab92d0c8f009a35c87ebb3b6dc6b736f706d272c506b91fb"} Dec 02 09:42:27 crc kubenswrapper[4781]: I1202 09:42:27.373398 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"39428d69-6b86-41ea-8c4f-5532a5283a91","Type":"ContainerStarted","Data":"3a376840e54cb55a753a13b505816103cc6284d061bfe5b7c6d303f139012a26"} Dec 02 09:42:27 crc kubenswrapper[4781]: I1202 09:42:27.377072 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8c135676-b0d9-469f-82b2-59483c9712f1","Type":"ContainerStarted","Data":"600ee95a9ed3bcaa122101f77e68678c697c58c40f6b431b1c0df1d85532442b"} Dec 02 09:42:27 crc kubenswrapper[4781]: I1202 09:42:27.377104 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8c135676-b0d9-469f-82b2-59483c9712f1","Type":"ContainerStarted","Data":"edf5842087258d86f1a48b07fa790a045498aada5ac9e5e97f009824c6907759"} Dec 02 09:42:27 crc kubenswrapper[4781]: I1202 09:42:27.379735 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-7c2h7" event={"ID":"7bc2040b-e9d5-4a1a-9d46-6b50dbc71061","Type":"ContainerStarted","Data":"9f7d1a8c97d4ba4d1c117130217c73079853e38ae497b33e8b736a7ff2ffa445"} Dec 02 09:42:27 crc kubenswrapper[4781]: I1202 09:42:27.433525 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=49.606157077 podStartE2EDuration="55.433507678s" podCreationTimestamp="2025-12-02 09:41:32 +0000 UTC" firstStartedPulling="2025-12-02 09:42:20.527557082 +0000 UTC m=+1303.351430971" lastFinishedPulling="2025-12-02 09:42:26.354907693 +0000 UTC m=+1309.178781572" observedRunningTime="2025-12-02 09:42:27.433451757 +0000 UTC m=+1310.257325636" watchObservedRunningTime="2025-12-02 09:42:27.433507678 +0000 UTC m=+1310.257381547" Dec 02 09:42:27 crc kubenswrapper[4781]: I1202 09:42:27.454666 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-7c2h7" podStartSLOduration=50.546720103 podStartE2EDuration="56.454652069s" podCreationTimestamp="2025-12-02 09:41:31 +0000 UTC" firstStartedPulling="2025-12-02 09:42:20.523472232 +0000 UTC m=+1303.347346121" lastFinishedPulling="2025-12-02 09:42:26.431404208 +0000 UTC m=+1309.255278087" observedRunningTime="2025-12-02 09:42:27.453681992 +0000 UTC m=+1310.277555881" watchObservedRunningTime="2025-12-02 09:42:27.454652069 +0000 UTC m=+1310.278525948" Dec 02 09:42:27 crc kubenswrapper[4781]: I1202 09:42:27.732743 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-bk67p"] Dec 
02 09:42:27 crc kubenswrapper[4781]: I1202 09:42:27.765557 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-p8r4f"] Dec 02 09:42:27 crc kubenswrapper[4781]: I1202 09:42:27.767297 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-p8r4f" Dec 02 09:42:27 crc kubenswrapper[4781]: I1202 09:42:27.769119 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 02 09:42:27 crc kubenswrapper[4781]: I1202 09:42:27.777319 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 02 09:42:27 crc kubenswrapper[4781]: I1202 09:42:27.788732 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-p8r4f"] Dec 02 09:42:27 crc kubenswrapper[4781]: I1202 09:42:27.838009 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wj7bz\" (UniqueName: \"kubernetes.io/projected/7c9029e4-b29e-4131-b26a-453a0084edf3-kube-api-access-wj7bz\") pod \"dnsmasq-dns-7fd796d7df-p8r4f\" (UID: \"7c9029e4-b29e-4131-b26a-453a0084edf3\") " pod="openstack/dnsmasq-dns-7fd796d7df-p8r4f" Dec 02 09:42:27 crc kubenswrapper[4781]: I1202 09:42:27.838047 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7c9029e4-b29e-4131-b26a-453a0084edf3-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-p8r4f\" (UID: \"7c9029e4-b29e-4131-b26a-453a0084edf3\") " pod="openstack/dnsmasq-dns-7fd796d7df-p8r4f" Dec 02 09:42:27 crc kubenswrapper[4781]: I1202 09:42:27.838078 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c9029e4-b29e-4131-b26a-453a0084edf3-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-p8r4f\" (UID: \"7c9029e4-b29e-4131-b26a-453a0084edf3\") " pod="openstack/dnsmasq-dns-7fd796d7df-p8r4f" Dec 02 09:42:27 crc kubenswrapper[4781]: I1202 09:42:27.838116 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c9029e4-b29e-4131-b26a-453a0084edf3-config\") pod \"dnsmasq-dns-7fd796d7df-p8r4f\" (UID: \"7c9029e4-b29e-4131-b26a-453a0084edf3\") " pod="openstack/dnsmasq-dns-7fd796d7df-p8r4f" Dec 02 09:42:27 crc kubenswrapper[4781]: I1202 09:42:27.970342 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wj7bz\" (UniqueName: \"kubernetes.io/projected/7c9029e4-b29e-4131-b26a-453a0084edf3-kube-api-access-wj7bz\") pod \"dnsmasq-dns-7fd796d7df-p8r4f\" (UID: \"7c9029e4-b29e-4131-b26a-453a0084edf3\") " pod="openstack/dnsmasq-dns-7fd796d7df-p8r4f" Dec 02 09:42:27 crc kubenswrapper[4781]: I1202 09:42:27.970646 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7c9029e4-b29e-4131-b26a-453a0084edf3-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-p8r4f\" (UID: \"7c9029e4-b29e-4131-b26a-453a0084edf3\") " pod="openstack/dnsmasq-dns-7fd796d7df-p8r4f" Dec 02 09:42:27 crc kubenswrapper[4781]: I1202 09:42:27.970680 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c9029e4-b29e-4131-b26a-453a0084edf3-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-p8r4f\" (UID: 
\"7c9029e4-b29e-4131-b26a-453a0084edf3\") " pod="openstack/dnsmasq-dns-7fd796d7df-p8r4f" Dec 02 09:42:27 crc kubenswrapper[4781]: I1202 09:42:27.970721 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c9029e4-b29e-4131-b26a-453a0084edf3-config\") pod \"dnsmasq-dns-7fd796d7df-p8r4f\" (UID: \"7c9029e4-b29e-4131-b26a-453a0084edf3\") " pod="openstack/dnsmasq-dns-7fd796d7df-p8r4f" Dec 02 09:42:27 crc kubenswrapper[4781]: I1202 09:42:27.972038 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7c9029e4-b29e-4131-b26a-453a0084edf3-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-p8r4f\" (UID: \"7c9029e4-b29e-4131-b26a-453a0084edf3\") " pod="openstack/dnsmasq-dns-7fd796d7df-p8r4f" Dec 02 09:42:27 crc kubenswrapper[4781]: I1202 09:42:27.972207 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c9029e4-b29e-4131-b26a-453a0084edf3-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-p8r4f\" (UID: \"7c9029e4-b29e-4131-b26a-453a0084edf3\") " pod="openstack/dnsmasq-dns-7fd796d7df-p8r4f" Dec 02 09:42:27 crc kubenswrapper[4781]: I1202 09:42:27.972744 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c9029e4-b29e-4131-b26a-453a0084edf3-config\") pod \"dnsmasq-dns-7fd796d7df-p8r4f\" (UID: \"7c9029e4-b29e-4131-b26a-453a0084edf3\") " pod="openstack/dnsmasq-dns-7fd796d7df-p8r4f" Dec 02 09:42:27 crc kubenswrapper[4781]: I1202 09:42:27.995077 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wj7bz\" (UniqueName: \"kubernetes.io/projected/7c9029e4-b29e-4131-b26a-453a0084edf3-kube-api-access-wj7bz\") pod \"dnsmasq-dns-7fd796d7df-p8r4f\" (UID: \"7c9029e4-b29e-4131-b26a-453a0084edf3\") " pod="openstack/dnsmasq-dns-7fd796d7df-p8r4f" Dec 02 09:42:28 crc kubenswrapper[4781]: I1202 09:42:28.084112 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-sz5wn"] Dec 02 09:42:28 crc kubenswrapper[4781]: I1202 09:42:28.108460 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-zc9s5"] Dec 02 09:42:28 crc kubenswrapper[4781]: I1202 09:42:28.110047 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-zc9s5" Dec 02 09:42:28 crc kubenswrapper[4781]: I1202 09:42:28.114635 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 02 09:42:28 crc kubenswrapper[4781]: I1202 09:42:28.127005 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-zc9s5"] Dec 02 09:42:28 crc kubenswrapper[4781]: I1202 09:42:28.208559 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-p8r4f" Dec 02 09:42:28 crc kubenswrapper[4781]: I1202 09:42:28.274171 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/244b353d-3185-4eda-9d02-c22719d2a514-config\") pod \"dnsmasq-dns-86db49b7ff-zc9s5\" (UID: \"244b353d-3185-4eda-9d02-c22719d2a514\") " pod="openstack/dnsmasq-dns-86db49b7ff-zc9s5" Dec 02 09:42:28 crc kubenswrapper[4781]: I1202 09:42:28.274466 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/244b353d-3185-4eda-9d02-c22719d2a514-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-zc9s5\" (UID: \"244b353d-3185-4eda-9d02-c22719d2a514\") " pod="openstack/dnsmasq-dns-86db49b7ff-zc9s5" Dec 02 09:42:28 crc kubenswrapper[4781]: I1202 09:42:28.274607 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/244b353d-3185-4eda-9d02-c22719d2a514-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-zc9s5\" (UID: \"244b353d-3185-4eda-9d02-c22719d2a514\") " pod="openstack/dnsmasq-dns-86db49b7ff-zc9s5" Dec 02 09:42:28 crc kubenswrapper[4781]: I1202 09:42:28.274746 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75jdh\" (UniqueName: \"kubernetes.io/projected/244b353d-3185-4eda-9d02-c22719d2a514-kube-api-access-75jdh\") pod \"dnsmasq-dns-86db49b7ff-zc9s5\" (UID: \"244b353d-3185-4eda-9d02-c22719d2a514\") " pod="openstack/dnsmasq-dns-86db49b7ff-zc9s5" Dec 02 09:42:28 crc kubenswrapper[4781]: I1202 09:42:28.275133 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/244b353d-3185-4eda-9d02-c22719d2a514-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-zc9s5\" (UID: \"244b353d-3185-4eda-9d02-c22719d2a514\") " pod="openstack/dnsmasq-dns-86db49b7ff-zc9s5" Dec 02 09:42:28 crc kubenswrapper[4781]: I1202 09:42:28.377133 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/244b353d-3185-4eda-9d02-c22719d2a514-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-zc9s5\" (UID: \"244b353d-3185-4eda-9d02-c22719d2a514\") " pod="openstack/dnsmasq-dns-86db49b7ff-zc9s5" Dec 02 09:42:28 crc kubenswrapper[4781]: I1202 09:42:28.377215 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/244b353d-3185-4eda-9d02-c22719d2a514-config\") pod \"dnsmasq-dns-86db49b7ff-zc9s5\" (UID: \"244b353d-3185-4eda-9d02-c22719d2a514\") " pod="openstack/dnsmasq-dns-86db49b7ff-zc9s5" Dec 02 09:42:28 crc kubenswrapper[4781]: I1202 09:42:28.377273 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/244b353d-3185-4eda-9d02-c22719d2a514-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-zc9s5\" (UID: \"244b353d-3185-4eda-9d02-c22719d2a514\") " pod="openstack/dnsmasq-dns-86db49b7ff-zc9s5" Dec 02 09:42:28 crc kubenswrapper[4781]: I1202 09:42:28.377354 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/244b353d-3185-4eda-9d02-c22719d2a514-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-zc9s5\" (UID: 
\"244b353d-3185-4eda-9d02-c22719d2a514\") " pod="openstack/dnsmasq-dns-86db49b7ff-zc9s5" Dec 02 09:42:28 crc kubenswrapper[4781]: I1202 09:42:28.377404 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75jdh\" (UniqueName: \"kubernetes.io/projected/244b353d-3185-4eda-9d02-c22719d2a514-kube-api-access-75jdh\") pod \"dnsmasq-dns-86db49b7ff-zc9s5\" (UID: \"244b353d-3185-4eda-9d02-c22719d2a514\") " pod="openstack/dnsmasq-dns-86db49b7ff-zc9s5" Dec 02 09:42:28 crc kubenswrapper[4781]: I1202 09:42:28.378397 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/244b353d-3185-4eda-9d02-c22719d2a514-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-zc9s5\" (UID: \"244b353d-3185-4eda-9d02-c22719d2a514\") " pod="openstack/dnsmasq-dns-86db49b7ff-zc9s5" Dec 02 09:42:28 crc kubenswrapper[4781]: I1202 09:42:28.378437 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/244b353d-3185-4eda-9d02-c22719d2a514-config\") pod \"dnsmasq-dns-86db49b7ff-zc9s5\" (UID: \"244b353d-3185-4eda-9d02-c22719d2a514\") " pod="openstack/dnsmasq-dns-86db49b7ff-zc9s5" Dec 02 09:42:28 crc kubenswrapper[4781]: I1202 09:42:28.378569 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/244b353d-3185-4eda-9d02-c22719d2a514-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-zc9s5\" (UID: \"244b353d-3185-4eda-9d02-c22719d2a514\") " pod="openstack/dnsmasq-dns-86db49b7ff-zc9s5" Dec 02 09:42:28 crc kubenswrapper[4781]: I1202 09:42:28.379129 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/244b353d-3185-4eda-9d02-c22719d2a514-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-zc9s5\" (UID: \"244b353d-3185-4eda-9d02-c22719d2a514\") " pod="openstack/dnsmasq-dns-86db49b7ff-zc9s5" Dec 02 09:42:28 crc kubenswrapper[4781]: I1202 09:42:28.391900 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6dea0cb6-7707-46ba-bd47-89ce579fdad9","Type":"ContainerStarted","Data":"2fe1028610b62081d61844d05a42b1776a80a9760ab981ca8fb23d7a374f49ee"} Dec 02 09:42:28 crc kubenswrapper[4781]: I1202 09:42:28.400255 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75jdh\" (UniqueName: \"kubernetes.io/projected/244b353d-3185-4eda-9d02-c22719d2a514-kube-api-access-75jdh\") pod \"dnsmasq-dns-86db49b7ff-zc9s5\" (UID: \"244b353d-3185-4eda-9d02-c22719d2a514\") " pod="openstack/dnsmasq-dns-86db49b7ff-zc9s5" Dec 02 09:42:28 crc kubenswrapper[4781]: I1202 09:42:28.426658 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=48.978961502 podStartE2EDuration="56.426640956s" podCreationTimestamp="2025-12-02 09:41:32 +0000 UTC" firstStartedPulling="2025-12-02 09:42:20.135893766 +0000 UTC m=+1302.959767645" lastFinishedPulling="2025-12-02 09:42:27.58357322 +0000 UTC m=+1310.407447099" observedRunningTime="2025-12-02 09:42:28.423471641 +0000 UTC m=+1311.247345520" watchObservedRunningTime="2025-12-02 09:42:28.426640956 +0000 UTC m=+1311.250514835" Dec 02 09:42:28 crc kubenswrapper[4781]: I1202 09:42:28.451511 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-zc9s5" Dec 02 09:42:28 crc kubenswrapper[4781]: I1202 09:42:28.476051 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 02 09:42:28 crc kubenswrapper[4781]: I1202 09:42:28.777437 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 02 09:42:28 crc kubenswrapper[4781]: I1202 09:42:28.988084 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-bk67p" Dec 02 09:42:29 crc kubenswrapper[4781]: I1202 09:42:29.090848 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wl9cd\" (UniqueName: \"kubernetes.io/projected/88e0f2e3-8550-4e01-bbfa-d48d1b5add07-kube-api-access-wl9cd\") pod \"88e0f2e3-8550-4e01-bbfa-d48d1b5add07\" (UID: \"88e0f2e3-8550-4e01-bbfa-d48d1b5add07\") " Dec 02 09:42:29 crc kubenswrapper[4781]: I1202 09:42:29.091032 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88e0f2e3-8550-4e01-bbfa-d48d1b5add07-config\") pod \"88e0f2e3-8550-4e01-bbfa-d48d1b5add07\" (UID: \"88e0f2e3-8550-4e01-bbfa-d48d1b5add07\") " Dec 02 09:42:29 crc kubenswrapper[4781]: I1202 09:42:29.091219 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/88e0f2e3-8550-4e01-bbfa-d48d1b5add07-dns-svc\") pod \"88e0f2e3-8550-4e01-bbfa-d48d1b5add07\" (UID: \"88e0f2e3-8550-4e01-bbfa-d48d1b5add07\") " Dec 02 09:42:29 crc kubenswrapper[4781]: I1202 09:42:29.091563 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88e0f2e3-8550-4e01-bbfa-d48d1b5add07-config" (OuterVolumeSpecName: "config") pod "88e0f2e3-8550-4e01-bbfa-d48d1b5add07" (UID: "88e0f2e3-8550-4e01-bbfa-d48d1b5add07"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:42:29 crc kubenswrapper[4781]: I1202 09:42:29.091698 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88e0f2e3-8550-4e01-bbfa-d48d1b5add07-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "88e0f2e3-8550-4e01-bbfa-d48d1b5add07" (UID: "88e0f2e3-8550-4e01-bbfa-d48d1b5add07"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:42:29 crc kubenswrapper[4781]: I1202 09:42:29.092429 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/88e0f2e3-8550-4e01-bbfa-d48d1b5add07-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 09:42:29 crc kubenswrapper[4781]: I1202 09:42:29.092454 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88e0f2e3-8550-4e01-bbfa-d48d1b5add07-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:42:29 crc kubenswrapper[4781]: I1202 09:42:29.099952 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88e0f2e3-8550-4e01-bbfa-d48d1b5add07-kube-api-access-wl9cd" (OuterVolumeSpecName: "kube-api-access-wl9cd") pod "88e0f2e3-8550-4e01-bbfa-d48d1b5add07" (UID: "88e0f2e3-8550-4e01-bbfa-d48d1b5add07"). InnerVolumeSpecName "kube-api-access-wl9cd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:42:29 crc kubenswrapper[4781]: I1202 09:42:29.193393 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wl9cd\" (UniqueName: \"kubernetes.io/projected/88e0f2e3-8550-4e01-bbfa-d48d1b5add07-kube-api-access-wl9cd\") on node \"crc\" DevicePath \"\"" Dec 02 09:42:29 crc kubenswrapper[4781]: I1202 09:42:29.214884 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-zc9s5"] Dec 02 09:42:29 crc kubenswrapper[4781]: I1202 09:42:29.328847 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-p8r4f"] Dec 02 09:42:29 crc kubenswrapper[4781]: W1202 09:42:29.381582 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c9029e4_b29e_4131_b26a_453a0084edf3.slice/crio-77a21a08d1acd3ad6785b2072829562b57461e47d8ea754eea39a9560d8ac9f9 WatchSource:0}: Error finding container 77a21a08d1acd3ad6785b2072829562b57461e47d8ea754eea39a9560d8ac9f9: Status 404 returned error can't find the container with id 77a21a08d1acd3ad6785b2072829562b57461e47d8ea754eea39a9560d8ac9f9 Dec 02 09:42:29 crc kubenswrapper[4781]: I1202 09:42:29.405581 4781 generic.go:334] "Generic (PLEG): container finished" podID="05cec06e-b8e4-487d-b4f8-0691aaf1f997" containerID="996d0e8b32b76a3595e1642d123139ec299feca89a5fade95d4de5401504acdd" exitCode=0 Dec 02 09:42:29 crc kubenswrapper[4781]: I1202 09:42:29.405670 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"05cec06e-b8e4-487d-b4f8-0691aaf1f997","Type":"ContainerDied","Data":"996d0e8b32b76a3595e1642d123139ec299feca89a5fade95d4de5401504acdd"} Dec 02 09:42:29 crc kubenswrapper[4781]: I1202 09:42:29.406917 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-p8r4f" event={"ID":"7c9029e4-b29e-4131-b26a-453a0084edf3","Type":"ContainerStarted","Data":"77a21a08d1acd3ad6785b2072829562b57461e47d8ea754eea39a9560d8ac9f9"} Dec 02 09:42:29 crc kubenswrapper[4781]: I1202 09:42:29.409532 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-zc9s5" event={"ID":"244b353d-3185-4eda-9d02-c22719d2a514","Type":"ContainerStarted","Data":"ef1053fd063e5b924a6b44aab0d9c795ee94443263aee809025fa424b37cf93b"} Dec 02 09:42:29 crc kubenswrapper[4781]: I1202 09:42:29.411381 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-bk67p" event={"ID":"88e0f2e3-8550-4e01-bbfa-d48d1b5add07","Type":"ContainerDied","Data":"73f2dbda8c45276c80ba198242c11f5058e6e7ed7bb45a486d0b1630bb056240"} Dec 02 09:42:29 crc kubenswrapper[4781]: I1202 09:42:29.411695 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-bk67p" Dec 02 09:42:29 crc kubenswrapper[4781]: I1202 09:42:29.612745 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-bk67p"] Dec 02 09:42:29 crc kubenswrapper[4781]: I1202 09:42:29.620098 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-bk67p"] Dec 02 09:42:30 crc kubenswrapper[4781]: I1202 09:42:30.476089 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 02 09:42:30 crc kubenswrapper[4781]: I1202 09:42:30.529993 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 02 09:42:30 crc kubenswrapper[4781]: I1202 09:42:30.825440 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 02 09:42:31 crc kubenswrapper[4781]: I1202 09:42:31.429190 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"05cec06e-b8e4-487d-b4f8-0691aaf1f997","Type":"ContainerStarted","Data":"34ca64ed72acff9e34fa686721400f0a8615b0ae49df2dbb8f455f0d30c5c22d"} Dec 02 09:42:31 crc kubenswrapper[4781]: I1202 09:42:31.432022 4781 generic.go:334] "Generic (PLEG): container finished" podID="f80e657e-6f81-4250-90e9-c63f77fb03b2" containerID="625d74ef93928395d04d33165f7e2bd9ee97bf21328d40c2d61e3aa65740063b" exitCode=0 Dec 02 09:42:31 crc kubenswrapper[4781]: I1202 09:42:31.432010 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-sz5wn" event={"ID":"f80e657e-6f81-4250-90e9-c63f77fb03b2","Type":"ContainerDied","Data":"625d74ef93928395d04d33165f7e2bd9ee97bf21328d40c2d61e3aa65740063b"} Dec 02 09:42:31 crc kubenswrapper[4781]: I1202 09:42:31.438164 4781 generic.go:334] "Generic (PLEG): container finished" podID="7c9029e4-b29e-4131-b26a-453a0084edf3" containerID="1b5dc96730714e6ebf5994af23e743dc9ee4b4e7827d2840b9de41f8167e111f" exitCode=0 Dec 02 09:42:31 crc kubenswrapper[4781]: I1202 09:42:31.440076 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-p8r4f" event={"ID":"7c9029e4-b29e-4131-b26a-453a0084edf3","Type":"ContainerDied","Data":"1b5dc96730714e6ebf5994af23e743dc9ee4b4e7827d2840b9de41f8167e111f"} Dec 02 09:42:31 crc kubenswrapper[4781]: I1202 09:42:31.459408 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=13.161753488 podStartE2EDuration="1m9.459387322s" podCreationTimestamp="2025-12-02 09:41:22 +0000 UTC" firstStartedPulling="2025-12-02 09:41:24.305061277 +0000 UTC m=+1247.128935156" lastFinishedPulling="2025-12-02 09:42:20.602695111 +0000 UTC m=+1303.426568990" observedRunningTime="2025-12-02 09:42:31.454750667 +0000 UTC m=+1314.278624546" watchObservedRunningTime="2025-12-02 09:42:31.459387322 +0000 UTC m=+1314.283261201" Dec 02 09:42:31 crc kubenswrapper[4781]: I1202 09:42:31.502232 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 02 09:42:31 crc kubenswrapper[4781]: I1202 09:42:31.502539 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 02 09:42:31 crc kubenswrapper[4781]: I1202 09:42:31.517338 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88e0f2e3-8550-4e01-bbfa-d48d1b5add07" 
path="/var/lib/kubelet/pods/88e0f2e3-8550-4e01-bbfa-d48d1b5add07/volumes" Dec 02 09:42:31 crc kubenswrapper[4781]: I1202 09:42:31.826423 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-sz5wn" Dec 02 09:42:31 crc kubenswrapper[4781]: I1202 09:42:31.876665 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 02 09:42:31 crc kubenswrapper[4781]: E1202 09:42:31.877096 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f80e657e-6f81-4250-90e9-c63f77fb03b2" containerName="init" Dec 02 09:42:31 crc kubenswrapper[4781]: I1202 09:42:31.877120 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f80e657e-6f81-4250-90e9-c63f77fb03b2" containerName="init" Dec 02 09:42:31 crc kubenswrapper[4781]: I1202 09:42:31.877355 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f80e657e-6f81-4250-90e9-c63f77fb03b2" containerName="init" Dec 02 09:42:31 crc kubenswrapper[4781]: I1202 09:42:31.879242 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 02 09:42:31 crc kubenswrapper[4781]: I1202 09:42:31.881993 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 02 09:42:31 crc kubenswrapper[4781]: I1202 09:42:31.882164 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 02 09:42:31 crc kubenswrapper[4781]: I1202 09:42:31.883785 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-8n56g" Dec 02 09:42:31 crc kubenswrapper[4781]: I1202 09:42:31.887574 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 02 09:42:31 crc kubenswrapper[4781]: I1202 09:42:31.890829 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 02 09:42:31 crc kubenswrapper[4781]: I1202 09:42:31.939614 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f80e657e-6f81-4250-90e9-c63f77fb03b2-config\") pod \"f80e657e-6f81-4250-90e9-c63f77fb03b2\" (UID: \"f80e657e-6f81-4250-90e9-c63f77fb03b2\") " Dec 02 09:42:31 crc kubenswrapper[4781]: I1202 09:42:31.939720 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f80e657e-6f81-4250-90e9-c63f77fb03b2-dns-svc\") pod \"f80e657e-6f81-4250-90e9-c63f77fb03b2\" (UID: \"f80e657e-6f81-4250-90e9-c63f77fb03b2\") " Dec 02 09:42:31 crc kubenswrapper[4781]: I1202 09:42:31.939858 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vr4jt\" (UniqueName: \"kubernetes.io/projected/f80e657e-6f81-4250-90e9-c63f77fb03b2-kube-api-access-vr4jt\") pod \"f80e657e-6f81-4250-90e9-c63f77fb03b2\" (UID: \"f80e657e-6f81-4250-90e9-c63f77fb03b2\") " Dec 02 09:42:31 crc kubenswrapper[4781]: I1202 09:42:31.960427 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f80e657e-6f81-4250-90e9-c63f77fb03b2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f80e657e-6f81-4250-90e9-c63f77fb03b2" (UID: "f80e657e-6f81-4250-90e9-c63f77fb03b2"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:42:32 crc kubenswrapper[4781]: I1202 09:42:32.041882 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d0908af-30ab-4017-8911-b10c3742336e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"1d0908af-30ab-4017-8911-b10c3742336e\") " pod="openstack/ovn-northd-0" Dec 02 09:42:32 crc kubenswrapper[4781]: I1202 09:42:32.042531 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d0908af-30ab-4017-8911-b10c3742336e-scripts\") pod \"ovn-northd-0\" (UID: \"1d0908af-30ab-4017-8911-b10c3742336e\") " pod="openstack/ovn-northd-0" Dec 02 09:42:32 crc kubenswrapper[4781]: I1202 09:42:32.042650 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d0908af-30ab-4017-8911-b10c3742336e-config\") pod \"ovn-northd-0\" (UID: \"1d0908af-30ab-4017-8911-b10c3742336e\") " pod="openstack/ovn-northd-0" Dec 02 09:42:32 crc kubenswrapper[4781]: I1202 09:42:32.042721 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d0908af-30ab-4017-8911-b10c3742336e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"1d0908af-30ab-4017-8911-b10c3742336e\") " pod="openstack/ovn-northd-0" Dec 02 09:42:32 crc kubenswrapper[4781]: I1202 09:42:32.042758 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d0908af-30ab-4017-8911-b10c3742336e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"1d0908af-30ab-4017-8911-b10c3742336e\") " pod="openstack/ovn-northd-0" Dec 02 09:42:32 crc kubenswrapper[4781]: I1202 09:42:32.042830 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1d0908af-30ab-4017-8911-b10c3742336e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"1d0908af-30ab-4017-8911-b10c3742336e\") " pod="openstack/ovn-northd-0" Dec 02 09:42:32 crc kubenswrapper[4781]: I1202 09:42:32.042877 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txf2h\" (UniqueName: \"kubernetes.io/projected/1d0908af-30ab-4017-8911-b10c3742336e-kube-api-access-txf2h\") pod \"ovn-northd-0\" (UID: \"1d0908af-30ab-4017-8911-b10c3742336e\") " pod="openstack/ovn-northd-0" Dec 02 09:42:32 crc kubenswrapper[4781]: I1202 09:42:32.042953 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f80e657e-6f81-4250-90e9-c63f77fb03b2-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 09:42:32 crc kubenswrapper[4781]: I1202 09:42:32.073731 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f80e657e-6f81-4250-90e9-c63f77fb03b2-kube-api-access-vr4jt" (OuterVolumeSpecName: "kube-api-access-vr4jt") pod "f80e657e-6f81-4250-90e9-c63f77fb03b2" (UID: "f80e657e-6f81-4250-90e9-c63f77fb03b2"). InnerVolumeSpecName "kube-api-access-vr4jt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:42:32 crc kubenswrapper[4781]: I1202 09:42:32.144154 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d0908af-30ab-4017-8911-b10c3742336e-scripts\") pod \"ovn-northd-0\" (UID: \"1d0908af-30ab-4017-8911-b10c3742336e\") " pod="openstack/ovn-northd-0" Dec 02 09:42:32 crc kubenswrapper[4781]: I1202 09:42:32.144225 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d0908af-30ab-4017-8911-b10c3742336e-config\") pod \"ovn-northd-0\" (UID: \"1d0908af-30ab-4017-8911-b10c3742336e\") " pod="openstack/ovn-northd-0" Dec 02 09:42:32 crc kubenswrapper[4781]: I1202 09:42:32.144256 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d0908af-30ab-4017-8911-b10c3742336e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"1d0908af-30ab-4017-8911-b10c3742336e\") " pod="openstack/ovn-northd-0" Dec 02 09:42:32 crc kubenswrapper[4781]: I1202 09:42:32.144278 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d0908af-30ab-4017-8911-b10c3742336e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"1d0908af-30ab-4017-8911-b10c3742336e\") " pod="openstack/ovn-northd-0" Dec 02 09:42:32 crc kubenswrapper[4781]: I1202 09:42:32.144306 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1d0908af-30ab-4017-8911-b10c3742336e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"1d0908af-30ab-4017-8911-b10c3742336e\") " pod="openstack/ovn-northd-0" Dec 02 09:42:32 crc kubenswrapper[4781]: I1202 09:42:32.144335 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txf2h\" (UniqueName: \"kubernetes.io/projected/1d0908af-30ab-4017-8911-b10c3742336e-kube-api-access-txf2h\") pod \"ovn-northd-0\" (UID: \"1d0908af-30ab-4017-8911-b10c3742336e\") " pod="openstack/ovn-northd-0" Dec 02 09:42:32 crc kubenswrapper[4781]: I1202 09:42:32.144386 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d0908af-30ab-4017-8911-b10c3742336e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"1d0908af-30ab-4017-8911-b10c3742336e\") " pod="openstack/ovn-northd-0" Dec 02 09:42:32 crc kubenswrapper[4781]: I1202 09:42:32.144449 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vr4jt\" (UniqueName: \"kubernetes.io/projected/f80e657e-6f81-4250-90e9-c63f77fb03b2-kube-api-access-vr4jt\") on node \"crc\" DevicePath \"\"" Dec 02 09:42:32 crc kubenswrapper[4781]: I1202 09:42:32.145158 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1d0908af-30ab-4017-8911-b10c3742336e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"1d0908af-30ab-4017-8911-b10c3742336e\") " pod="openstack/ovn-northd-0" Dec 02 09:42:32 crc kubenswrapper[4781]: I1202 09:42:32.145453 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d0908af-30ab-4017-8911-b10c3742336e-scripts\") pod \"ovn-northd-0\" (UID: \"1d0908af-30ab-4017-8911-b10c3742336e\") " pod="openstack/ovn-northd-0" Dec 02 09:42:32 crc 
kubenswrapper[4781]: I1202 09:42:32.145574 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d0908af-30ab-4017-8911-b10c3742336e-config\") pod \"ovn-northd-0\" (UID: \"1d0908af-30ab-4017-8911-b10c3742336e\") " pod="openstack/ovn-northd-0" Dec 02 09:42:32 crc kubenswrapper[4781]: I1202 09:42:32.148825 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d0908af-30ab-4017-8911-b10c3742336e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"1d0908af-30ab-4017-8911-b10c3742336e\") " pod="openstack/ovn-northd-0" Dec 02 09:42:32 crc kubenswrapper[4781]: I1202 09:42:32.148900 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d0908af-30ab-4017-8911-b10c3742336e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"1d0908af-30ab-4017-8911-b10c3742336e\") " pod="openstack/ovn-northd-0" Dec 02 09:42:32 crc kubenswrapper[4781]: I1202 09:42:32.149102 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d0908af-30ab-4017-8911-b10c3742336e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"1d0908af-30ab-4017-8911-b10c3742336e\") " pod="openstack/ovn-northd-0" Dec 02 09:42:32 crc kubenswrapper[4781]: I1202 09:42:32.162156 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txf2h\" (UniqueName: \"kubernetes.io/projected/1d0908af-30ab-4017-8911-b10c3742336e-kube-api-access-txf2h\") pod \"ovn-northd-0\" (UID: \"1d0908af-30ab-4017-8911-b10c3742336e\") " pod="openstack/ovn-northd-0" Dec 02 09:42:32 crc kubenswrapper[4781]: I1202 09:42:32.217894 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 02 09:42:32 crc kubenswrapper[4781]: I1202 09:42:32.445747 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-sz5wn" event={"ID":"f80e657e-6f81-4250-90e9-c63f77fb03b2","Type":"ContainerDied","Data":"d48faf89f7c0c42ae72a1a4ac7132b2d5ec6ddf6c6cf7a94dfc12e33a50c4b8e"} Dec 02 09:42:32 crc kubenswrapper[4781]: I1202 09:42:32.445801 4781 scope.go:117] "RemoveContainer" containerID="625d74ef93928395d04d33165f7e2bd9ee97bf21328d40c2d61e3aa65740063b" Dec 02 09:42:32 crc kubenswrapper[4781]: I1202 09:42:32.445953 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-sz5wn" Dec 02 09:42:32 crc kubenswrapper[4781]: I1202 09:42:32.449763 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-p8r4f" event={"ID":"7c9029e4-b29e-4131-b26a-453a0084edf3","Type":"ContainerStarted","Data":"9ddfbeec642e10f6a5c039ec79e6bb5d89e7e353c0272d77c41b42c445fbc194"} Dec 02 09:42:32 crc kubenswrapper[4781]: I1202 09:42:32.450314 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fd796d7df-p8r4f" Dec 02 09:42:32 crc kubenswrapper[4781]: I1202 09:42:32.454858 4781 generic.go:334] "Generic (PLEG): container finished" podID="244b353d-3185-4eda-9d02-c22719d2a514" containerID="f056c08ca7e6fd30a7309773bf67db627a209f33f234b9349472728f3406a296" exitCode=0 Dec 02 09:42:32 crc kubenswrapper[4781]: I1202 09:42:32.454905 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-zc9s5" event={"ID":"244b353d-3185-4eda-9d02-c22719d2a514","Type":"ContainerDied","Data":"f056c08ca7e6fd30a7309773bf67db627a209f33f234b9349472728f3406a296"} Dec 02 09:42:32 crc kubenswrapper[4781]: I1202 09:42:32.480992 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fd796d7df-p8r4f" podStartSLOduration=3.913798558 podStartE2EDuration="5.480971508s" podCreationTimestamp="2025-12-02 09:42:27 +0000 UTC" firstStartedPulling="2025-12-02 09:42:29.383021052 +0000 UTC m=+1312.206894931" lastFinishedPulling="2025-12-02 09:42:30.950193992 +0000 UTC m=+1313.774067881" observedRunningTime="2025-12-02 09:42:32.473829456 +0000 UTC m=+1315.297703355" watchObservedRunningTime="2025-12-02 09:42:32.480971508 +0000 UTC m=+1315.304845387" Dec 02 09:42:32 crc kubenswrapper[4781]: I1202 09:42:32.736081 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 02 09:42:32 crc kubenswrapper[4781]: W1202 09:42:32.743447 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d0908af_30ab_4017_8911_b10c3742336e.slice/crio-3644cbef51211c73038541312ed3cd49f4d18787705d46dfe8891407426d83aa WatchSource:0}: Error finding container 3644cbef51211c73038541312ed3cd49f4d18787705d46dfe8891407426d83aa: Status 404 returned error can't find the container with id 3644cbef51211c73038541312ed3cd49f4d18787705d46dfe8891407426d83aa Dec 02 09:42:32 crc kubenswrapper[4781]: I1202 09:42:32.777616 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f80e657e-6f81-4250-90e9-c63f77fb03b2-config" (OuterVolumeSpecName: "config") pod "f80e657e-6f81-4250-90e9-c63f77fb03b2" (UID: "f80e657e-6f81-4250-90e9-c63f77fb03b2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:42:32 crc kubenswrapper[4781]: I1202 09:42:32.860684 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f80e657e-6f81-4250-90e9-c63f77fb03b2-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:42:33 crc kubenswrapper[4781]: I1202 09:42:33.115387 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-sz5wn"] Dec 02 09:42:33 crc kubenswrapper[4781]: I1202 09:42:33.123579 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-sz5wn"] Dec 02 09:42:33 crc kubenswrapper[4781]: I1202 09:42:33.398277 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 02 09:42:33 crc kubenswrapper[4781]: I1202 09:42:33.398393 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 02 09:42:33 crc kubenswrapper[4781]: I1202 09:42:33.463391 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9ef9f823-7456-4494-85c7-d29fcf35e7b5","Type":"ContainerStarted","Data":"cd33af3c334721c99ee5f6eebb56a79b04c6781e3be5fde4010c62024eb59205"} Dec 02 09:42:33 crc kubenswrapper[4781]: I1202 09:42:33.464546 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4d040259-d968-45a1-832a-45586a9fe0d1","Type":"ContainerStarted","Data":"b659230fc0458ff9201681c08c775d787b2461d03f85fed77a3b501a5314c78e"} Dec 02 09:42:33 crc kubenswrapper[4781]: I1202 09:42:33.465938 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-zc9s5" event={"ID":"244b353d-3185-4eda-9d02-c22719d2a514","Type":"ContainerStarted","Data":"c08dfa216a14f1b5f92faf7d6e1d93ef4a950f425199cd650ab41ec13cb5de98"} Dec 02 09:42:33 crc kubenswrapper[4781]: I1202 09:42:33.466035 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-zc9s5" Dec 02 09:42:33 crc kubenswrapper[4781]: I1202 09:42:33.466897 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-rh5wv" event={"ID":"77d1da39-d5b0-4ec8-8196-f4f1025291f8","Type":"ContainerStarted","Data":"9f5dba0ae9c33937d03ca28de66c6bd077cf2b35d827f2992673ec2f9f10fbf5"} Dec 02 09:42:33 crc kubenswrapper[4781]: I1202 09:42:33.467706 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"1d0908af-30ab-4017-8911-b10c3742336e","Type":"ContainerStarted","Data":"3644cbef51211c73038541312ed3cd49f4d18787705d46dfe8891407426d83aa"} Dec 02 09:42:33 crc kubenswrapper[4781]: I1202 09:42:33.513167 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f80e657e-6f81-4250-90e9-c63f77fb03b2" path="/var/lib/kubelet/pods/f80e657e-6f81-4250-90e9-c63f77fb03b2/volumes" Dec 02 09:42:33 crc kubenswrapper[4781]: I1202 09:42:33.549703 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-zc9s5" podStartSLOduration=3.496788391 podStartE2EDuration="5.549684177s" podCreationTimestamp="2025-12-02 09:42:28 +0000 UTC" firstStartedPulling="2025-12-02 09:42:29.225093968 +0000 UTC m=+1312.048967847" lastFinishedPulling="2025-12-02 09:42:31.277989754 +0000 UTC m=+1314.101863633" observedRunningTime="2025-12-02 09:42:33.541494866 +0000 UTC m=+1316.365368745" watchObservedRunningTime="2025-12-02 09:42:33.549684177 
+0000 UTC m=+1316.373558046" Dec 02 09:42:34 crc kubenswrapper[4781]: E1202 09:42:34.456907 4781 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.194:45004->38.102.83.194:45901: write tcp 38.102.83.194:45004->38.102.83.194:45901: write: broken pipe Dec 02 09:42:34 crc kubenswrapper[4781]: I1202 09:42:34.481049 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"2fd68438-412a-4745-9e59-f4c9374f2444","Type":"ContainerStarted","Data":"b741814fe2ff9e67416ba18449aa5eedd4c3fe3ca59d1dbd82c71a9744bdec62"} Dec 02 09:42:34 crc kubenswrapper[4781]: I1202 09:42:34.482171 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 02 09:42:34 crc kubenswrapper[4781]: I1202 09:42:34.486639 4781 generic.go:334] "Generic (PLEG): container finished" podID="39428d69-6b86-41ea-8c4f-5532a5283a91" containerID="3a376840e54cb55a753a13b505816103cc6284d061bfe5b7c6d303f139012a26" exitCode=0 Dec 02 09:42:34 crc kubenswrapper[4781]: I1202 09:42:34.486691 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"39428d69-6b86-41ea-8c4f-5532a5283a91","Type":"ContainerDied","Data":"3a376840e54cb55a753a13b505816103cc6284d061bfe5b7c6d303f139012a26"} Dec 02 09:42:34 crc kubenswrapper[4781]: I1202 09:42:34.489660 4781 generic.go:334] "Generic (PLEG): container finished" podID="77d1da39-d5b0-4ec8-8196-f4f1025291f8" containerID="9f5dba0ae9c33937d03ca28de66c6bd077cf2b35d827f2992673ec2f9f10fbf5" exitCode=0 Dec 02 09:42:34 crc kubenswrapper[4781]: I1202 09:42:34.489751 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-rh5wv" event={"ID":"77d1da39-d5b0-4ec8-8196-f4f1025291f8","Type":"ContainerDied","Data":"9f5dba0ae9c33937d03ca28de66c6bd077cf2b35d827f2992673ec2f9f10fbf5"} Dec 02 09:42:34 crc kubenswrapper[4781]: I1202 09:42:34.492636 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"1d0908af-30ab-4017-8911-b10c3742336e","Type":"ContainerStarted","Data":"5c447625adcd4fc85537c40d3b01ad3ee62f583bae317404e82320024a7335f5"} Dec 02 09:42:34 crc kubenswrapper[4781]: I1202 09:42:34.504009 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.745977688 podStartE2EDuration="1m11.503989727s" podCreationTimestamp="2025-12-02 09:41:23 +0000 UTC" firstStartedPulling="2025-12-02 09:41:24.675521432 +0000 UTC m=+1247.499395311" lastFinishedPulling="2025-12-02 09:42:33.433533471 +0000 UTC m=+1316.257407350" observedRunningTime="2025-12-02 09:42:34.49741819 +0000 UTC m=+1317.321292069" watchObservedRunningTime="2025-12-02 09:42:34.503989727 +0000 UTC m=+1317.327863606" Dec 02 09:42:35 crc kubenswrapper[4781]: I1202 09:42:35.772722 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 02 09:42:35 crc kubenswrapper[4781]: I1202 09:42:35.871286 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 02 09:42:36 crc kubenswrapper[4781]: I1202 09:42:36.511147 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-rh5wv" event={"ID":"77d1da39-d5b0-4ec8-8196-f4f1025291f8","Type":"ContainerStarted","Data":"91ee44906e22ce2e70e7a68717366106ae737e656d5834bae89e80909bc088f0"} Dec 02 09:42:36 crc kubenswrapper[4781]: I1202 09:42:36.511452 4781 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ovn-controller-ovs-rh5wv" event={"ID":"77d1da39-d5b0-4ec8-8196-f4f1025291f8","Type":"ContainerStarted","Data":"356ac68186b94a9b09830970e4ec3cc6b29a8ea9aab7beb568a276277b2fbac1"} Dec 02 09:42:36 crc kubenswrapper[4781]: I1202 09:42:36.512619 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-rh5wv" Dec 02 09:42:36 crc kubenswrapper[4781]: I1202 09:42:36.512650 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-rh5wv" Dec 02 09:42:36 crc kubenswrapper[4781]: I1202 09:42:36.520774 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"1d0908af-30ab-4017-8911-b10c3742336e","Type":"ContainerStarted","Data":"ef5b9781058fe39331238702473e0aa68356981943a703dff2733f878b301c98"} Dec 02 09:42:36 crc kubenswrapper[4781]: I1202 09:42:36.520930 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 02 09:42:36 crc kubenswrapper[4781]: I1202 09:42:36.522354 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"284c6c26-76ca-4800-b40f-51528de0c015","Type":"ContainerStarted","Data":"a167a50db94270b190aadd67c5a28422c38e064f8ad600d65560cd199860b782"} Dec 02 09:42:36 crc kubenswrapper[4781]: I1202 09:42:36.522544 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 02 09:42:36 crc kubenswrapper[4781]: I1202 09:42:36.524182 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"39428d69-6b86-41ea-8c4f-5532a5283a91","Type":"ContainerStarted","Data":"b968f7d25f2d8a06499cb766cbd8b53d1d6f53e34ac6f15e56734c59f5ccb3f5"} Dec 02 09:42:36 crc kubenswrapper[4781]: I1202 09:42:36.526200 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mmhrv" event={"ID":"7cf924dd-8243-4263-85a2-68ac01fd5346","Type":"ContainerStarted","Data":"c7d8c189c9edfb1569818a9750a6acf6e42c9dda24ce489f21eb3d66b00b4705"} Dec 02 09:42:36 crc kubenswrapper[4781]: I1202 09:42:36.526526 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-mmhrv" Dec 02 09:42:36 crc kubenswrapper[4781]: I1202 09:42:36.545269 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-rh5wv" podStartSLOduration=16.590493824 podStartE2EDuration="1m7.545244759s" podCreationTimestamp="2025-12-02 09:41:29 +0000 UTC" firstStartedPulling="2025-12-02 09:41:42.075522056 +0000 UTC m=+1264.899395945" lastFinishedPulling="2025-12-02 09:42:33.030273001 +0000 UTC m=+1315.854146880" observedRunningTime="2025-12-02 09:42:36.542843884 +0000 UTC m=+1319.366717773" watchObservedRunningTime="2025-12-02 09:42:36.545244759 +0000 UTC m=+1319.369118638" Dec 02 09:42:36 crc kubenswrapper[4781]: I1202 09:42:36.569096 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=-9223371960.285711 podStartE2EDuration="1m16.569063652s" podCreationTimestamp="2025-12-02 09:41:20 +0000 UTC" firstStartedPulling="2025-12-02 09:41:22.579286215 +0000 UTC m=+1245.403160094" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:42:36.56124187 +0000 UTC m=+1319.385115749" watchObservedRunningTime="2025-12-02 09:42:36.569063652 +0000 UTC m=+1319.392937531" Dec 02 09:42:36 crc kubenswrapper[4781]: I1202 09:42:36.584072 4781 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=4.112079447 podStartE2EDuration="5.584052066s" podCreationTimestamp="2025-12-02 09:42:31 +0000 UTC" firstStartedPulling="2025-12-02 09:42:32.747084014 +0000 UTC m=+1315.570957893" lastFinishedPulling="2025-12-02 09:42:34.219056633 +0000 UTC m=+1317.042930512" observedRunningTime="2025-12-02 09:42:36.57861338 +0000 UTC m=+1319.402487279" watchObservedRunningTime="2025-12-02 09:42:36.584052066 +0000 UTC m=+1319.407925945" Dec 02 09:42:36 crc kubenswrapper[4781]: I1202 09:42:36.605261 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-mmhrv" podStartSLOduration=15.298025843 podStartE2EDuration="1m7.605234489s" podCreationTimestamp="2025-12-02 09:41:29 +0000 UTC" firstStartedPulling="2025-12-02 09:41:42.075417214 +0000 UTC m=+1264.899291093" lastFinishedPulling="2025-12-02 09:42:34.38262586 +0000 UTC m=+1317.206499739" observedRunningTime="2025-12-02 09:42:36.597379217 +0000 UTC m=+1319.421253106" watchObservedRunningTime="2025-12-02 09:42:36.605234489 +0000 UTC m=+1319.429108368" Dec 02 09:42:36 crc kubenswrapper[4781]: I1202 09:42:36.613670 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.236031429 podStartE2EDuration="1m11.613651676s" podCreationTimestamp="2025-12-02 09:41:25 +0000 UTC" firstStartedPulling="2025-12-02 09:41:27.166524448 +0000 UTC m=+1249.990398327" lastFinishedPulling="2025-12-02 09:42:35.544144695 +0000 UTC m=+1318.368018574" observedRunningTime="2025-12-02 09:42:36.612154466 +0000 UTC m=+1319.436028355" watchObservedRunningTime="2025-12-02 09:42:36.613651676 +0000 UTC m=+1319.437525555" Dec 02 09:42:38 crc kubenswrapper[4781]: I1202 09:42:38.210083 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fd796d7df-p8r4f" Dec 02 09:42:38 crc kubenswrapper[4781]: I1202 09:42:38.452859 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-zc9s5" Dec 02 09:42:38 crc kubenswrapper[4781]: I1202 09:42:38.497662 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-p8r4f"] Dec 02 09:42:38 crc kubenswrapper[4781]: I1202 09:42:38.540737 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fd796d7df-p8r4f" podUID="7c9029e4-b29e-4131-b26a-453a0084edf3" containerName="dnsmasq-dns" containerID="cri-o://9ddfbeec642e10f6a5c039ec79e6bb5d89e7e353c0272d77c41b42c445fbc194" gracePeriod=10 Dec 02 09:42:38 crc kubenswrapper[4781]: I1202 09:42:38.855934 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 02 09:42:39 crc kubenswrapper[4781]: I1202 09:42:39.159804 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-p8r4f" Dec 02 09:42:39 crc kubenswrapper[4781]: I1202 09:42:39.267834 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7c9029e4-b29e-4131-b26a-453a0084edf3-ovsdbserver-nb\") pod \"7c9029e4-b29e-4131-b26a-453a0084edf3\" (UID: \"7c9029e4-b29e-4131-b26a-453a0084edf3\") " Dec 02 09:42:39 crc kubenswrapper[4781]: I1202 09:42:39.268713 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c9029e4-b29e-4131-b26a-453a0084edf3-config\") pod \"7c9029e4-b29e-4131-b26a-453a0084edf3\" (UID: \"7c9029e4-b29e-4131-b26a-453a0084edf3\") " Dec 02 09:42:39 crc kubenswrapper[4781]: I1202 09:42:39.268839 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wj7bz\" (UniqueName: \"kubernetes.io/projected/7c9029e4-b29e-4131-b26a-453a0084edf3-kube-api-access-wj7bz\") pod \"7c9029e4-b29e-4131-b26a-453a0084edf3\" (UID: \"7c9029e4-b29e-4131-b26a-453a0084edf3\") " Dec 02 09:42:39 crc kubenswrapper[4781]: I1202 09:42:39.268949 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c9029e4-b29e-4131-b26a-453a0084edf3-dns-svc\") pod \"7c9029e4-b29e-4131-b26a-453a0084edf3\" (UID: \"7c9029e4-b29e-4131-b26a-453a0084edf3\") " Dec 02 09:42:39 crc kubenswrapper[4781]: I1202 09:42:39.273518 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c9029e4-b29e-4131-b26a-453a0084edf3-kube-api-access-wj7bz" (OuterVolumeSpecName: "kube-api-access-wj7bz") pod "7c9029e4-b29e-4131-b26a-453a0084edf3" (UID: "7c9029e4-b29e-4131-b26a-453a0084edf3"). InnerVolumeSpecName "kube-api-access-wj7bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:42:39 crc kubenswrapper[4781]: I1202 09:42:39.308223 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c9029e4-b29e-4131-b26a-453a0084edf3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7c9029e4-b29e-4131-b26a-453a0084edf3" (UID: "7c9029e4-b29e-4131-b26a-453a0084edf3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:42:39 crc kubenswrapper[4781]: I1202 09:42:39.308283 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c9029e4-b29e-4131-b26a-453a0084edf3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7c9029e4-b29e-4131-b26a-453a0084edf3" (UID: "7c9029e4-b29e-4131-b26a-453a0084edf3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:42:39 crc kubenswrapper[4781]: I1202 09:42:39.310587 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c9029e4-b29e-4131-b26a-453a0084edf3-config" (OuterVolumeSpecName: "config") pod "7c9029e4-b29e-4131-b26a-453a0084edf3" (UID: "7c9029e4-b29e-4131-b26a-453a0084edf3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:42:39 crc kubenswrapper[4781]: I1202 09:42:39.370889 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wj7bz\" (UniqueName: \"kubernetes.io/projected/7c9029e4-b29e-4131-b26a-453a0084edf3-kube-api-access-wj7bz\") on node \"crc\" DevicePath \"\"" Dec 02 09:42:39 crc kubenswrapper[4781]: I1202 09:42:39.370956 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c9029e4-b29e-4131-b26a-453a0084edf3-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 09:42:39 crc kubenswrapper[4781]: I1202 09:42:39.370973 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7c9029e4-b29e-4131-b26a-453a0084edf3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 09:42:39 crc kubenswrapper[4781]: I1202 09:42:39.370984 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c9029e4-b29e-4131-b26a-453a0084edf3-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:42:39 crc kubenswrapper[4781]: I1202 09:42:39.556110 4781 generic.go:334] "Generic (PLEG): container finished" podID="7c9029e4-b29e-4131-b26a-453a0084edf3" containerID="9ddfbeec642e10f6a5c039ec79e6bb5d89e7e353c0272d77c41b42c445fbc194" exitCode=0 Dec 02 09:42:39 crc kubenswrapper[4781]: I1202 09:42:39.556181 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-p8r4f" event={"ID":"7c9029e4-b29e-4131-b26a-453a0084edf3","Type":"ContainerDied","Data":"9ddfbeec642e10f6a5c039ec79e6bb5d89e7e353c0272d77c41b42c445fbc194"} Dec 02 09:42:39 crc kubenswrapper[4781]: I1202 09:42:39.556216 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-p8r4f" event={"ID":"7c9029e4-b29e-4131-b26a-453a0084edf3","Type":"ContainerDied","Data":"77a21a08d1acd3ad6785b2072829562b57461e47d8ea754eea39a9560d8ac9f9"} Dec 02 09:42:39 crc kubenswrapper[4781]: I1202 09:42:39.556269 4781 scope.go:117] "RemoveContainer" containerID="9ddfbeec642e10f6a5c039ec79e6bb5d89e7e353c0272d77c41b42c445fbc194" Dec 02 09:42:39 crc kubenswrapper[4781]: I1202 09:42:39.556538 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-p8r4f" Dec 02 09:42:39 crc kubenswrapper[4781]: I1202 09:42:39.586426 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-p8r4f"] Dec 02 09:42:39 crc kubenswrapper[4781]: I1202 09:42:39.594555 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-p8r4f"] Dec 02 09:42:39 crc kubenswrapper[4781]: I1202 09:42:39.604143 4781 scope.go:117] "RemoveContainer" containerID="1b5dc96730714e6ebf5994af23e743dc9ee4b4e7827d2840b9de41f8167e111f" Dec 02 09:42:39 crc kubenswrapper[4781]: I1202 09:42:39.656574 4781 scope.go:117] "RemoveContainer" containerID="9ddfbeec642e10f6a5c039ec79e6bb5d89e7e353c0272d77c41b42c445fbc194" Dec 02 09:42:39 crc kubenswrapper[4781]: E1202 09:42:39.657165 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ddfbeec642e10f6a5c039ec79e6bb5d89e7e353c0272d77c41b42c445fbc194\": container with ID starting with 9ddfbeec642e10f6a5c039ec79e6bb5d89e7e353c0272d77c41b42c445fbc194 not found: ID does not exist" containerID="9ddfbeec642e10f6a5c039ec79e6bb5d89e7e353c0272d77c41b42c445fbc194" Dec 02 09:42:39 crc kubenswrapper[4781]: I1202 09:42:39.657326 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ddfbeec642e10f6a5c039ec79e6bb5d89e7e353c0272d77c41b42c445fbc194"} err="failed to get container status \"9ddfbeec642e10f6a5c039ec79e6bb5d89e7e353c0272d77c41b42c445fbc194\": rpc error: code = NotFound desc = could not find container \"9ddfbeec642e10f6a5c039ec79e6bb5d89e7e353c0272d77c41b42c445fbc194\": container with ID starting with 9ddfbeec642e10f6a5c039ec79e6bb5d89e7e353c0272d77c41b42c445fbc194 not found: ID does not exist" Dec 02 09:42:39 crc kubenswrapper[4781]: I1202 09:42:39.657441 4781 scope.go:117] "RemoveContainer" containerID="1b5dc96730714e6ebf5994af23e743dc9ee4b4e7827d2840b9de41f8167e111f" Dec 02 09:42:39 crc kubenswrapper[4781]: E1202 09:42:39.657771 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b5dc96730714e6ebf5994af23e743dc9ee4b4e7827d2840b9de41f8167e111f\": container with ID starting with 1b5dc96730714e6ebf5994af23e743dc9ee4b4e7827d2840b9de41f8167e111f not found: ID does not exist" containerID="1b5dc96730714e6ebf5994af23e743dc9ee4b4e7827d2840b9de41f8167e111f" Dec 02 09:42:39 crc kubenswrapper[4781]: I1202 09:42:39.657864 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b5dc96730714e6ebf5994af23e743dc9ee4b4e7827d2840b9de41f8167e111f"} err="failed to get container status \"1b5dc96730714e6ebf5994af23e743dc9ee4b4e7827d2840b9de41f8167e111f\": rpc error: code = NotFound desc = could not find container \"1b5dc96730714e6ebf5994af23e743dc9ee4b4e7827d2840b9de41f8167e111f\": container with ID starting with 1b5dc96730714e6ebf5994af23e743dc9ee4b4e7827d2840b9de41f8167e111f not found: ID does not exist" Dec 02 09:42:41 crc kubenswrapper[4781]: I1202 09:42:41.514285 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c9029e4-b29e-4131-b26a-453a0084edf3" path="/var/lib/kubelet/pods/7c9029e4-b29e-4131-b26a-453a0084edf3/volumes" Dec 02 09:42:42 crc kubenswrapper[4781]: I1202 09:42:42.219332 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 02 09:42:42 crc kubenswrapper[4781]: I1202 09:42:42.219816 4781 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 02 09:42:42 crc kubenswrapper[4781]: I1202 09:42:42.290415 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 02 09:42:42 crc kubenswrapper[4781]: I1202 09:42:42.649246 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 02 09:42:43 crc kubenswrapper[4781]: I1202 09:42:43.427546 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-b513-account-create-update-pn5wp"] Dec 02 09:42:43 crc kubenswrapper[4781]: E1202 09:42:43.427865 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c9029e4-b29e-4131-b26a-453a0084edf3" containerName="dnsmasq-dns" Dec 02 09:42:43 crc kubenswrapper[4781]: I1202 09:42:43.427880 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c9029e4-b29e-4131-b26a-453a0084edf3" containerName="dnsmasq-dns" Dec 02 09:42:43 crc kubenswrapper[4781]: E1202 09:42:43.427895 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c9029e4-b29e-4131-b26a-453a0084edf3" containerName="init" Dec 02 09:42:43 crc kubenswrapper[4781]: I1202 09:42:43.427901 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c9029e4-b29e-4131-b26a-453a0084edf3" containerName="init" Dec 02 09:42:43 crc kubenswrapper[4781]: I1202 09:42:43.428068 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c9029e4-b29e-4131-b26a-453a0084edf3" containerName="dnsmasq-dns" Dec 02 09:42:43 crc kubenswrapper[4781]: I1202 09:42:43.428841 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b513-account-create-update-pn5wp" Dec 02 09:42:43 crc kubenswrapper[4781]: I1202 09:42:43.430799 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 02 09:42:43 crc kubenswrapper[4781]: I1202 09:42:43.444814 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b513-account-create-update-pn5wp"] Dec 02 09:42:43 crc kubenswrapper[4781]: I1202 09:42:43.476918 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-8528x"] Dec 02 09:42:43 crc kubenswrapper[4781]: I1202 09:42:43.477866 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-8528x" Dec 02 09:42:43 crc kubenswrapper[4781]: I1202 09:42:43.495417 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-8528x"] Dec 02 09:42:43 crc kubenswrapper[4781]: I1202 09:42:43.537457 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbqzp\" (UniqueName: \"kubernetes.io/projected/43441d30-70a7-429c-a674-51db8c4e8111-kube-api-access-xbqzp\") pod \"keystone-db-create-8528x\" (UID: \"43441d30-70a7-429c-a674-51db8c4e8111\") " pod="openstack/keystone-db-create-8528x" Dec 02 09:42:43 crc kubenswrapper[4781]: I1202 09:42:43.537505 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tc9x\" (UniqueName: \"kubernetes.io/projected/4686994b-2137-4fbc-b470-32bfc5296f79-kube-api-access-8tc9x\") pod \"keystone-b513-account-create-update-pn5wp\" (UID: \"4686994b-2137-4fbc-b470-32bfc5296f79\") " pod="openstack/keystone-b513-account-create-update-pn5wp" Dec 02 09:42:43 crc kubenswrapper[4781]: I1202 09:42:43.537602 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4686994b-2137-4fbc-b470-32bfc5296f79-operator-scripts\") pod \"keystone-b513-account-create-update-pn5wp\" (UID: \"4686994b-2137-4fbc-b470-32bfc5296f79\") " pod="openstack/keystone-b513-account-create-update-pn5wp" Dec 02 09:42:43 crc kubenswrapper[4781]: I1202 09:42:43.537654 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43441d30-70a7-429c-a674-51db8c4e8111-operator-scripts\") pod \"keystone-db-create-8528x\" (UID: \"43441d30-70a7-429c-a674-51db8c4e8111\") " pod="openstack/keystone-db-create-8528x" Dec 02 09:42:43 crc kubenswrapper[4781]: I1202 09:42:43.639379 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbqzp\" (UniqueName: \"kubernetes.io/projected/43441d30-70a7-429c-a674-51db8c4e8111-kube-api-access-xbqzp\") pod \"keystone-db-create-8528x\" (UID: \"43441d30-70a7-429c-a674-51db8c4e8111\") " pod="openstack/keystone-db-create-8528x" Dec 02 09:42:43 crc kubenswrapper[4781]: I1202 09:42:43.639438 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tc9x\" (UniqueName: \"kubernetes.io/projected/4686994b-2137-4fbc-b470-32bfc5296f79-kube-api-access-8tc9x\") pod \"keystone-b513-account-create-update-pn5wp\" (UID: \"4686994b-2137-4fbc-b470-32bfc5296f79\") " pod="openstack/keystone-b513-account-create-update-pn5wp" Dec 02 09:42:43 crc kubenswrapper[4781]: I1202 09:42:43.639608 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4686994b-2137-4fbc-b470-32bfc5296f79-operator-scripts\") pod \"keystone-b513-account-create-update-pn5wp\" (UID: \"4686994b-2137-4fbc-b470-32bfc5296f79\") " pod="openstack/keystone-b513-account-create-update-pn5wp" Dec 02 09:42:43 crc kubenswrapper[4781]: I1202 09:42:43.639666 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43441d30-70a7-429c-a674-51db8c4e8111-operator-scripts\") pod \"keystone-db-create-8528x\" (UID: \"43441d30-70a7-429c-a674-51db8c4e8111\") " 
pod="openstack/keystone-db-create-8528x" Dec 02 09:42:43 crc kubenswrapper[4781]: I1202 09:42:43.640999 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4686994b-2137-4fbc-b470-32bfc5296f79-operator-scripts\") pod \"keystone-b513-account-create-update-pn5wp\" (UID: \"4686994b-2137-4fbc-b470-32bfc5296f79\") " pod="openstack/keystone-b513-account-create-update-pn5wp" Dec 02 09:42:43 crc kubenswrapper[4781]: I1202 09:42:43.641001 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43441d30-70a7-429c-a674-51db8c4e8111-operator-scripts\") pod \"keystone-db-create-8528x\" (UID: \"43441d30-70a7-429c-a674-51db8c4e8111\") " pod="openstack/keystone-db-create-8528x" Dec 02 09:42:43 crc kubenswrapper[4781]: I1202 09:42:43.668658 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tc9x\" (UniqueName: \"kubernetes.io/projected/4686994b-2137-4fbc-b470-32bfc5296f79-kube-api-access-8tc9x\") pod \"keystone-b513-account-create-update-pn5wp\" (UID: \"4686994b-2137-4fbc-b470-32bfc5296f79\") " pod="openstack/keystone-b513-account-create-update-pn5wp" Dec 02 09:42:43 crc kubenswrapper[4781]: I1202 09:42:43.669175 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbqzp\" (UniqueName: \"kubernetes.io/projected/43441d30-70a7-429c-a674-51db8c4e8111-kube-api-access-xbqzp\") pod \"keystone-db-create-8528x\" (UID: \"43441d30-70a7-429c-a674-51db8c4e8111\") " pod="openstack/keystone-db-create-8528x" Dec 02 09:42:43 crc kubenswrapper[4781]: I1202 09:42:43.694174 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-2nd8t"] Dec 02 09:42:43 crc kubenswrapper[4781]: I1202 09:42:43.695249 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-2nd8t" Dec 02 09:42:43 crc kubenswrapper[4781]: I1202 09:42:43.716647 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-2nd8t"] Dec 02 09:42:43 crc kubenswrapper[4781]: I1202 09:42:43.743003 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f710b218-cbee-44a5-b8f3-80b4b2210cd8-operator-scripts\") pod \"placement-db-create-2nd8t\" (UID: \"f710b218-cbee-44a5-b8f3-80b4b2210cd8\") " pod="openstack/placement-db-create-2nd8t" Dec 02 09:42:43 crc kubenswrapper[4781]: I1202 09:42:43.743084 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pbpq\" (UniqueName: \"kubernetes.io/projected/f710b218-cbee-44a5-b8f3-80b4b2210cd8-kube-api-access-5pbpq\") pod \"placement-db-create-2nd8t\" (UID: \"f710b218-cbee-44a5-b8f3-80b4b2210cd8\") " pod="openstack/placement-db-create-2nd8t" Dec 02 09:42:43 crc kubenswrapper[4781]: I1202 09:42:43.749508 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b513-account-create-update-pn5wp" Dec 02 09:42:43 crc kubenswrapper[4781]: I1202 09:42:43.753902 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-8211-account-create-update-gpv69"] Dec 02 09:42:43 crc kubenswrapper[4781]: I1202 09:42:43.755177 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-8211-account-create-update-gpv69" Dec 02 09:42:43 crc kubenswrapper[4781]: I1202 09:42:43.759873 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 02 09:42:43 crc kubenswrapper[4781]: I1202 09:42:43.773451 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8211-account-create-update-gpv69"] Dec 02 09:42:43 crc kubenswrapper[4781]: I1202 09:42:43.808942 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-8528x" Dec 02 09:42:43 crc kubenswrapper[4781]: I1202 09:42:43.845533 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f710b218-cbee-44a5-b8f3-80b4b2210cd8-operator-scripts\") pod \"placement-db-create-2nd8t\" (UID: \"f710b218-cbee-44a5-b8f3-80b4b2210cd8\") " pod="openstack/placement-db-create-2nd8t" Dec 02 09:42:43 crc kubenswrapper[4781]: I1202 09:42:43.845611 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pbpq\" (UniqueName: \"kubernetes.io/projected/f710b218-cbee-44a5-b8f3-80b4b2210cd8-kube-api-access-5pbpq\") pod \"placement-db-create-2nd8t\" (UID: \"f710b218-cbee-44a5-b8f3-80b4b2210cd8\") " pod="openstack/placement-db-create-2nd8t" Dec 02 09:42:43 crc kubenswrapper[4781]: I1202 09:42:43.845847 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w4p7\" (UniqueName: \"kubernetes.io/projected/bf726ecc-d195-4c00-ad77-23f0875c6f2e-kube-api-access-7w4p7\") pod \"placement-8211-account-create-update-gpv69\" (UID: \"bf726ecc-d195-4c00-ad77-23f0875c6f2e\") " pod="openstack/placement-8211-account-create-update-gpv69" Dec 02 09:42:43 crc kubenswrapper[4781]: I1202 09:42:43.845909 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf726ecc-d195-4c00-ad77-23f0875c6f2e-operator-scripts\") pod \"placement-8211-account-create-update-gpv69\" (UID: \"bf726ecc-d195-4c00-ad77-23f0875c6f2e\") " pod="openstack/placement-8211-account-create-update-gpv69" Dec 02 09:42:43 crc kubenswrapper[4781]: I1202 09:42:43.846917 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f710b218-cbee-44a5-b8f3-80b4b2210cd8-operator-scripts\") pod \"placement-db-create-2nd8t\" (UID: \"f710b218-cbee-44a5-b8f3-80b4b2210cd8\") " pod="openstack/placement-db-create-2nd8t" Dec 02 09:42:43 crc kubenswrapper[4781]: I1202 09:42:43.864372 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pbpq\" (UniqueName: \"kubernetes.io/projected/f710b218-cbee-44a5-b8f3-80b4b2210cd8-kube-api-access-5pbpq\") pod \"placement-db-create-2nd8t\" (UID: \"f710b218-cbee-44a5-b8f3-80b4b2210cd8\") " pod="openstack/placement-db-create-2nd8t" Dec 02 09:42:43 crc kubenswrapper[4781]: I1202 09:42:43.948753 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7w4p7\" (UniqueName: \"kubernetes.io/projected/bf726ecc-d195-4c00-ad77-23f0875c6f2e-kube-api-access-7w4p7\") pod \"placement-8211-account-create-update-gpv69\" (UID: \"bf726ecc-d195-4c00-ad77-23f0875c6f2e\") " pod="openstack/placement-8211-account-create-update-gpv69" Dec 02 09:42:43 crc kubenswrapper[4781]: I1202 09:42:43.948909 4781 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf726ecc-d195-4c00-ad77-23f0875c6f2e-operator-scripts\") pod \"placement-8211-account-create-update-gpv69\" (UID: \"bf726ecc-d195-4c00-ad77-23f0875c6f2e\") " pod="openstack/placement-8211-account-create-update-gpv69" Dec 02 09:42:43 crc kubenswrapper[4781]: I1202 09:42:43.950537 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf726ecc-d195-4c00-ad77-23f0875c6f2e-operator-scripts\") pod \"placement-8211-account-create-update-gpv69\" (UID: \"bf726ecc-d195-4c00-ad77-23f0875c6f2e\") " pod="openstack/placement-8211-account-create-update-gpv69" Dec 02 09:42:43 crc kubenswrapper[4781]: I1202 09:42:43.972762 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w4p7\" (UniqueName: \"kubernetes.io/projected/bf726ecc-d195-4c00-ad77-23f0875c6f2e-kube-api-access-7w4p7\") pod \"placement-8211-account-create-update-gpv69\" (UID: \"bf726ecc-d195-4c00-ad77-23f0875c6f2e\") " pod="openstack/placement-8211-account-create-update-gpv69" Dec 02 09:42:44 crc kubenswrapper[4781]: I1202 09:42:44.062165 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-2nd8t" Dec 02 09:42:44 crc kubenswrapper[4781]: I1202 09:42:44.183629 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8211-account-create-update-gpv69" Dec 02 09:42:45 crc kubenswrapper[4781]: I1202 09:42:45.217118 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-8528x"] Dec 02 09:42:45 crc kubenswrapper[4781]: W1202 09:42:45.281529 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43441d30_70a7_429c_a674_51db8c4e8111.slice/crio-077b8094108458157d871a2d6cf6a01246ee73966cd7bb2ee181a32e35cdc89e WatchSource:0}: Error finding container 077b8094108458157d871a2d6cf6a01246ee73966cd7bb2ee181a32e35cdc89e: Status 404 returned error can't find the container with id 077b8094108458157d871a2d6cf6a01246ee73966cd7bb2ee181a32e35cdc89e Dec 02 09:42:45 crc kubenswrapper[4781]: I1202 09:42:45.358864 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b513-account-create-update-pn5wp"] Dec 02 09:42:45 crc kubenswrapper[4781]: W1202 09:42:45.364693 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4686994b_2137_4fbc_b470_32bfc5296f79.slice/crio-5849268f8b4ccad8155c4ddce0f5779d2b55ad9ca7dc1dd3cc7a570abc93638b WatchSource:0}: Error finding container 5849268f8b4ccad8155c4ddce0f5779d2b55ad9ca7dc1dd3cc7a570abc93638b: Status 404 returned error can't find the container with id 5849268f8b4ccad8155c4ddce0f5779d2b55ad9ca7dc1dd3cc7a570abc93638b Dec 02 09:42:45 crc kubenswrapper[4781]: I1202 09:42:45.457849 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8211-account-create-update-gpv69"] Dec 02 09:42:45 crc kubenswrapper[4781]: I1202 09:42:45.691864 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-2nd8t"] Dec 02 09:42:45 crc kubenswrapper[4781]: I1202 09:42:45.736326 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8211-account-create-update-gpv69" 
event={"ID":"bf726ecc-d195-4c00-ad77-23f0875c6f2e","Type":"ContainerStarted","Data":"b31064f3bf84a53d56ae42d54a19eb9d6bdef1b2f0be12a3cf24ad029d3971a5"} Dec 02 09:42:45 crc kubenswrapper[4781]: I1202 09:42:45.742425 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-8528x" event={"ID":"43441d30-70a7-429c-a674-51db8c4e8111","Type":"ContainerStarted","Data":"077b8094108458157d871a2d6cf6a01246ee73966cd7bb2ee181a32e35cdc89e"} Dec 02 09:42:45 crc kubenswrapper[4781]: I1202 09:42:45.743726 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b513-account-create-update-pn5wp" event={"ID":"4686994b-2137-4fbc-b470-32bfc5296f79","Type":"ContainerStarted","Data":"40a9d5823d43b57ee3824f88c2f5cbadff07261c55ee9724a63a6c826baef21f"} Dec 02 09:42:45 crc kubenswrapper[4781]: I1202 09:42:45.743751 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b513-account-create-update-pn5wp" event={"ID":"4686994b-2137-4fbc-b470-32bfc5296f79","Type":"ContainerStarted","Data":"5849268f8b4ccad8155c4ddce0f5779d2b55ad9ca7dc1dd3cc7a570abc93638b"} Dec 02 09:42:45 crc kubenswrapper[4781]: W1202 09:42:45.756139 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf710b218_cbee_44a5_b8f3_80b4b2210cd8.slice/crio-a045a5fe9ad48ad870c0cec46806e1f235a24c2c46548c5aec4b592608174e15 WatchSource:0}: Error finding container a045a5fe9ad48ad870c0cec46806e1f235a24c2c46548c5aec4b592608174e15: Status 404 returned error can't find the container with id a045a5fe9ad48ad870c0cec46806e1f235a24c2c46548c5aec4b592608174e15 Dec 02 09:42:45 crc kubenswrapper[4781]: I1202 09:42:45.797206 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-b513-account-create-update-pn5wp" podStartSLOduration=2.797183445 podStartE2EDuration="2.797183445s" podCreationTimestamp="2025-12-02 09:42:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:42:45.789548688 +0000 UTC m=+1328.613422707" watchObservedRunningTime="2025-12-02 09:42:45.797183445 +0000 UTC m=+1328.621057324" Dec 02 09:42:45 crc kubenswrapper[4781]: I1202 09:42:45.850073 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 02 09:42:45 crc kubenswrapper[4781]: I1202 09:42:45.851371 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-879pw"] Dec 02 09:42:45 crc kubenswrapper[4781]: I1202 09:42:45.853570 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-879pw" Dec 02 09:42:45 crc kubenswrapper[4781]: I1202 09:42:45.893721 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6efb5424-156a-4d85-865b-f057a1bdf098-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-879pw\" (UID: \"6efb5424-156a-4d85-865b-f057a1bdf098\") " pod="openstack/dnsmasq-dns-698758b865-879pw" Dec 02 09:42:45 crc kubenswrapper[4781]: I1202 09:42:45.893862 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6efb5424-156a-4d85-865b-f057a1bdf098-config\") pod \"dnsmasq-dns-698758b865-879pw\" (UID: \"6efb5424-156a-4d85-865b-f057a1bdf098\") " pod="openstack/dnsmasq-dns-698758b865-879pw" Dec 02 09:42:45 crc kubenswrapper[4781]: I1202 09:42:45.893889 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsl76\" (UniqueName: \"kubernetes.io/projected/6efb5424-156a-4d85-865b-f057a1bdf098-kube-api-access-qsl76\") pod \"dnsmasq-dns-698758b865-879pw\" (UID: \"6efb5424-156a-4d85-865b-f057a1bdf098\") " pod="openstack/dnsmasq-dns-698758b865-879pw" Dec 02 09:42:45 crc kubenswrapper[4781]: I1202 09:42:45.893932 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6efb5424-156a-4d85-865b-f057a1bdf098-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-879pw\" (UID: \"6efb5424-156a-4d85-865b-f057a1bdf098\") " pod="openstack/dnsmasq-dns-698758b865-879pw" Dec 02 09:42:45 crc kubenswrapper[4781]: I1202 09:42:45.893973 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6efb5424-156a-4d85-865b-f057a1bdf098-dns-svc\") pod \"dnsmasq-dns-698758b865-879pw\" (UID: \"6efb5424-156a-4d85-865b-f057a1bdf098\") " pod="openstack/dnsmasq-dns-698758b865-879pw" Dec 02 09:42:45 crc kubenswrapper[4781]: I1202 09:42:45.899585 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-879pw"] Dec 02 09:42:45 crc kubenswrapper[4781]: I1202 09:42:45.995950 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6efb5424-156a-4d85-865b-f057a1bdf098-config\") pod \"dnsmasq-dns-698758b865-879pw\" (UID: \"6efb5424-156a-4d85-865b-f057a1bdf098\") " pod="openstack/dnsmasq-dns-698758b865-879pw" Dec 02 09:42:45 crc kubenswrapper[4781]: I1202 09:42:45.996232 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsl76\" (UniqueName: \"kubernetes.io/projected/6efb5424-156a-4d85-865b-f057a1bdf098-kube-api-access-qsl76\") pod \"dnsmasq-dns-698758b865-879pw\" (UID: \"6efb5424-156a-4d85-865b-f057a1bdf098\") " pod="openstack/dnsmasq-dns-698758b865-879pw" Dec 02 09:42:45 crc kubenswrapper[4781]: I1202 09:42:45.996314 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6efb5424-156a-4d85-865b-f057a1bdf098-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-879pw\" (UID: \"6efb5424-156a-4d85-865b-f057a1bdf098\") " pod="openstack/dnsmasq-dns-698758b865-879pw" Dec 02 09:42:45 crc kubenswrapper[4781]: I1202 09:42:45.996404 4781 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6efb5424-156a-4d85-865b-f057a1bdf098-dns-svc\") pod \"dnsmasq-dns-698758b865-879pw\" (UID: \"6efb5424-156a-4d85-865b-f057a1bdf098\") " pod="openstack/dnsmasq-dns-698758b865-879pw" Dec 02 09:42:45 crc kubenswrapper[4781]: I1202 09:42:45.996537 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6efb5424-156a-4d85-865b-f057a1bdf098-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-879pw\" (UID: \"6efb5424-156a-4d85-865b-f057a1bdf098\") " pod="openstack/dnsmasq-dns-698758b865-879pw" Dec 02 09:42:45 crc kubenswrapper[4781]: I1202 09:42:45.997571 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6efb5424-156a-4d85-865b-f057a1bdf098-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-879pw\" (UID: \"6efb5424-156a-4d85-865b-f057a1bdf098\") " pod="openstack/dnsmasq-dns-698758b865-879pw" Dec 02 09:42:45 crc kubenswrapper[4781]: I1202 09:42:45.997727 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6efb5424-156a-4d85-865b-f057a1bdf098-config\") pod \"dnsmasq-dns-698758b865-879pw\" (UID: \"6efb5424-156a-4d85-865b-f057a1bdf098\") " pod="openstack/dnsmasq-dns-698758b865-879pw" Dec 02 09:42:45 crc kubenswrapper[4781]: I1202 09:42:45.998469 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6efb5424-156a-4d85-865b-f057a1bdf098-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-879pw\" (UID: \"6efb5424-156a-4d85-865b-f057a1bdf098\") " pod="openstack/dnsmasq-dns-698758b865-879pw" Dec 02 09:42:45 crc kubenswrapper[4781]: I1202 09:42:45.998679 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6efb5424-156a-4d85-865b-f057a1bdf098-dns-svc\") pod \"dnsmasq-dns-698758b865-879pw\" (UID: \"6efb5424-156a-4d85-865b-f057a1bdf098\") " pod="openstack/dnsmasq-dns-698758b865-879pw" Dec 02 09:42:46 crc kubenswrapper[4781]: I1202 09:42:46.026569 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsl76\" (UniqueName: \"kubernetes.io/projected/6efb5424-156a-4d85-865b-f057a1bdf098-kube-api-access-qsl76\") pod \"dnsmasq-dns-698758b865-879pw\" (UID: \"6efb5424-156a-4d85-865b-f057a1bdf098\") " pod="openstack/dnsmasq-dns-698758b865-879pw" Dec 02 09:42:46 crc kubenswrapper[4781]: I1202 09:42:46.184402 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-879pw" Dec 02 09:42:46 crc kubenswrapper[4781]: E1202 09:42:46.604008 4781 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4686994b_2137_4fbc_b470_32bfc5296f79.slice/crio-conmon-40a9d5823d43b57ee3824f88c2f5cbadff07261c55ee9724a63a6c826baef21f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf710b218_cbee_44a5_b8f3_80b4b2210cd8.slice/crio-de0d2ce2c109d3d641a30bcad85af2a13f3b19979f8ac1c0b0e6200e2cbcd6c6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43441d30_70a7_429c_a674_51db8c4e8111.slice/crio-3baa9c693a15be661f8c3833e3d3e240d8ea0df16b0941e949351c88151ebb43.scope\": RecentStats: unable to find data in memory cache]" Dec 02 09:42:46 crc kubenswrapper[4781]: I1202 09:42:46.725706 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-879pw"] Dec 02 09:42:46 crc kubenswrapper[4781]: I1202 09:42:46.752968 4781 generic.go:334] "Generic (PLEG): container finished" podID="bf726ecc-d195-4c00-ad77-23f0875c6f2e" containerID="e7a11ac23e9b0a8d1da3f8811f5b760fa5c3def063c432682a58c7a905f2911b" exitCode=0 Dec 02 09:42:46 crc kubenswrapper[4781]: I1202 09:42:46.753279 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8211-account-create-update-gpv69" event={"ID":"bf726ecc-d195-4c00-ad77-23f0875c6f2e","Type":"ContainerDied","Data":"e7a11ac23e9b0a8d1da3f8811f5b760fa5c3def063c432682a58c7a905f2911b"} Dec 02 09:42:46 crc kubenswrapper[4781]: I1202 09:42:46.758205 4781 generic.go:334] "Generic (PLEG): container finished" podID="f710b218-cbee-44a5-b8f3-80b4b2210cd8" containerID="de0d2ce2c109d3d641a30bcad85af2a13f3b19979f8ac1c0b0e6200e2cbcd6c6" exitCode=0 Dec 02 09:42:46 crc kubenswrapper[4781]: I1202 09:42:46.758270 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-2nd8t" event={"ID":"f710b218-cbee-44a5-b8f3-80b4b2210cd8","Type":"ContainerDied","Data":"de0d2ce2c109d3d641a30bcad85af2a13f3b19979f8ac1c0b0e6200e2cbcd6c6"} Dec 02 09:42:46 crc kubenswrapper[4781]: I1202 09:42:46.758297 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-2nd8t" event={"ID":"f710b218-cbee-44a5-b8f3-80b4b2210cd8","Type":"ContainerStarted","Data":"a045a5fe9ad48ad870c0cec46806e1f235a24c2c46548c5aec4b592608174e15"} Dec 02 09:42:46 crc kubenswrapper[4781]: I1202 09:42:46.760179 4781 generic.go:334] "Generic (PLEG): container finished" podID="43441d30-70a7-429c-a674-51db8c4e8111" containerID="3baa9c693a15be661f8c3833e3d3e240d8ea0df16b0941e949351c88151ebb43" exitCode=0 Dec 02 09:42:46 crc kubenswrapper[4781]: I1202 09:42:46.760412 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-8528x" event={"ID":"43441d30-70a7-429c-a674-51db8c4e8111","Type":"ContainerDied","Data":"3baa9c693a15be661f8c3833e3d3e240d8ea0df16b0941e949351c88151ebb43"} Dec 02 09:42:46 crc kubenswrapper[4781]: I1202 09:42:46.761956 4781 generic.go:334] "Generic (PLEG): container finished" podID="4686994b-2137-4fbc-b470-32bfc5296f79" containerID="40a9d5823d43b57ee3824f88c2f5cbadff07261c55ee9724a63a6c826baef21f" exitCode=0 Dec 02 09:42:46 crc kubenswrapper[4781]: I1202 09:42:46.762012 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-b513-account-create-update-pn5wp" event={"ID":"4686994b-2137-4fbc-b470-32bfc5296f79","Type":"ContainerDied","Data":"40a9d5823d43b57ee3824f88c2f5cbadff07261c55ee9724a63a6c826baef21f"} Dec 02 09:42:46 crc kubenswrapper[4781]: I1202 09:42:46.763019 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-879pw" event={"ID":"6efb5424-156a-4d85-865b-f057a1bdf098","Type":"ContainerStarted","Data":"3c4c5cfea9d4d1f6f7c5b7ab6b1f112791b5cffaddcd437c93fe69d2509c1dc0"} Dec 02 09:42:46 crc kubenswrapper[4781]: I1202 09:42:46.995316 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.006036 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.008558 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.010162 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-fmdnp" Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.010200 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.010259 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.010311 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.121877 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"d8cf953d-c4c9-457e-956c-d2942b56499b\") " pod="openstack/swift-storage-0" Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.121945 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d8cf953d-c4c9-457e-956c-d2942b56499b-lock\") pod \"swift-storage-0\" (UID: \"d8cf953d-c4c9-457e-956c-d2942b56499b\") " pod="openstack/swift-storage-0" Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.122009 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d8cf953d-c4c9-457e-956c-d2942b56499b-cache\") pod \"swift-storage-0\" (UID: \"d8cf953d-c4c9-457e-956c-d2942b56499b\") " pod="openstack/swift-storage-0" Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.122031 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99krc\" (UniqueName: \"kubernetes.io/projected/d8cf953d-c4c9-457e-956c-d2942b56499b-kube-api-access-99krc\") pod \"swift-storage-0\" (UID: \"d8cf953d-c4c9-457e-956c-d2942b56499b\") " pod="openstack/swift-storage-0" Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.122051 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d8cf953d-c4c9-457e-956c-d2942b56499b-etc-swift\") pod \"swift-storage-0\" (UID: \"d8cf953d-c4c9-457e-956c-d2942b56499b\") " pod="openstack/swift-storage-0" Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 
Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.223676 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"d8cf953d-c4c9-457e-956c-d2942b56499b\") " pod="openstack/swift-storage-0"
Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.223720 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d8cf953d-c4c9-457e-956c-d2942b56499b-lock\") pod \"swift-storage-0\" (UID: \"d8cf953d-c4c9-457e-956c-d2942b56499b\") " pod="openstack/swift-storage-0"
Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.223778 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d8cf953d-c4c9-457e-956c-d2942b56499b-cache\") pod \"swift-storage-0\" (UID: \"d8cf953d-c4c9-457e-956c-d2942b56499b\") " pod="openstack/swift-storage-0"
Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.223802 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99krc\" (UniqueName: \"kubernetes.io/projected/d8cf953d-c4c9-457e-956c-d2942b56499b-kube-api-access-99krc\") pod \"swift-storage-0\" (UID: \"d8cf953d-c4c9-457e-956c-d2942b56499b\") " pod="openstack/swift-storage-0"
Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.223826 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d8cf953d-c4c9-457e-956c-d2942b56499b-etc-swift\") pod \"swift-storage-0\" (UID: \"d8cf953d-c4c9-457e-956c-d2942b56499b\") " pod="openstack/swift-storage-0"
Dec 02 09:42:47 crc kubenswrapper[4781]: E1202 09:42:47.224024 4781 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Dec 02 09:42:47 crc kubenswrapper[4781]: E1202 09:42:47.224038 4781 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.224058 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"d8cf953d-c4c9-457e-956c-d2942b56499b\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/swift-storage-0"
Dec 02 09:42:47 crc kubenswrapper[4781]: E1202 09:42:47.224086 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d8cf953d-c4c9-457e-956c-d2942b56499b-etc-swift podName:d8cf953d-c4c9-457e-956c-d2942b56499b nodeName:}" failed. No retries permitted until 2025-12-02 09:42:47.724070066 +0000 UTC m=+1330.547943945 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d8cf953d-c4c9-457e-956c-d2942b56499b-etc-swift") pod "swift-storage-0" (UID: "d8cf953d-c4c9-457e-956c-d2942b56499b") : configmap "swift-ring-files" not found
Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.224513 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d8cf953d-c4c9-457e-956c-d2942b56499b-lock\") pod \"swift-storage-0\" (UID: \"d8cf953d-c4c9-457e-956c-d2942b56499b\") " pod="openstack/swift-storage-0"
Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.224592 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d8cf953d-c4c9-457e-956c-d2942b56499b-cache\") pod \"swift-storage-0\" (UID: \"d8cf953d-c4c9-457e-956c-d2942b56499b\") " pod="openstack/swift-storage-0"
Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.243013 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99krc\" (UniqueName: \"kubernetes.io/projected/d8cf953d-c4c9-457e-956c-d2942b56499b-kube-api-access-99krc\") pod \"swift-storage-0\" (UID: \"d8cf953d-c4c9-457e-956c-d2942b56499b\") " pod="openstack/swift-storage-0"
Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.246043 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"d8cf953d-c4c9-457e-956c-d2942b56499b\") " pod="openstack/swift-storage-0"
Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.280235 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.549222 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-wxg6m"]
Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.550559 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-wxg6m"
Need to start a new one" pod="openstack/swift-ring-rebalance-wxg6m" Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.552246 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.552426 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.552951 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.587955 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-wxg6m"] Dec 02 09:42:47 crc kubenswrapper[4781]: E1202 09:42:47.588704 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-tj6jn ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-tj6jn ring-data-devices scripts swiftconf]: context canceled" pod="openstack/swift-ring-rebalance-wxg6m" podUID="21710ce9-b3c8-4530-a151-07f9a9e31676" Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.601851 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-p9svs"] Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.602988 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-p9svs" Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.609413 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-p9svs"] Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.619148 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-wxg6m"] Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.738845 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d15a32bb-043f-4019-8ddb-4bcc54b243a0-scripts\") pod \"swift-ring-rebalance-p9svs\" (UID: \"d15a32bb-043f-4019-8ddb-4bcc54b243a0\") " pod="openstack/swift-ring-rebalance-p9svs" Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.738888 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/21710ce9-b3c8-4530-a151-07f9a9e31676-swiftconf\") pod \"swift-ring-rebalance-wxg6m\" (UID: \"21710ce9-b3c8-4530-a151-07f9a9e31676\") " pod="openstack/swift-ring-rebalance-wxg6m" Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.738915 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d15a32bb-043f-4019-8ddb-4bcc54b243a0-swiftconf\") pod \"swift-ring-rebalance-p9svs\" (UID: \"d15a32bb-043f-4019-8ddb-4bcc54b243a0\") " pod="openstack/swift-ring-rebalance-p9svs" Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.738971 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tj6jn\" (UniqueName: \"kubernetes.io/projected/21710ce9-b3c8-4530-a151-07f9a9e31676-kube-api-access-tj6jn\") pod \"swift-ring-rebalance-wxg6m\" (UID: \"21710ce9-b3c8-4530-a151-07f9a9e31676\") " pod="openstack/swift-ring-rebalance-wxg6m" Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 
Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.738993 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/21710ce9-b3c8-4530-a151-07f9a9e31676-dispersionconf\") pod \"swift-ring-rebalance-wxg6m\" (UID: \"21710ce9-b3c8-4530-a151-07f9a9e31676\") " pod="openstack/swift-ring-rebalance-wxg6m"
Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.739146 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d15a32bb-043f-4019-8ddb-4bcc54b243a0-ring-data-devices\") pod \"swift-ring-rebalance-p9svs\" (UID: \"d15a32bb-043f-4019-8ddb-4bcc54b243a0\") " pod="openstack/swift-ring-rebalance-p9svs"
Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.739254 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt87m\" (UniqueName: \"kubernetes.io/projected/d15a32bb-043f-4019-8ddb-4bcc54b243a0-kube-api-access-tt87m\") pod \"swift-ring-rebalance-p9svs\" (UID: \"d15a32bb-043f-4019-8ddb-4bcc54b243a0\") " pod="openstack/swift-ring-rebalance-p9svs"
Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.739405 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21710ce9-b3c8-4530-a151-07f9a9e31676-combined-ca-bundle\") pod \"swift-ring-rebalance-wxg6m\" (UID: \"21710ce9-b3c8-4530-a151-07f9a9e31676\") " pod="openstack/swift-ring-rebalance-wxg6m"
Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.739429 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/21710ce9-b3c8-4530-a151-07f9a9e31676-scripts\") pod \"swift-ring-rebalance-wxg6m\" (UID: \"21710ce9-b3c8-4530-a151-07f9a9e31676\") " pod="openstack/swift-ring-rebalance-wxg6m"
Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.739465 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d8cf953d-c4c9-457e-956c-d2942b56499b-etc-swift\") pod \"swift-storage-0\" (UID: \"d8cf953d-c4c9-457e-956c-d2942b56499b\") " pod="openstack/swift-storage-0"
Dec 02 09:42:47 crc kubenswrapper[4781]: E1202 09:42:47.739961 4781 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.739976 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/21710ce9-b3c8-4530-a151-07f9a9e31676-etc-swift\") pod \"swift-ring-rebalance-wxg6m\" (UID: \"21710ce9-b3c8-4530-a151-07f9a9e31676\") " pod="openstack/swift-ring-rebalance-wxg6m"
Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.740017 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d15a32bb-043f-4019-8ddb-4bcc54b243a0-dispersionconf\") pod \"swift-ring-rebalance-p9svs\" (UID: \"d15a32bb-043f-4019-8ddb-4bcc54b243a0\") " pod="openstack/swift-ring-rebalance-p9svs"
Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.740048 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/21710ce9-b3c8-4530-a151-07f9a9e31676-ring-data-devices\") pod \"swift-ring-rebalance-wxg6m\" (UID: \"21710ce9-b3c8-4530-a151-07f9a9e31676\") " pod="openstack/swift-ring-rebalance-wxg6m"
\"kubernetes.io/configmap/21710ce9-b3c8-4530-a151-07f9a9e31676-ring-data-devices\") pod \"swift-ring-rebalance-wxg6m\" (UID: \"21710ce9-b3c8-4530-a151-07f9a9e31676\") " pod="openstack/swift-ring-rebalance-wxg6m" Dec 02 09:42:47 crc kubenswrapper[4781]: E1202 09:42:47.739982 4781 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.740097 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d15a32bb-043f-4019-8ddb-4bcc54b243a0-combined-ca-bundle\") pod \"swift-ring-rebalance-p9svs\" (UID: \"d15a32bb-043f-4019-8ddb-4bcc54b243a0\") " pod="openstack/swift-ring-rebalance-p9svs" Dec 02 09:42:47 crc kubenswrapper[4781]: E1202 09:42:47.740106 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d8cf953d-c4c9-457e-956c-d2942b56499b-etc-swift podName:d8cf953d-c4c9-457e-956c-d2942b56499b nodeName:}" failed. No retries permitted until 2025-12-02 09:42:48.74008805 +0000 UTC m=+1331.563961929 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d8cf953d-c4c9-457e-956c-d2942b56499b-etc-swift") pod "swift-storage-0" (UID: "d8cf953d-c4c9-457e-956c-d2942b56499b") : configmap "swift-ring-files" not found Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.740168 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d15a32bb-043f-4019-8ddb-4bcc54b243a0-etc-swift\") pod \"swift-ring-rebalance-p9svs\" (UID: \"d15a32bb-043f-4019-8ddb-4bcc54b243a0\") " pod="openstack/swift-ring-rebalance-p9svs" Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.770917 4781 generic.go:334] "Generic (PLEG): container finished" podID="6efb5424-156a-4d85-865b-f057a1bdf098" containerID="8e409c4c46bd9771662c67f2104628cb7084c94e502d3be7ccde3886234402e4" exitCode=0 Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.770967 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-879pw" event={"ID":"6efb5424-156a-4d85-865b-f057a1bdf098","Type":"ContainerDied","Data":"8e409c4c46bd9771662c67f2104628cb7084c94e502d3be7ccde3886234402e4"} Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.771027 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-wxg6m" Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.861008 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d15a32bb-043f-4019-8ddb-4bcc54b243a0-combined-ca-bundle\") pod \"swift-ring-rebalance-p9svs\" (UID: \"d15a32bb-043f-4019-8ddb-4bcc54b243a0\") " pod="openstack/swift-ring-rebalance-p9svs" Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.861112 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d15a32bb-043f-4019-8ddb-4bcc54b243a0-etc-swift\") pod \"swift-ring-rebalance-p9svs\" (UID: \"d15a32bb-043f-4019-8ddb-4bcc54b243a0\") " pod="openstack/swift-ring-rebalance-p9svs" Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.861212 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d15a32bb-043f-4019-8ddb-4bcc54b243a0-scripts\") pod \"swift-ring-rebalance-p9svs\" (UID: \"d15a32bb-043f-4019-8ddb-4bcc54b243a0\") " pod="openstack/swift-ring-rebalance-p9svs" Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.861243 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/21710ce9-b3c8-4530-a151-07f9a9e31676-swiftconf\") pod \"swift-ring-rebalance-wxg6m\" (UID: \"21710ce9-b3c8-4530-a151-07f9a9e31676\") " pod="openstack/swift-ring-rebalance-wxg6m" Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.861286 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d15a32bb-043f-4019-8ddb-4bcc54b243a0-swiftconf\") pod \"swift-ring-rebalance-p9svs\" (UID: \"d15a32bb-043f-4019-8ddb-4bcc54b243a0\") " pod="openstack/swift-ring-rebalance-p9svs" Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.861384 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tj6jn\" (UniqueName: \"kubernetes.io/projected/21710ce9-b3c8-4530-a151-07f9a9e31676-kube-api-access-tj6jn\") pod \"swift-ring-rebalance-wxg6m\" (UID: \"21710ce9-b3c8-4530-a151-07f9a9e31676\") " pod="openstack/swift-ring-rebalance-wxg6m" Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.861421 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/21710ce9-b3c8-4530-a151-07f9a9e31676-dispersionconf\") pod \"swift-ring-rebalance-wxg6m\" (UID: \"21710ce9-b3c8-4530-a151-07f9a9e31676\") " pod="openstack/swift-ring-rebalance-wxg6m" Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.861461 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d15a32bb-043f-4019-8ddb-4bcc54b243a0-ring-data-devices\") pod \"swift-ring-rebalance-p9svs\" (UID: \"d15a32bb-043f-4019-8ddb-4bcc54b243a0\") " pod="openstack/swift-ring-rebalance-p9svs" Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.861496 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tt87m\" (UniqueName: \"kubernetes.io/projected/d15a32bb-043f-4019-8ddb-4bcc54b243a0-kube-api-access-tt87m\") pod \"swift-ring-rebalance-p9svs\" (UID: \"d15a32bb-043f-4019-8ddb-4bcc54b243a0\") " pod="openstack/swift-ring-rebalance-p9svs" Dec 02 09:42:47 crc 
Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.861877 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21710ce9-b3c8-4530-a151-07f9a9e31676-combined-ca-bundle\") pod \"swift-ring-rebalance-wxg6m\" (UID: \"21710ce9-b3c8-4530-a151-07f9a9e31676\") " pod="openstack/swift-ring-rebalance-wxg6m"
Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.861919 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/21710ce9-b3c8-4530-a151-07f9a9e31676-scripts\") pod \"swift-ring-rebalance-wxg6m\" (UID: \"21710ce9-b3c8-4530-a151-07f9a9e31676\") " pod="openstack/swift-ring-rebalance-wxg6m"
Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.862007 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/21710ce9-b3c8-4530-a151-07f9a9e31676-etc-swift\") pod \"swift-ring-rebalance-wxg6m\" (UID: \"21710ce9-b3c8-4530-a151-07f9a9e31676\") " pod="openstack/swift-ring-rebalance-wxg6m"
Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.862052 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d15a32bb-043f-4019-8ddb-4bcc54b243a0-dispersionconf\") pod \"swift-ring-rebalance-p9svs\" (UID: \"d15a32bb-043f-4019-8ddb-4bcc54b243a0\") " pod="openstack/swift-ring-rebalance-p9svs"
Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.862093 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/21710ce9-b3c8-4530-a151-07f9a9e31676-ring-data-devices\") pod \"swift-ring-rebalance-wxg6m\" (UID: \"21710ce9-b3c8-4530-a151-07f9a9e31676\") " pod="openstack/swift-ring-rebalance-wxg6m"
Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.862569 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d15a32bb-043f-4019-8ddb-4bcc54b243a0-etc-swift\") pod \"swift-ring-rebalance-p9svs\" (UID: \"d15a32bb-043f-4019-8ddb-4bcc54b243a0\") " pod="openstack/swift-ring-rebalance-p9svs"
Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.863741 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d15a32bb-043f-4019-8ddb-4bcc54b243a0-scripts\") pod \"swift-ring-rebalance-p9svs\" (UID: \"d15a32bb-043f-4019-8ddb-4bcc54b243a0\") " pod="openstack/swift-ring-rebalance-p9svs"
Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.864072 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/21710ce9-b3c8-4530-a151-07f9a9e31676-etc-swift\") pod \"swift-ring-rebalance-wxg6m\" (UID: \"21710ce9-b3c8-4530-a151-07f9a9e31676\") " pod="openstack/swift-ring-rebalance-wxg6m"
Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.864086 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d15a32bb-043f-4019-8ddb-4bcc54b243a0-ring-data-devices\") pod \"swift-ring-rebalance-p9svs\" (UID: \"d15a32bb-043f-4019-8ddb-4bcc54b243a0\") " pod="openstack/swift-ring-rebalance-p9svs"
Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.864138 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/21710ce9-b3c8-4530-a151-07f9a9e31676-scripts\") pod \"swift-ring-rebalance-wxg6m\" (UID: \"21710ce9-b3c8-4530-a151-07f9a9e31676\") " pod="openstack/swift-ring-rebalance-wxg6m"
\"kubernetes.io/configmap/21710ce9-b3c8-4530-a151-07f9a9e31676-scripts\") pod \"swift-ring-rebalance-wxg6m\" (UID: \"21710ce9-b3c8-4530-a151-07f9a9e31676\") " pod="openstack/swift-ring-rebalance-wxg6m" Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.867051 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/21710ce9-b3c8-4530-a151-07f9a9e31676-ring-data-devices\") pod \"swift-ring-rebalance-wxg6m\" (UID: \"21710ce9-b3c8-4530-a151-07f9a9e31676\") " pod="openstack/swift-ring-rebalance-wxg6m" Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.869482 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d15a32bb-043f-4019-8ddb-4bcc54b243a0-combined-ca-bundle\") pod \"swift-ring-rebalance-p9svs\" (UID: \"d15a32bb-043f-4019-8ddb-4bcc54b243a0\") " pod="openstack/swift-ring-rebalance-p9svs" Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.869818 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/21710ce9-b3c8-4530-a151-07f9a9e31676-dispersionconf\") pod \"swift-ring-rebalance-wxg6m\" (UID: \"21710ce9-b3c8-4530-a151-07f9a9e31676\") " pod="openstack/swift-ring-rebalance-wxg6m" Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.881643 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d15a32bb-043f-4019-8ddb-4bcc54b243a0-dispersionconf\") pod \"swift-ring-rebalance-p9svs\" (UID: \"d15a32bb-043f-4019-8ddb-4bcc54b243a0\") " pod="openstack/swift-ring-rebalance-p9svs" Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.881808 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/21710ce9-b3c8-4530-a151-07f9a9e31676-swiftconf\") pod \"swift-ring-rebalance-wxg6m\" (UID: \"21710ce9-b3c8-4530-a151-07f9a9e31676\") " pod="openstack/swift-ring-rebalance-wxg6m" Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.881841 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tj6jn\" (UniqueName: \"kubernetes.io/projected/21710ce9-b3c8-4530-a151-07f9a9e31676-kube-api-access-tj6jn\") pod \"swift-ring-rebalance-wxg6m\" (UID: \"21710ce9-b3c8-4530-a151-07f9a9e31676\") " pod="openstack/swift-ring-rebalance-wxg6m" Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.882334 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt87m\" (UniqueName: \"kubernetes.io/projected/d15a32bb-043f-4019-8ddb-4bcc54b243a0-kube-api-access-tt87m\") pod \"swift-ring-rebalance-p9svs\" (UID: \"d15a32bb-043f-4019-8ddb-4bcc54b243a0\") " pod="openstack/swift-ring-rebalance-p9svs" Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.885408 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21710ce9-b3c8-4530-a151-07f9a9e31676-combined-ca-bundle\") pod \"swift-ring-rebalance-wxg6m\" (UID: \"21710ce9-b3c8-4530-a151-07f9a9e31676\") " pod="openstack/swift-ring-rebalance-wxg6m" Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.891645 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d15a32bb-043f-4019-8ddb-4bcc54b243a0-swiftconf\") pod \"swift-ring-rebalance-p9svs\" (UID: 
\"d15a32bb-043f-4019-8ddb-4bcc54b243a0\") " pod="openstack/swift-ring-rebalance-p9svs" Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.919537 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-p9svs" Dec 02 09:42:47 crc kubenswrapper[4781]: I1202 09:42:47.956937 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-wxg6m" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.065268 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/21710ce9-b3c8-4530-a151-07f9a9e31676-swiftconf\") pod \"21710ce9-b3c8-4530-a151-07f9a9e31676\" (UID: \"21710ce9-b3c8-4530-a151-07f9a9e31676\") " Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.065918 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tj6jn\" (UniqueName: \"kubernetes.io/projected/21710ce9-b3c8-4530-a151-07f9a9e31676-kube-api-access-tj6jn\") pod \"21710ce9-b3c8-4530-a151-07f9a9e31676\" (UID: \"21710ce9-b3c8-4530-a151-07f9a9e31676\") " Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.066116 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21710ce9-b3c8-4530-a151-07f9a9e31676-combined-ca-bundle\") pod \"21710ce9-b3c8-4530-a151-07f9a9e31676\" (UID: \"21710ce9-b3c8-4530-a151-07f9a9e31676\") " Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.066299 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/21710ce9-b3c8-4530-a151-07f9a9e31676-ring-data-devices\") pod \"21710ce9-b3c8-4530-a151-07f9a9e31676\" (UID: \"21710ce9-b3c8-4530-a151-07f9a9e31676\") " Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.066470 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/21710ce9-b3c8-4530-a151-07f9a9e31676-etc-swift\") pod \"21710ce9-b3c8-4530-a151-07f9a9e31676\" (UID: \"21710ce9-b3c8-4530-a151-07f9a9e31676\") " Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.066596 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/21710ce9-b3c8-4530-a151-07f9a9e31676-scripts\") pod \"21710ce9-b3c8-4530-a151-07f9a9e31676\" (UID: \"21710ce9-b3c8-4530-a151-07f9a9e31676\") " Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.066702 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/21710ce9-b3c8-4530-a151-07f9a9e31676-dispersionconf\") pod \"21710ce9-b3c8-4530-a151-07f9a9e31676\" (UID: \"21710ce9-b3c8-4530-a151-07f9a9e31676\") " Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.071160 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21710ce9-b3c8-4530-a151-07f9a9e31676-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "21710ce9-b3c8-4530-a151-07f9a9e31676" (UID: "21710ce9-b3c8-4530-a151-07f9a9e31676"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.071636 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21710ce9-b3c8-4530-a151-07f9a9e31676-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "21710ce9-b3c8-4530-a151-07f9a9e31676" (UID: "21710ce9-b3c8-4530-a151-07f9a9e31676"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.072340 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21710ce9-b3c8-4530-a151-07f9a9e31676-scripts" (OuterVolumeSpecName: "scripts") pod "21710ce9-b3c8-4530-a151-07f9a9e31676" (UID: "21710ce9-b3c8-4530-a151-07f9a9e31676"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.074693 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21710ce9-b3c8-4530-a151-07f9a9e31676-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "21710ce9-b3c8-4530-a151-07f9a9e31676" (UID: "21710ce9-b3c8-4530-a151-07f9a9e31676"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.076108 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21710ce9-b3c8-4530-a151-07f9a9e31676-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "21710ce9-b3c8-4530-a151-07f9a9e31676" (UID: "21710ce9-b3c8-4530-a151-07f9a9e31676"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.104702 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21710ce9-b3c8-4530-a151-07f9a9e31676-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "21710ce9-b3c8-4530-a151-07f9a9e31676" (UID: "21710ce9-b3c8-4530-a151-07f9a9e31676"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.106195 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21710ce9-b3c8-4530-a151-07f9a9e31676-kube-api-access-tj6jn" (OuterVolumeSpecName: "kube-api-access-tj6jn") pod "21710ce9-b3c8-4530-a151-07f9a9e31676" (UID: "21710ce9-b3c8-4530-a151-07f9a9e31676"). InnerVolumeSpecName "kube-api-access-tj6jn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.129094 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-b513-account-create-update-pn5wp" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.168479 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tc9x\" (UniqueName: \"kubernetes.io/projected/4686994b-2137-4fbc-b470-32bfc5296f79-kube-api-access-8tc9x\") pod \"4686994b-2137-4fbc-b470-32bfc5296f79\" (UID: \"4686994b-2137-4fbc-b470-32bfc5296f79\") " Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.168643 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4686994b-2137-4fbc-b470-32bfc5296f79-operator-scripts\") pod \"4686994b-2137-4fbc-b470-32bfc5296f79\" (UID: \"4686994b-2137-4fbc-b470-32bfc5296f79\") " Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.169020 4781 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/21710ce9-b3c8-4530-a151-07f9a9e31676-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.169033 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/21710ce9-b3c8-4530-a151-07f9a9e31676-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.169041 4781 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/21710ce9-b3c8-4530-a151-07f9a9e31676-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.169052 4781 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/21710ce9-b3c8-4530-a151-07f9a9e31676-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.169059 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tj6jn\" (UniqueName: \"kubernetes.io/projected/21710ce9-b3c8-4530-a151-07f9a9e31676-kube-api-access-tj6jn\") on node \"crc\" DevicePath \"\"" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.169068 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21710ce9-b3c8-4530-a151-07f9a9e31676-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.169076 4781 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/21710ce9-b3c8-4530-a151-07f9a9e31676-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.169172 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4686994b-2137-4fbc-b470-32bfc5296f79-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4686994b-2137-4fbc-b470-32bfc5296f79" (UID: "4686994b-2137-4fbc-b470-32bfc5296f79"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.180104 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4686994b-2137-4fbc-b470-32bfc5296f79-kube-api-access-8tc9x" (OuterVolumeSpecName: "kube-api-access-8tc9x") pod "4686994b-2137-4fbc-b470-32bfc5296f79" (UID: "4686994b-2137-4fbc-b470-32bfc5296f79"). InnerVolumeSpecName "kube-api-access-8tc9x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.273880 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tc9x\" (UniqueName: \"kubernetes.io/projected/4686994b-2137-4fbc-b470-32bfc5296f79-kube-api-access-8tc9x\") on node \"crc\" DevicePath \"\"" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.273908 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4686994b-2137-4fbc-b470-32bfc5296f79-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.331748 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8211-account-create-update-gpv69" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.356857 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-2nd8t" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.375908 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf726ecc-d195-4c00-ad77-23f0875c6f2e-operator-scripts\") pod \"bf726ecc-d195-4c00-ad77-23f0875c6f2e\" (UID: \"bf726ecc-d195-4c00-ad77-23f0875c6f2e\") " Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.376043 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7w4p7\" (UniqueName: \"kubernetes.io/projected/bf726ecc-d195-4c00-ad77-23f0875c6f2e-kube-api-access-7w4p7\") pod \"bf726ecc-d195-4c00-ad77-23f0875c6f2e\" (UID: \"bf726ecc-d195-4c00-ad77-23f0875c6f2e\") " Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.376989 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf726ecc-d195-4c00-ad77-23f0875c6f2e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bf726ecc-d195-4c00-ad77-23f0875c6f2e" (UID: "bf726ecc-d195-4c00-ad77-23f0875c6f2e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.381721 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf726ecc-d195-4c00-ad77-23f0875c6f2e-kube-api-access-7w4p7" (OuterVolumeSpecName: "kube-api-access-7w4p7") pod "bf726ecc-d195-4c00-ad77-23f0875c6f2e" (UID: "bf726ecc-d195-4c00-ad77-23f0875c6f2e"). InnerVolumeSpecName "kube-api-access-7w4p7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.401568 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-8528x" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.477352 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbqzp\" (UniqueName: \"kubernetes.io/projected/43441d30-70a7-429c-a674-51db8c4e8111-kube-api-access-xbqzp\") pod \"43441d30-70a7-429c-a674-51db8c4e8111\" (UID: \"43441d30-70a7-429c-a674-51db8c4e8111\") " Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.477499 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43441d30-70a7-429c-a674-51db8c4e8111-operator-scripts\") pod \"43441d30-70a7-429c-a674-51db8c4e8111\" (UID: \"43441d30-70a7-429c-a674-51db8c4e8111\") " Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.477580 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f710b218-cbee-44a5-b8f3-80b4b2210cd8-operator-scripts\") pod \"f710b218-cbee-44a5-b8f3-80b4b2210cd8\" (UID: \"f710b218-cbee-44a5-b8f3-80b4b2210cd8\") " Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.477690 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pbpq\" (UniqueName: \"kubernetes.io/projected/f710b218-cbee-44a5-b8f3-80b4b2210cd8-kube-api-access-5pbpq\") pod \"f710b218-cbee-44a5-b8f3-80b4b2210cd8\" (UID: \"f710b218-cbee-44a5-b8f3-80b4b2210cd8\") " Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.477865 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43441d30-70a7-429c-a674-51db8c4e8111-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "43441d30-70a7-429c-a674-51db8c4e8111" (UID: "43441d30-70a7-429c-a674-51db8c4e8111"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.478292 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f710b218-cbee-44a5-b8f3-80b4b2210cd8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f710b218-cbee-44a5-b8f3-80b4b2210cd8" (UID: "f710b218-cbee-44a5-b8f3-80b4b2210cd8"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.478379 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf726ecc-d195-4c00-ad77-23f0875c6f2e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.478411 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43441d30-70a7-429c-a674-51db8c4e8111-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.478432 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7w4p7\" (UniqueName: \"kubernetes.io/projected/bf726ecc-d195-4c00-ad77-23f0875c6f2e-kube-api-access-7w4p7\") on node \"crc\" DevicePath \"\"" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.481712 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43441d30-70a7-429c-a674-51db8c4e8111-kube-api-access-xbqzp" (OuterVolumeSpecName: "kube-api-access-xbqzp") pod "43441d30-70a7-429c-a674-51db8c4e8111" (UID: "43441d30-70a7-429c-a674-51db8c4e8111"). InnerVolumeSpecName "kube-api-access-xbqzp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.484109 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f710b218-cbee-44a5-b8f3-80b4b2210cd8-kube-api-access-5pbpq" (OuterVolumeSpecName: "kube-api-access-5pbpq") pod "f710b218-cbee-44a5-b8f3-80b4b2210cd8" (UID: "f710b218-cbee-44a5-b8f3-80b4b2210cd8"). InnerVolumeSpecName "kube-api-access-5pbpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.546835 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-p9svs"] Dec 02 09:42:48 crc kubenswrapper[4781]: W1202 09:42:48.550181 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd15a32bb_043f_4019_8ddb_4bcc54b243a0.slice/crio-4cb949438d24a9a92e53118103f1c94da6daf9d2a8f86ec45736a385b501102e WatchSource:0}: Error finding container 4cb949438d24a9a92e53118103f1c94da6daf9d2a8f86ec45736a385b501102e: Status 404 returned error can't find the container with id 4cb949438d24a9a92e53118103f1c94da6daf9d2a8f86ec45736a385b501102e Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.580500 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbqzp\" (UniqueName: \"kubernetes.io/projected/43441d30-70a7-429c-a674-51db8c4e8111-kube-api-access-xbqzp\") on node \"crc\" DevicePath \"\"" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.580545 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f710b218-cbee-44a5-b8f3-80b4b2210cd8-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.580555 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pbpq\" (UniqueName: \"kubernetes.io/projected/f710b218-cbee-44a5-b8f3-80b4b2210cd8-kube-api-access-5pbpq\") on node \"crc\" DevicePath \"\"" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.779767 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8211-account-create-update-gpv69" 
event={"ID":"bf726ecc-d195-4c00-ad77-23f0875c6f2e","Type":"ContainerDied","Data":"b31064f3bf84a53d56ae42d54a19eb9d6bdef1b2f0be12a3cf24ad029d3971a5"} Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.779811 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b31064f3bf84a53d56ae42d54a19eb9d6bdef1b2f0be12a3cf24ad029d3971a5" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.779883 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8211-account-create-update-gpv69" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.783039 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d8cf953d-c4c9-457e-956c-d2942b56499b-etc-swift\") pod \"swift-storage-0\" (UID: \"d8cf953d-c4c9-457e-956c-d2942b56499b\") " pod="openstack/swift-storage-0" Dec 02 09:42:48 crc kubenswrapper[4781]: E1202 09:42:48.783219 4781 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 02 09:42:48 crc kubenswrapper[4781]: E1202 09:42:48.783237 4781 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.783228 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-2nd8t" event={"ID":"f710b218-cbee-44a5-b8f3-80b4b2210cd8","Type":"ContainerDied","Data":"a045a5fe9ad48ad870c0cec46806e1f235a24c2c46548c5aec4b592608174e15"} Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.783270 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a045a5fe9ad48ad870c0cec46806e1f235a24c2c46548c5aec4b592608174e15" Dec 02 09:42:48 crc kubenswrapper[4781]: E1202 09:42:48.783276 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d8cf953d-c4c9-457e-956c-d2942b56499b-etc-swift podName:d8cf953d-c4c9-457e-956c-d2942b56499b nodeName:}" failed. No retries permitted until 2025-12-02 09:42:50.78326309 +0000 UTC m=+1333.607136969 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d8cf953d-c4c9-457e-956c-d2942b56499b-etc-swift") pod "swift-storage-0" (UID: "d8cf953d-c4c9-457e-956c-d2942b56499b") : configmap "swift-ring-files" not found Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.783335 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-2nd8t" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.802684 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-p9svs" event={"ID":"d15a32bb-043f-4019-8ddb-4bcc54b243a0","Type":"ContainerStarted","Data":"4cb949438d24a9a92e53118103f1c94da6daf9d2a8f86ec45736a385b501102e"} Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.805069 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-8528x" event={"ID":"43441d30-70a7-429c-a674-51db8c4e8111","Type":"ContainerDied","Data":"077b8094108458157d871a2d6cf6a01246ee73966cd7bb2ee181a32e35cdc89e"} Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.805092 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="077b8094108458157d871a2d6cf6a01246ee73966cd7bb2ee181a32e35cdc89e" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.805144 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-8528x" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.813478 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b513-account-create-update-pn5wp" event={"ID":"4686994b-2137-4fbc-b470-32bfc5296f79","Type":"ContainerDied","Data":"5849268f8b4ccad8155c4ddce0f5779d2b55ad9ca7dc1dd3cc7a570abc93638b"} Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.813511 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5849268f8b4ccad8155c4ddce0f5779d2b55ad9ca7dc1dd3cc7a570abc93638b" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.813561 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b513-account-create-update-pn5wp" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.820458 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-wxg6m" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.820486 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-879pw" event={"ID":"6efb5424-156a-4d85-865b-f057a1bdf098","Type":"ContainerStarted","Data":"10dae2396adaf19a5e67a1d6d7232adc47b131f7ec741b75de3e2361e60012ec"} Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.820567 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-879pw" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.844454 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-879pw" podStartSLOduration=3.844431802 podStartE2EDuration="3.844431802s" podCreationTimestamp="2025-12-02 09:42:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:42:48.838218034 +0000 UTC m=+1331.662091913" watchObservedRunningTime="2025-12-02 09:42:48.844431802 +0000 UTC m=+1331.668305681" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.875956 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-wxg6m"] Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.880741 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-wxg6m"] Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.994611 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-s6nvd"] Dec 02 09:42:48 crc kubenswrapper[4781]: E1202 09:42:48.994958 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf726ecc-d195-4c00-ad77-23f0875c6f2e" containerName="mariadb-account-create-update" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.994973 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf726ecc-d195-4c00-ad77-23f0875c6f2e" containerName="mariadb-account-create-update" Dec 02 09:42:48 crc kubenswrapper[4781]: E1202 09:42:48.994987 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f710b218-cbee-44a5-b8f3-80b4b2210cd8" containerName="mariadb-database-create" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.994993 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f710b218-cbee-44a5-b8f3-80b4b2210cd8" containerName="mariadb-database-create" Dec 02 09:42:48 crc kubenswrapper[4781]: E1202 09:42:48.995003 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4686994b-2137-4fbc-b470-32bfc5296f79" containerName="mariadb-account-create-update" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.995009 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="4686994b-2137-4fbc-b470-32bfc5296f79" containerName="mariadb-account-create-update" Dec 02 09:42:48 crc kubenswrapper[4781]: E1202 09:42:48.995027 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43441d30-70a7-429c-a674-51db8c4e8111" containerName="mariadb-database-create" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.995033 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="43441d30-70a7-429c-a674-51db8c4e8111" containerName="mariadb-database-create" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.995174 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="4686994b-2137-4fbc-b470-32bfc5296f79" containerName="mariadb-account-create-update" Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 
Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.995183 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f710b218-cbee-44a5-b8f3-80b4b2210cd8" containerName="mariadb-database-create"
Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.995198 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="43441d30-70a7-429c-a674-51db8c4e8111" containerName="mariadb-database-create"
Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.995209 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf726ecc-d195-4c00-ad77-23f0875c6f2e" containerName="mariadb-account-create-update"
Dec 02 09:42:48 crc kubenswrapper[4781]: I1202 09:42:48.995752 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-s6nvd"
Dec 02 09:42:49 crc kubenswrapper[4781]: I1202 09:42:49.012855 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-s6nvd"]
Dec 02 09:42:49 crc kubenswrapper[4781]: I1202 09:42:49.082689 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-0ff3-account-create-update-4ncb9"]
Dec 02 09:42:49 crc kubenswrapper[4781]: I1202 09:42:49.084191 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-0ff3-account-create-update-4ncb9"
Dec 02 09:42:49 crc kubenswrapper[4781]: I1202 09:42:49.086354 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Dec 02 09:42:49 crc kubenswrapper[4781]: I1202 09:42:49.088492 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtclm\" (UniqueName: \"kubernetes.io/projected/6ae7a228-a973-4207-9556-debe4ae4eb2a-kube-api-access-gtclm\") pod \"glance-db-create-s6nvd\" (UID: \"6ae7a228-a973-4207-9556-debe4ae4eb2a\") " pod="openstack/glance-db-create-s6nvd"
Dec 02 09:42:49 crc kubenswrapper[4781]: I1202 09:42:49.088644 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ae7a228-a973-4207-9556-debe4ae4eb2a-operator-scripts\") pod \"glance-db-create-s6nvd\" (UID: \"6ae7a228-a973-4207-9556-debe4ae4eb2a\") " pod="openstack/glance-db-create-s6nvd"
Dec 02 09:42:49 crc kubenswrapper[4781]: I1202 09:42:49.095719 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-0ff3-account-create-update-4ncb9"]
Dec 02 09:42:49 crc kubenswrapper[4781]: I1202 09:42:49.189819 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtclm\" (UniqueName: \"kubernetes.io/projected/6ae7a228-a973-4207-9556-debe4ae4eb2a-kube-api-access-gtclm\") pod \"glance-db-create-s6nvd\" (UID: \"6ae7a228-a973-4207-9556-debe4ae4eb2a\") " pod="openstack/glance-db-create-s6nvd"
Dec 02 09:42:49 crc kubenswrapper[4781]: I1202 09:42:49.189881 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87wv8\" (UniqueName: \"kubernetes.io/projected/17abf61f-c972-4ab9-87db-2083b58f69fa-kube-api-access-87wv8\") pod \"glance-0ff3-account-create-update-4ncb9\" (UID: \"17abf61f-c972-4ab9-87db-2083b58f69fa\") " pod="openstack/glance-0ff3-account-create-update-4ncb9"
Dec 02 09:42:49 crc kubenswrapper[4781]: I1202 09:42:49.189957 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ae7a228-a973-4207-9556-debe4ae4eb2a-operator-scripts\") pod \"glance-db-create-s6nvd\" (UID: \"6ae7a228-a973-4207-9556-debe4ae4eb2a\") " pod="openstack/glance-db-create-s6nvd"
\"glance-db-create-s6nvd\" (UID: \"6ae7a228-a973-4207-9556-debe4ae4eb2a\") " pod="openstack/glance-db-create-s6nvd" Dec 02 09:42:49 crc kubenswrapper[4781]: I1202 09:42:49.190003 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17abf61f-c972-4ab9-87db-2083b58f69fa-operator-scripts\") pod \"glance-0ff3-account-create-update-4ncb9\" (UID: \"17abf61f-c972-4ab9-87db-2083b58f69fa\") " pod="openstack/glance-0ff3-account-create-update-4ncb9" Dec 02 09:42:49 crc kubenswrapper[4781]: I1202 09:42:49.190769 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ae7a228-a973-4207-9556-debe4ae4eb2a-operator-scripts\") pod \"glance-db-create-s6nvd\" (UID: \"6ae7a228-a973-4207-9556-debe4ae4eb2a\") " pod="openstack/glance-db-create-s6nvd" Dec 02 09:42:49 crc kubenswrapper[4781]: I1202 09:42:49.207456 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtclm\" (UniqueName: \"kubernetes.io/projected/6ae7a228-a973-4207-9556-debe4ae4eb2a-kube-api-access-gtclm\") pod \"glance-db-create-s6nvd\" (UID: \"6ae7a228-a973-4207-9556-debe4ae4eb2a\") " pod="openstack/glance-db-create-s6nvd" Dec 02 09:42:49 crc kubenswrapper[4781]: I1202 09:42:49.291261 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87wv8\" (UniqueName: \"kubernetes.io/projected/17abf61f-c972-4ab9-87db-2083b58f69fa-kube-api-access-87wv8\") pod \"glance-0ff3-account-create-update-4ncb9\" (UID: \"17abf61f-c972-4ab9-87db-2083b58f69fa\") " pod="openstack/glance-0ff3-account-create-update-4ncb9" Dec 02 09:42:49 crc kubenswrapper[4781]: I1202 09:42:49.291372 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17abf61f-c972-4ab9-87db-2083b58f69fa-operator-scripts\") pod \"glance-0ff3-account-create-update-4ncb9\" (UID: \"17abf61f-c972-4ab9-87db-2083b58f69fa\") " pod="openstack/glance-0ff3-account-create-update-4ncb9" Dec 02 09:42:49 crc kubenswrapper[4781]: I1202 09:42:49.293385 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17abf61f-c972-4ab9-87db-2083b58f69fa-operator-scripts\") pod \"glance-0ff3-account-create-update-4ncb9\" (UID: \"17abf61f-c972-4ab9-87db-2083b58f69fa\") " pod="openstack/glance-0ff3-account-create-update-4ncb9" Dec 02 09:42:49 crc kubenswrapper[4781]: I1202 09:42:49.308577 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87wv8\" (UniqueName: \"kubernetes.io/projected/17abf61f-c972-4ab9-87db-2083b58f69fa-kube-api-access-87wv8\") pod \"glance-0ff3-account-create-update-4ncb9\" (UID: \"17abf61f-c972-4ab9-87db-2083b58f69fa\") " pod="openstack/glance-0ff3-account-create-update-4ncb9" Dec 02 09:42:49 crc kubenswrapper[4781]: I1202 09:42:49.315556 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-s6nvd" Dec 02 09:42:49 crc kubenswrapper[4781]: I1202 09:42:49.397538 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-0ff3-account-create-update-4ncb9" Dec 02 09:42:49 crc kubenswrapper[4781]: I1202 09:42:49.528492 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21710ce9-b3c8-4530-a151-07f9a9e31676" path="/var/lib/kubelet/pods/21710ce9-b3c8-4530-a151-07f9a9e31676/volumes" Dec 02 09:42:49 crc kubenswrapper[4781]: I1202 09:42:49.828481 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-s6nvd"] Dec 02 09:42:49 crc kubenswrapper[4781]: I1202 09:42:49.915606 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-0ff3-account-create-update-4ncb9"] Dec 02 09:42:49 crc kubenswrapper[4781]: W1202 09:42:49.923853 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17abf61f_c972_4ab9_87db_2083b58f69fa.slice/crio-c216be88cf301591ed95430ea1a8e105a78962818d142c9b1828509723ded0fc WatchSource:0}: Error finding container c216be88cf301591ed95430ea1a8e105a78962818d142c9b1828509723ded0fc: Status 404 returned error can't find the container with id c216be88cf301591ed95430ea1a8e105a78962818d142c9b1828509723ded0fc Dec 02 09:42:50 crc kubenswrapper[4781]: I1202 09:42:50.816686 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d8cf953d-c4c9-457e-956c-d2942b56499b-etc-swift\") pod \"swift-storage-0\" (UID: \"d8cf953d-c4c9-457e-956c-d2942b56499b\") " pod="openstack/swift-storage-0" Dec 02 09:42:50 crc kubenswrapper[4781]: E1202 09:42:50.816903 4781 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 02 09:42:50 crc kubenswrapper[4781]: E1202 09:42:50.817288 4781 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 02 09:42:50 crc kubenswrapper[4781]: E1202 09:42:50.817348 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d8cf953d-c4c9-457e-956c-d2942b56499b-etc-swift podName:d8cf953d-c4c9-457e-956c-d2942b56499b nodeName:}" failed. No retries permitted until 2025-12-02 09:42:54.817330167 +0000 UTC m=+1337.641204046 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d8cf953d-c4c9-457e-956c-d2942b56499b-etc-swift") pod "swift-storage-0" (UID: "d8cf953d-c4c9-457e-956c-d2942b56499b") : configmap "swift-ring-files" not found Dec 02 09:42:50 crc kubenswrapper[4781]: I1202 09:42:50.855271 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-s6nvd" event={"ID":"6ae7a228-a973-4207-9556-debe4ae4eb2a","Type":"ContainerStarted","Data":"83b98bad06a8544fdedb2b6c741c5a2e67611df56bd1ddea50be40fbf9211644"} Dec 02 09:42:50 crc kubenswrapper[4781]: I1202 09:42:50.855315 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-s6nvd" event={"ID":"6ae7a228-a973-4207-9556-debe4ae4eb2a","Type":"ContainerStarted","Data":"f42b818ea3f74bb88a4a42268cb44311543e33d07b75bd6f63697d998827a4d1"} Dec 02 09:42:50 crc kubenswrapper[4781]: I1202 09:42:50.857666 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-0ff3-account-create-update-4ncb9" event={"ID":"17abf61f-c972-4ab9-87db-2083b58f69fa","Type":"ContainerStarted","Data":"88094bbbfde71ba9ceaedf7924eccbdc0912eebcc7f6266f3042f790af408615"} Dec 02 09:42:50 crc kubenswrapper[4781]: I1202 09:42:50.857703 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-0ff3-account-create-update-4ncb9" event={"ID":"17abf61f-c972-4ab9-87db-2083b58f69fa","Type":"ContainerStarted","Data":"c216be88cf301591ed95430ea1a8e105a78962818d142c9b1828509723ded0fc"} Dec 02 09:42:50 crc kubenswrapper[4781]: I1202 09:42:50.870802 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-s6nvd" podStartSLOduration=2.87077722 podStartE2EDuration="2.87077722s" podCreationTimestamp="2025-12-02 09:42:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:42:50.86850911 +0000 UTC m=+1333.692382989" watchObservedRunningTime="2025-12-02 09:42:50.87077722 +0000 UTC m=+1333.694651119" Dec 02 09:42:50 crc kubenswrapper[4781]: I1202 09:42:50.887829 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-0ff3-account-create-update-4ncb9" podStartSLOduration=1.88781023 podStartE2EDuration="1.88781023s" podCreationTimestamp="2025-12-02 09:42:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:42:50.887300727 +0000 UTC m=+1333.711174606" watchObservedRunningTime="2025-12-02 09:42:50.88781023 +0000 UTC m=+1333.711684109" Dec 02 09:42:51 crc kubenswrapper[4781]: I1202 09:42:51.865560 4781 generic.go:334] "Generic (PLEG): container finished" podID="17abf61f-c972-4ab9-87db-2083b58f69fa" containerID="88094bbbfde71ba9ceaedf7924eccbdc0912eebcc7f6266f3042f790af408615" exitCode=0 Dec 02 09:42:51 crc kubenswrapper[4781]: I1202 09:42:51.865602 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-0ff3-account-create-update-4ncb9" event={"ID":"17abf61f-c972-4ab9-87db-2083b58f69fa","Type":"ContainerDied","Data":"88094bbbfde71ba9ceaedf7924eccbdc0912eebcc7f6266f3042f790af408615"} Dec 02 09:42:51 crc kubenswrapper[4781]: I1202 09:42:51.868910 4781 generic.go:334] "Generic (PLEG): container finished" podID="6ae7a228-a973-4207-9556-debe4ae4eb2a" containerID="83b98bad06a8544fdedb2b6c741c5a2e67611df56bd1ddea50be40fbf9211644" exitCode=0 Dec 02 09:42:51 crc kubenswrapper[4781]: I1202 
09:42:51.868956 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-s6nvd" event={"ID":"6ae7a228-a973-4207-9556-debe4ae4eb2a","Type":"ContainerDied","Data":"83b98bad06a8544fdedb2b6c741c5a2e67611df56bd1ddea50be40fbf9211644"} Dec 02 09:42:52 crc kubenswrapper[4781]: I1202 09:42:52.881595 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-p9svs" event={"ID":"d15a32bb-043f-4019-8ddb-4bcc54b243a0","Type":"ContainerStarted","Data":"16bc06b62e8ef9a8685661b61fe145273fa47eb2e965a6a47a64679324ce503c"} Dec 02 09:42:52 crc kubenswrapper[4781]: I1202 09:42:52.897562 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-p9svs" podStartSLOduration=2.144169506 podStartE2EDuration="5.897542861s" podCreationTimestamp="2025-12-02 09:42:47 +0000 UTC" firstStartedPulling="2025-12-02 09:42:48.552150889 +0000 UTC m=+1331.376024768" lastFinishedPulling="2025-12-02 09:42:52.305524254 +0000 UTC m=+1335.129398123" observedRunningTime="2025-12-02 09:42:52.896790461 +0000 UTC m=+1335.720664360" watchObservedRunningTime="2025-12-02 09:42:52.897542861 +0000 UTC m=+1335.721416730" Dec 02 09:42:53 crc kubenswrapper[4781]: I1202 09:42:53.324215 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-s6nvd" Dec 02 09:42:53 crc kubenswrapper[4781]: I1202 09:42:53.330001 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-0ff3-account-create-update-4ncb9" Dec 02 09:42:53 crc kubenswrapper[4781]: I1202 09:42:53.455161 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87wv8\" (UniqueName: \"kubernetes.io/projected/17abf61f-c972-4ab9-87db-2083b58f69fa-kube-api-access-87wv8\") pod \"17abf61f-c972-4ab9-87db-2083b58f69fa\" (UID: \"17abf61f-c972-4ab9-87db-2083b58f69fa\") " Dec 02 09:42:53 crc kubenswrapper[4781]: I1202 09:42:53.455243 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ae7a228-a973-4207-9556-debe4ae4eb2a-operator-scripts\") pod \"6ae7a228-a973-4207-9556-debe4ae4eb2a\" (UID: \"6ae7a228-a973-4207-9556-debe4ae4eb2a\") " Dec 02 09:42:53 crc kubenswrapper[4781]: I1202 09:42:53.455309 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17abf61f-c972-4ab9-87db-2083b58f69fa-operator-scripts\") pod \"17abf61f-c972-4ab9-87db-2083b58f69fa\" (UID: \"17abf61f-c972-4ab9-87db-2083b58f69fa\") " Dec 02 09:42:53 crc kubenswrapper[4781]: I1202 09:42:53.455434 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtclm\" (UniqueName: \"kubernetes.io/projected/6ae7a228-a973-4207-9556-debe4ae4eb2a-kube-api-access-gtclm\") pod \"6ae7a228-a973-4207-9556-debe4ae4eb2a\" (UID: \"6ae7a228-a973-4207-9556-debe4ae4eb2a\") " Dec 02 09:42:53 crc kubenswrapper[4781]: I1202 09:42:53.455987 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ae7a228-a973-4207-9556-debe4ae4eb2a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6ae7a228-a973-4207-9556-debe4ae4eb2a" (UID: "6ae7a228-a973-4207-9556-debe4ae4eb2a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:42:53 crc kubenswrapper[4781]: I1202 09:42:53.456242 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17abf61f-c972-4ab9-87db-2083b58f69fa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "17abf61f-c972-4ab9-87db-2083b58f69fa" (UID: "17abf61f-c972-4ab9-87db-2083b58f69fa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:42:53 crc kubenswrapper[4781]: I1202 09:42:53.456485 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ae7a228-a973-4207-9556-debe4ae4eb2a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:42:53 crc kubenswrapper[4781]: I1202 09:42:53.456507 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17abf61f-c972-4ab9-87db-2083b58f69fa-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:42:53 crc kubenswrapper[4781]: I1202 09:42:53.460832 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17abf61f-c972-4ab9-87db-2083b58f69fa-kube-api-access-87wv8" (OuterVolumeSpecName: "kube-api-access-87wv8") pod "17abf61f-c972-4ab9-87db-2083b58f69fa" (UID: "17abf61f-c972-4ab9-87db-2083b58f69fa"). InnerVolumeSpecName "kube-api-access-87wv8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:42:53 crc kubenswrapper[4781]: I1202 09:42:53.463123 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ae7a228-a973-4207-9556-debe4ae4eb2a-kube-api-access-gtclm" (OuterVolumeSpecName: "kube-api-access-gtclm") pod "6ae7a228-a973-4207-9556-debe4ae4eb2a" (UID: "6ae7a228-a973-4207-9556-debe4ae4eb2a"). InnerVolumeSpecName "kube-api-access-gtclm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:42:53 crc kubenswrapper[4781]: I1202 09:42:53.558208 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtclm\" (UniqueName: \"kubernetes.io/projected/6ae7a228-a973-4207-9556-debe4ae4eb2a-kube-api-access-gtclm\") on node \"crc\" DevicePath \"\"" Dec 02 09:42:53 crc kubenswrapper[4781]: I1202 09:42:53.558474 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87wv8\" (UniqueName: \"kubernetes.io/projected/17abf61f-c972-4ab9-87db-2083b58f69fa-kube-api-access-87wv8\") on node \"crc\" DevicePath \"\"" Dec 02 09:42:53 crc kubenswrapper[4781]: I1202 09:42:53.892454 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-0ff3-account-create-update-4ncb9" event={"ID":"17abf61f-c972-4ab9-87db-2083b58f69fa","Type":"ContainerDied","Data":"c216be88cf301591ed95430ea1a8e105a78962818d142c9b1828509723ded0fc"} Dec 02 09:42:53 crc kubenswrapper[4781]: I1202 09:42:53.892493 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c216be88cf301591ed95430ea1a8e105a78962818d142c9b1828509723ded0fc" Dec 02 09:42:53 crc kubenswrapper[4781]: I1202 09:42:53.892542 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-0ff3-account-create-update-4ncb9" Dec 02 09:42:53 crc kubenswrapper[4781]: I1202 09:42:53.895239 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-s6nvd" event={"ID":"6ae7a228-a973-4207-9556-debe4ae4eb2a","Type":"ContainerDied","Data":"f42b818ea3f74bb88a4a42268cb44311543e33d07b75bd6f63697d998827a4d1"} Dec 02 09:42:53 crc kubenswrapper[4781]: I1202 09:42:53.895287 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f42b818ea3f74bb88a4a42268cb44311543e33d07b75bd6f63697d998827a4d1" Dec 02 09:42:53 crc kubenswrapper[4781]: I1202 09:42:53.895344 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-s6nvd" Dec 02 09:42:54 crc kubenswrapper[4781]: I1202 09:42:54.875943 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d8cf953d-c4c9-457e-956c-d2942b56499b-etc-swift\") pod \"swift-storage-0\" (UID: \"d8cf953d-c4c9-457e-956c-d2942b56499b\") " pod="openstack/swift-storage-0" Dec 02 09:42:54 crc kubenswrapper[4781]: E1202 09:42:54.876145 4781 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 02 09:42:54 crc kubenswrapper[4781]: E1202 09:42:54.876163 4781 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 02 09:42:54 crc kubenswrapper[4781]: E1202 09:42:54.876212 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d8cf953d-c4c9-457e-956c-d2942b56499b-etc-swift podName:d8cf953d-c4c9-457e-956c-d2942b56499b nodeName:}" failed. No retries permitted until 2025-12-02 09:43:02.876194912 +0000 UTC m=+1345.700068801 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d8cf953d-c4c9-457e-956c-d2942b56499b-etc-swift") pod "swift-storage-0" (UID: "d8cf953d-c4c9-457e-956c-d2942b56499b") : configmap "swift-ring-files" not found Dec 02 09:42:56 crc kubenswrapper[4781]: I1202 09:42:56.186064 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-879pw" Dec 02 09:42:56 crc kubenswrapper[4781]: I1202 09:42:56.233882 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-zc9s5"] Dec 02 09:42:56 crc kubenswrapper[4781]: I1202 09:42:56.234151 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-zc9s5" podUID="244b353d-3185-4eda-9d02-c22719d2a514" containerName="dnsmasq-dns" containerID="cri-o://c08dfa216a14f1b5f92faf7d6e1d93ef4a950f425199cd650ab41ec13cb5de98" gracePeriod=10 Dec 02 09:42:56 crc kubenswrapper[4781]: I1202 09:42:56.921896 4781 generic.go:334] "Generic (PLEG): container finished" podID="244b353d-3185-4eda-9d02-c22719d2a514" containerID="c08dfa216a14f1b5f92faf7d6e1d93ef4a950f425199cd650ab41ec13cb5de98" exitCode=0 Dec 02 09:42:56 crc kubenswrapper[4781]: I1202 09:42:56.922568 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-zc9s5" event={"ID":"244b353d-3185-4eda-9d02-c22719d2a514","Type":"ContainerDied","Data":"c08dfa216a14f1b5f92faf7d6e1d93ef4a950f425199cd650ab41ec13cb5de98"} Dec 02 09:42:57 crc kubenswrapper[4781]: I1202 09:42:57.260474 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-zc9s5" Dec 02 09:42:57 crc kubenswrapper[4781]: I1202 09:42:57.314663 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75jdh\" (UniqueName: \"kubernetes.io/projected/244b353d-3185-4eda-9d02-c22719d2a514-kube-api-access-75jdh\") pod \"244b353d-3185-4eda-9d02-c22719d2a514\" (UID: \"244b353d-3185-4eda-9d02-c22719d2a514\") " Dec 02 09:42:57 crc kubenswrapper[4781]: I1202 09:42:57.314831 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/244b353d-3185-4eda-9d02-c22719d2a514-ovsdbserver-sb\") pod \"244b353d-3185-4eda-9d02-c22719d2a514\" (UID: \"244b353d-3185-4eda-9d02-c22719d2a514\") " Dec 02 09:42:57 crc kubenswrapper[4781]: I1202 09:42:57.314860 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/244b353d-3185-4eda-9d02-c22719d2a514-config\") pod \"244b353d-3185-4eda-9d02-c22719d2a514\" (UID: \"244b353d-3185-4eda-9d02-c22719d2a514\") " Dec 02 09:42:57 crc kubenswrapper[4781]: I1202 09:42:57.314889 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/244b353d-3185-4eda-9d02-c22719d2a514-ovsdbserver-nb\") pod \"244b353d-3185-4eda-9d02-c22719d2a514\" (UID: \"244b353d-3185-4eda-9d02-c22719d2a514\") " Dec 02 09:42:57 crc kubenswrapper[4781]: I1202 09:42:57.315022 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/244b353d-3185-4eda-9d02-c22719d2a514-dns-svc\") pod \"244b353d-3185-4eda-9d02-c22719d2a514\" (UID: \"244b353d-3185-4eda-9d02-c22719d2a514\") " Dec 02 09:42:57 crc kubenswrapper[4781]: I1202 09:42:57.322061 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/244b353d-3185-4eda-9d02-c22719d2a514-kube-api-access-75jdh" (OuterVolumeSpecName: "kube-api-access-75jdh") pod "244b353d-3185-4eda-9d02-c22719d2a514" (UID: "244b353d-3185-4eda-9d02-c22719d2a514"). InnerVolumeSpecName "kube-api-access-75jdh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:42:57 crc kubenswrapper[4781]: I1202 09:42:57.358866 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/244b353d-3185-4eda-9d02-c22719d2a514-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "244b353d-3185-4eda-9d02-c22719d2a514" (UID: "244b353d-3185-4eda-9d02-c22719d2a514"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:42:57 crc kubenswrapper[4781]: I1202 09:42:57.360140 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/244b353d-3185-4eda-9d02-c22719d2a514-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "244b353d-3185-4eda-9d02-c22719d2a514" (UID: "244b353d-3185-4eda-9d02-c22719d2a514"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:42:57 crc kubenswrapper[4781]: I1202 09:42:57.365942 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/244b353d-3185-4eda-9d02-c22719d2a514-config" (OuterVolumeSpecName: "config") pod "244b353d-3185-4eda-9d02-c22719d2a514" (UID: "244b353d-3185-4eda-9d02-c22719d2a514"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:42:57 crc kubenswrapper[4781]: I1202 09:42:57.387266 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/244b353d-3185-4eda-9d02-c22719d2a514-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "244b353d-3185-4eda-9d02-c22719d2a514" (UID: "244b353d-3185-4eda-9d02-c22719d2a514"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:42:57 crc kubenswrapper[4781]: I1202 09:42:57.417856 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75jdh\" (UniqueName: \"kubernetes.io/projected/244b353d-3185-4eda-9d02-c22719d2a514-kube-api-access-75jdh\") on node \"crc\" DevicePath \"\"" Dec 02 09:42:57 crc kubenswrapper[4781]: I1202 09:42:57.417905 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/244b353d-3185-4eda-9d02-c22719d2a514-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 09:42:57 crc kubenswrapper[4781]: I1202 09:42:57.417917 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/244b353d-3185-4eda-9d02-c22719d2a514-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:42:57 crc kubenswrapper[4781]: I1202 09:42:57.417942 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/244b353d-3185-4eda-9d02-c22719d2a514-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 09:42:57 crc kubenswrapper[4781]: I1202 09:42:57.417953 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/244b353d-3185-4eda-9d02-c22719d2a514-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 09:42:57 crc kubenswrapper[4781]: I1202 09:42:57.931287 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-zc9s5" event={"ID":"244b353d-3185-4eda-9d02-c22719d2a514","Type":"ContainerDied","Data":"ef1053fd063e5b924a6b44aab0d9c795ee94443263aee809025fa424b37cf93b"} Dec 02 09:42:57 crc kubenswrapper[4781]: I1202 09:42:57.931360 4781 scope.go:117] "RemoveContainer" containerID="c08dfa216a14f1b5f92faf7d6e1d93ef4a950f425199cd650ab41ec13cb5de98" Dec 02 09:42:57 crc kubenswrapper[4781]: I1202 09:42:57.931359 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-zc9s5" Dec 02 09:42:57 crc kubenswrapper[4781]: I1202 09:42:57.957890 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-zc9s5"] Dec 02 09:42:57 crc kubenswrapper[4781]: I1202 09:42:57.959311 4781 scope.go:117] "RemoveContainer" containerID="f056c08ca7e6fd30a7309773bf67db627a209f33f234b9349472728f3406a296" Dec 02 09:42:57 crc kubenswrapper[4781]: I1202 09:42:57.964618 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-zc9s5"] Dec 02 09:42:59 crc kubenswrapper[4781]: I1202 09:42:59.332227 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-t6bqc"] Dec 02 09:42:59 crc kubenswrapper[4781]: E1202 09:42:59.332621 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="244b353d-3185-4eda-9d02-c22719d2a514" containerName="init" Dec 02 09:42:59 crc kubenswrapper[4781]: I1202 09:42:59.332637 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="244b353d-3185-4eda-9d02-c22719d2a514" containerName="init" Dec 02 09:42:59 crc kubenswrapper[4781]: E1202 09:42:59.332659 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17abf61f-c972-4ab9-87db-2083b58f69fa" containerName="mariadb-account-create-update" Dec 02 09:42:59 crc kubenswrapper[4781]: I1202 09:42:59.332668 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="17abf61f-c972-4ab9-87db-2083b58f69fa" containerName="mariadb-account-create-update" Dec 02 09:42:59 crc kubenswrapper[4781]: E1202 09:42:59.332695 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="244b353d-3185-4eda-9d02-c22719d2a514" containerName="dnsmasq-dns" Dec 02 09:42:59 crc kubenswrapper[4781]: I1202 09:42:59.332703 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="244b353d-3185-4eda-9d02-c22719d2a514" containerName="dnsmasq-dns" Dec 02 09:42:59 crc kubenswrapper[4781]: E1202 09:42:59.332735 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ae7a228-a973-4207-9556-debe4ae4eb2a" containerName="mariadb-database-create" Dec 02 09:42:59 crc kubenswrapper[4781]: I1202 09:42:59.332743 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ae7a228-a973-4207-9556-debe4ae4eb2a" containerName="mariadb-database-create" Dec 02 09:42:59 crc kubenswrapper[4781]: I1202 09:42:59.332955 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="17abf61f-c972-4ab9-87db-2083b58f69fa" containerName="mariadb-account-create-update" Dec 02 09:42:59 crc kubenswrapper[4781]: I1202 09:42:59.332982 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ae7a228-a973-4207-9556-debe4ae4eb2a" containerName="mariadb-database-create" Dec 02 09:42:59 crc kubenswrapper[4781]: I1202 09:42:59.332997 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="244b353d-3185-4eda-9d02-c22719d2a514" containerName="dnsmasq-dns" Dec 02 09:42:59 crc kubenswrapper[4781]: I1202 09:42:59.333641 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-t6bqc" Dec 02 09:42:59 crc kubenswrapper[4781]: I1202 09:42:59.335117 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-shwpc" Dec 02 09:42:59 crc kubenswrapper[4781]: I1202 09:42:59.341181 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 02 09:42:59 crc kubenswrapper[4781]: I1202 09:42:59.342999 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea3e7049-869e-4822-a11e-9eb2df6e3eeb-config-data\") pod \"glance-db-sync-t6bqc\" (UID: \"ea3e7049-869e-4822-a11e-9eb2df6e3eeb\") " pod="openstack/glance-db-sync-t6bqc" Dec 02 09:42:59 crc kubenswrapper[4781]: I1202 09:42:59.343037 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea3e7049-869e-4822-a11e-9eb2df6e3eeb-combined-ca-bundle\") pod \"glance-db-sync-t6bqc\" (UID: \"ea3e7049-869e-4822-a11e-9eb2df6e3eeb\") " pod="openstack/glance-db-sync-t6bqc" Dec 02 09:42:59 crc kubenswrapper[4781]: I1202 09:42:59.343143 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ea3e7049-869e-4822-a11e-9eb2df6e3eeb-db-sync-config-data\") pod \"glance-db-sync-t6bqc\" (UID: \"ea3e7049-869e-4822-a11e-9eb2df6e3eeb\") " pod="openstack/glance-db-sync-t6bqc" Dec 02 09:42:59 crc kubenswrapper[4781]: I1202 09:42:59.343457 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zflkj\" (UniqueName: \"kubernetes.io/projected/ea3e7049-869e-4822-a11e-9eb2df6e3eeb-kube-api-access-zflkj\") pod \"glance-db-sync-t6bqc\" (UID: \"ea3e7049-869e-4822-a11e-9eb2df6e3eeb\") " pod="openstack/glance-db-sync-t6bqc" Dec 02 09:42:59 crc kubenswrapper[4781]: I1202 09:42:59.352361 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-t6bqc"] Dec 02 09:42:59 crc kubenswrapper[4781]: I1202 09:42:59.445273 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zflkj\" (UniqueName: \"kubernetes.io/projected/ea3e7049-869e-4822-a11e-9eb2df6e3eeb-kube-api-access-zflkj\") pod \"glance-db-sync-t6bqc\" (UID: \"ea3e7049-869e-4822-a11e-9eb2df6e3eeb\") " pod="openstack/glance-db-sync-t6bqc" Dec 02 09:42:59 crc kubenswrapper[4781]: I1202 09:42:59.445361 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea3e7049-869e-4822-a11e-9eb2df6e3eeb-config-data\") pod \"glance-db-sync-t6bqc\" (UID: \"ea3e7049-869e-4822-a11e-9eb2df6e3eeb\") " pod="openstack/glance-db-sync-t6bqc" Dec 02 09:42:59 crc kubenswrapper[4781]: I1202 09:42:59.445387 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea3e7049-869e-4822-a11e-9eb2df6e3eeb-combined-ca-bundle\") pod \"glance-db-sync-t6bqc\" (UID: \"ea3e7049-869e-4822-a11e-9eb2df6e3eeb\") " pod="openstack/glance-db-sync-t6bqc" Dec 02 09:42:59 crc kubenswrapper[4781]: I1202 09:42:59.445421 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ea3e7049-869e-4822-a11e-9eb2df6e3eeb-db-sync-config-data\") pod 
\"glance-db-sync-t6bqc\" (UID: \"ea3e7049-869e-4822-a11e-9eb2df6e3eeb\") " pod="openstack/glance-db-sync-t6bqc" Dec 02 09:42:59 crc kubenswrapper[4781]: I1202 09:42:59.451828 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea3e7049-869e-4822-a11e-9eb2df6e3eeb-config-data\") pod \"glance-db-sync-t6bqc\" (UID: \"ea3e7049-869e-4822-a11e-9eb2df6e3eeb\") " pod="openstack/glance-db-sync-t6bqc" Dec 02 09:42:59 crc kubenswrapper[4781]: I1202 09:42:59.452020 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea3e7049-869e-4822-a11e-9eb2df6e3eeb-combined-ca-bundle\") pod \"glance-db-sync-t6bqc\" (UID: \"ea3e7049-869e-4822-a11e-9eb2df6e3eeb\") " pod="openstack/glance-db-sync-t6bqc" Dec 02 09:42:59 crc kubenswrapper[4781]: I1202 09:42:59.452058 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ea3e7049-869e-4822-a11e-9eb2df6e3eeb-db-sync-config-data\") pod \"glance-db-sync-t6bqc\" (UID: \"ea3e7049-869e-4822-a11e-9eb2df6e3eeb\") " pod="openstack/glance-db-sync-t6bqc" Dec 02 09:42:59 crc kubenswrapper[4781]: I1202 09:42:59.465082 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zflkj\" (UniqueName: \"kubernetes.io/projected/ea3e7049-869e-4822-a11e-9eb2df6e3eeb-kube-api-access-zflkj\") pod \"glance-db-sync-t6bqc\" (UID: \"ea3e7049-869e-4822-a11e-9eb2df6e3eeb\") " pod="openstack/glance-db-sync-t6bqc" Dec 02 09:42:59 crc kubenswrapper[4781]: I1202 09:42:59.531211 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="244b353d-3185-4eda-9d02-c22719d2a514" path="/var/lib/kubelet/pods/244b353d-3185-4eda-9d02-c22719d2a514/volumes" Dec 02 09:42:59 crc kubenswrapper[4781]: I1202 09:42:59.654413 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-t6bqc" Dec 02 09:43:00 crc kubenswrapper[4781]: I1202 09:43:00.174101 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-t6bqc"] Dec 02 09:43:00 crc kubenswrapper[4781]: W1202 09:43:00.193872 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea3e7049_869e_4822_a11e_9eb2df6e3eeb.slice/crio-28468699047a557e85e4629b429c8ba7d23c6ed5f5533938aeea874b6c823b11 WatchSource:0}: Error finding container 28468699047a557e85e4629b429c8ba7d23c6ed5f5533938aeea874b6c823b11: Status 404 returned error can't find the container with id 28468699047a557e85e4629b429c8ba7d23c6ed5f5533938aeea874b6c823b11 Dec 02 09:43:00 crc kubenswrapper[4781]: I1202 09:43:00.958322 4781 generic.go:334] "Generic (PLEG): container finished" podID="d15a32bb-043f-4019-8ddb-4bcc54b243a0" containerID="16bc06b62e8ef9a8685661b61fe145273fa47eb2e965a6a47a64679324ce503c" exitCode=0 Dec 02 09:43:00 crc kubenswrapper[4781]: I1202 09:43:00.958629 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-p9svs" event={"ID":"d15a32bb-043f-4019-8ddb-4bcc54b243a0","Type":"ContainerDied","Data":"16bc06b62e8ef9a8685661b61fe145273fa47eb2e965a6a47a64679324ce503c"} Dec 02 09:43:00 crc kubenswrapper[4781]: I1202 09:43:00.959458 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-t6bqc" event={"ID":"ea3e7049-869e-4822-a11e-9eb2df6e3eeb","Type":"ContainerStarted","Data":"28468699047a557e85e4629b429c8ba7d23c6ed5f5533938aeea874b6c823b11"} Dec 02 09:43:02 crc kubenswrapper[4781]: I1202 09:43:02.343065 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-p9svs" Dec 02 09:43:02 crc kubenswrapper[4781]: I1202 09:43:02.388979 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d15a32bb-043f-4019-8ddb-4bcc54b243a0-dispersionconf\") pod \"d15a32bb-043f-4019-8ddb-4bcc54b243a0\" (UID: \"d15a32bb-043f-4019-8ddb-4bcc54b243a0\") " Dec 02 09:43:02 crc kubenswrapper[4781]: I1202 09:43:02.389279 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d15a32bb-043f-4019-8ddb-4bcc54b243a0-etc-swift\") pod \"d15a32bb-043f-4019-8ddb-4bcc54b243a0\" (UID: \"d15a32bb-043f-4019-8ddb-4bcc54b243a0\") " Dec 02 09:43:02 crc kubenswrapper[4781]: I1202 09:43:02.389397 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d15a32bb-043f-4019-8ddb-4bcc54b243a0-swiftconf\") pod \"d15a32bb-043f-4019-8ddb-4bcc54b243a0\" (UID: \"d15a32bb-043f-4019-8ddb-4bcc54b243a0\") " Dec 02 09:43:02 crc kubenswrapper[4781]: I1202 09:43:02.389563 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d15a32bb-043f-4019-8ddb-4bcc54b243a0-combined-ca-bundle\") pod \"d15a32bb-043f-4019-8ddb-4bcc54b243a0\" (UID: \"d15a32bb-043f-4019-8ddb-4bcc54b243a0\") " Dec 02 09:43:02 crc kubenswrapper[4781]: I1202 09:43:02.389680 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d15a32bb-043f-4019-8ddb-4bcc54b243a0-ring-data-devices\") pod \"d15a32bb-043f-4019-8ddb-4bcc54b243a0\" (UID: 
\"d15a32bb-043f-4019-8ddb-4bcc54b243a0\") " Dec 02 09:43:02 crc kubenswrapper[4781]: I1202 09:43:02.389824 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d15a32bb-043f-4019-8ddb-4bcc54b243a0-scripts\") pod \"d15a32bb-043f-4019-8ddb-4bcc54b243a0\" (UID: \"d15a32bb-043f-4019-8ddb-4bcc54b243a0\") " Dec 02 09:43:02 crc kubenswrapper[4781]: I1202 09:43:02.390050 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tt87m\" (UniqueName: \"kubernetes.io/projected/d15a32bb-043f-4019-8ddb-4bcc54b243a0-kube-api-access-tt87m\") pod \"d15a32bb-043f-4019-8ddb-4bcc54b243a0\" (UID: \"d15a32bb-043f-4019-8ddb-4bcc54b243a0\") " Dec 02 09:43:02 crc kubenswrapper[4781]: I1202 09:43:02.390222 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d15a32bb-043f-4019-8ddb-4bcc54b243a0-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "d15a32bb-043f-4019-8ddb-4bcc54b243a0" (UID: "d15a32bb-043f-4019-8ddb-4bcc54b243a0"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:43:02 crc kubenswrapper[4781]: I1202 09:43:02.390338 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d15a32bb-043f-4019-8ddb-4bcc54b243a0-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "d15a32bb-043f-4019-8ddb-4bcc54b243a0" (UID: "d15a32bb-043f-4019-8ddb-4bcc54b243a0"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:43:02 crc kubenswrapper[4781]: I1202 09:43:02.390624 4781 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d15a32bb-043f-4019-8ddb-4bcc54b243a0-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 02 09:43:02 crc kubenswrapper[4781]: I1202 09:43:02.390705 4781 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d15a32bb-043f-4019-8ddb-4bcc54b243a0-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 02 09:43:02 crc kubenswrapper[4781]: I1202 09:43:02.398241 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d15a32bb-043f-4019-8ddb-4bcc54b243a0-kube-api-access-tt87m" (OuterVolumeSpecName: "kube-api-access-tt87m") pod "d15a32bb-043f-4019-8ddb-4bcc54b243a0" (UID: "d15a32bb-043f-4019-8ddb-4bcc54b243a0"). InnerVolumeSpecName "kube-api-access-tt87m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:43:02 crc kubenswrapper[4781]: I1202 09:43:02.399298 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d15a32bb-043f-4019-8ddb-4bcc54b243a0-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "d15a32bb-043f-4019-8ddb-4bcc54b243a0" (UID: "d15a32bb-043f-4019-8ddb-4bcc54b243a0"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:43:02 crc kubenswrapper[4781]: I1202 09:43:02.415317 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d15a32bb-043f-4019-8ddb-4bcc54b243a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d15a32bb-043f-4019-8ddb-4bcc54b243a0" (UID: "d15a32bb-043f-4019-8ddb-4bcc54b243a0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:43:02 crc kubenswrapper[4781]: I1202 09:43:02.423886 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d15a32bb-043f-4019-8ddb-4bcc54b243a0-scripts" (OuterVolumeSpecName: "scripts") pod "d15a32bb-043f-4019-8ddb-4bcc54b243a0" (UID: "d15a32bb-043f-4019-8ddb-4bcc54b243a0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:43:02 crc kubenswrapper[4781]: I1202 09:43:02.427133 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d15a32bb-043f-4019-8ddb-4bcc54b243a0-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "d15a32bb-043f-4019-8ddb-4bcc54b243a0" (UID: "d15a32bb-043f-4019-8ddb-4bcc54b243a0"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:43:02 crc kubenswrapper[4781]: I1202 09:43:02.492064 4781 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d15a32bb-043f-4019-8ddb-4bcc54b243a0-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 02 09:43:02 crc kubenswrapper[4781]: I1202 09:43:02.492361 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d15a32bb-043f-4019-8ddb-4bcc54b243a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:43:02 crc kubenswrapper[4781]: I1202 09:43:02.492376 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d15a32bb-043f-4019-8ddb-4bcc54b243a0-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:43:02 crc kubenswrapper[4781]: I1202 09:43:02.492390 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tt87m\" (UniqueName: \"kubernetes.io/projected/d15a32bb-043f-4019-8ddb-4bcc54b243a0-kube-api-access-tt87m\") on node \"crc\" DevicePath \"\"" Dec 02 09:43:02 crc kubenswrapper[4781]: I1202 09:43:02.492404 4781 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d15a32bb-043f-4019-8ddb-4bcc54b243a0-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 02 09:43:02 crc kubenswrapper[4781]: I1202 09:43:02.897348 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d8cf953d-c4c9-457e-956c-d2942b56499b-etc-swift\") pod \"swift-storage-0\" (UID: \"d8cf953d-c4c9-457e-956c-d2942b56499b\") " pod="openstack/swift-storage-0" Dec 02 09:43:02 crc kubenswrapper[4781]: I1202 09:43:02.903592 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d8cf953d-c4c9-457e-956c-d2942b56499b-etc-swift\") pod \"swift-storage-0\" (UID: \"d8cf953d-c4c9-457e-956c-d2942b56499b\") " pod="openstack/swift-storage-0" Dec 02 09:43:02 crc kubenswrapper[4781]: I1202 09:43:02.934331 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 02 09:43:02 crc kubenswrapper[4781]: I1202 09:43:02.984791 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-p9svs" event={"ID":"d15a32bb-043f-4019-8ddb-4bcc54b243a0","Type":"ContainerDied","Data":"4cb949438d24a9a92e53118103f1c94da6daf9d2a8f86ec45736a385b501102e"} Dec 02 09:43:02 crc kubenswrapper[4781]: I1202 09:43:02.984834 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4cb949438d24a9a92e53118103f1c94da6daf9d2a8f86ec45736a385b501102e" Dec 02 09:43:02 crc kubenswrapper[4781]: I1202 09:43:02.984892 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-p9svs" Dec 02 09:43:03 crc kubenswrapper[4781]: I1202 09:43:03.512735 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 02 09:43:03 crc kubenswrapper[4781]: I1202 09:43:03.993171 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d8cf953d-c4c9-457e-956c-d2942b56499b","Type":"ContainerStarted","Data":"148209dff203dde08174af36db8cc93cdac88aca8a0b951e97b61f80114ddd8d"} Dec 02 09:43:05 crc kubenswrapper[4781]: I1202 09:43:05.274375 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-mmhrv" podUID="7cf924dd-8243-4263-85a2-68ac01fd5346" containerName="ovn-controller" probeResult="failure" output=< Dec 02 09:43:05 crc kubenswrapper[4781]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 02 09:43:05 crc kubenswrapper[4781]: > Dec 02 09:43:05 crc kubenswrapper[4781]: I1202 09:43:05.357906 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-rh5wv" Dec 02 09:43:06 crc kubenswrapper[4781]: I1202 09:43:06.009251 4781 generic.go:334] "Generic (PLEG): container finished" podID="4d040259-d968-45a1-832a-45586a9fe0d1" containerID="b659230fc0458ff9201681c08c775d787b2461d03f85fed77a3b501a5314c78e" exitCode=0 Dec 02 09:43:06 crc kubenswrapper[4781]: I1202 09:43:06.009302 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4d040259-d968-45a1-832a-45586a9fe0d1","Type":"ContainerDied","Data":"b659230fc0458ff9201681c08c775d787b2461d03f85fed77a3b501a5314c78e"} Dec 02 09:43:07 crc kubenswrapper[4781]: I1202 09:43:07.028609 4781 generic.go:334] "Generic (PLEG): container finished" podID="9ef9f823-7456-4494-85c7-d29fcf35e7b5" containerID="cd33af3c334721c99ee5f6eebb56a79b04c6781e3be5fde4010c62024eb59205" exitCode=0 Dec 02 09:43:07 crc kubenswrapper[4781]: I1202 09:43:07.028700 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9ef9f823-7456-4494-85c7-d29fcf35e7b5","Type":"ContainerDied","Data":"cd33af3c334721c99ee5f6eebb56a79b04c6781e3be5fde4010c62024eb59205"} Dec 02 09:43:07 crc kubenswrapper[4781]: I1202 09:43:07.034796 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4d040259-d968-45a1-832a-45586a9fe0d1","Type":"ContainerStarted","Data":"29d9e8eaac33a1eec6f376ec46ba525e7cb820604a4b8f6d204d2bf0ffeea97d"} Dec 02 09:43:07 crc kubenswrapper[4781]: I1202 09:43:07.035674 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:43:07 crc kubenswrapper[4781]: I1202 09:43:07.096666 4781 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.986675481 podStartE2EDuration="1m48.096481154s" podCreationTimestamp="2025-12-02 09:41:19 +0000 UTC" firstStartedPulling="2025-12-02 09:41:21.167015319 +0000 UTC m=+1243.990889198" lastFinishedPulling="2025-12-02 09:42:31.276821002 +0000 UTC m=+1314.100694871" observedRunningTime="2025-12-02 09:43:07.093884754 +0000 UTC m=+1349.917758633" watchObservedRunningTime="2025-12-02 09:43:07.096481154 +0000 UTC m=+1349.920355033" Dec 02 09:43:08 crc kubenswrapper[4781]: I1202 09:43:08.058200 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9ef9f823-7456-4494-85c7-d29fcf35e7b5","Type":"ContainerStarted","Data":"8859e4a4c2bfa9375f419be864c8f71d03a958b1c01be72f1c1bf549a3c73cec"} Dec 02 09:43:08 crc kubenswrapper[4781]: I1202 09:43:08.059046 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 02 09:43:08 crc kubenswrapper[4781]: I1202 09:43:08.096866 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=-9223371926.75793 podStartE2EDuration="1m50.096845108s" podCreationTimestamp="2025-12-02 09:41:18 +0000 UTC" firstStartedPulling="2025-12-02 09:41:20.876311769 +0000 UTC m=+1243.700185648" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:43:08.094519386 +0000 UTC m=+1350.918393265" watchObservedRunningTime="2025-12-02 09:43:08.096845108 +0000 UTC m=+1350.920718997" Dec 02 09:43:09 crc kubenswrapper[4781]: I1202 09:43:09.080729 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d8cf953d-c4c9-457e-956c-d2942b56499b","Type":"ContainerStarted","Data":"a2d2dfea44334dd901255bfd17a1f32752530db98711f8fefbfe77cedc1363bc"} Dec 02 09:43:10 crc kubenswrapper[4781]: I1202 09:43:10.124373 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d8cf953d-c4c9-457e-956c-d2942b56499b","Type":"ContainerStarted","Data":"a5028bc2a564029b9534f08567ac0503db769ebd8f1db4c242ab9859739c4a63"} Dec 02 09:43:10 crc kubenswrapper[4781]: I1202 09:43:10.124422 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d8cf953d-c4c9-457e-956c-d2942b56499b","Type":"ContainerStarted","Data":"051875485b865c811405648edbaa84bc78ccf2c37560c8642d7784f331b4b83c"} Dec 02 09:43:10 crc kubenswrapper[4781]: I1202 09:43:10.567171 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-mmhrv" podUID="7cf924dd-8243-4263-85a2-68ac01fd5346" containerName="ovn-controller" probeResult="failure" output=< Dec 02 09:43:10 crc kubenswrapper[4781]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 02 09:43:10 crc kubenswrapper[4781]: > Dec 02 09:43:10 crc kubenswrapper[4781]: I1202 09:43:10.829555 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-rh5wv" Dec 02 09:43:11 crc kubenswrapper[4781]: I1202 09:43:11.152645 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d8cf953d-c4c9-457e-956c-d2942b56499b","Type":"ContainerStarted","Data":"5b8741299a307c13ecbddc05cd1df16c24f849bc3ec57054db350d70297d9c63"} Dec 02 09:43:11 crc kubenswrapper[4781]: I1202 09:43:11.224624 4781 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/ovn-controller-mmhrv-config-qb4xc"] Dec 02 09:43:11 crc kubenswrapper[4781]: E1202 09:43:11.225110 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d15a32bb-043f-4019-8ddb-4bcc54b243a0" containerName="swift-ring-rebalance" Dec 02 09:43:11 crc kubenswrapper[4781]: I1202 09:43:11.225134 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="d15a32bb-043f-4019-8ddb-4bcc54b243a0" containerName="swift-ring-rebalance" Dec 02 09:43:11 crc kubenswrapper[4781]: I1202 09:43:11.225364 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="d15a32bb-043f-4019-8ddb-4bcc54b243a0" containerName="swift-ring-rebalance" Dec 02 09:43:11 crc kubenswrapper[4781]: I1202 09:43:11.226024 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mmhrv-config-qb4xc" Dec 02 09:43:11 crc kubenswrapper[4781]: I1202 09:43:11.235507 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mmhrv-config-qb4xc"] Dec 02 09:43:11 crc kubenswrapper[4781]: I1202 09:43:11.240320 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 02 09:43:11 crc kubenswrapper[4781]: I1202 09:43:11.420124 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/54ea403b-dcc6-4ba9-a133-659e534c43be-scripts\") pod \"ovn-controller-mmhrv-config-qb4xc\" (UID: \"54ea403b-dcc6-4ba9-a133-659e534c43be\") " pod="openstack/ovn-controller-mmhrv-config-qb4xc" Dec 02 09:43:11 crc kubenswrapper[4781]: I1202 09:43:11.420491 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/54ea403b-dcc6-4ba9-a133-659e534c43be-additional-scripts\") pod \"ovn-controller-mmhrv-config-qb4xc\" (UID: \"54ea403b-dcc6-4ba9-a133-659e534c43be\") " pod="openstack/ovn-controller-mmhrv-config-qb4xc" Dec 02 09:43:11 crc kubenswrapper[4781]: I1202 09:43:11.420628 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwnb7\" (UniqueName: \"kubernetes.io/projected/54ea403b-dcc6-4ba9-a133-659e534c43be-kube-api-access-kwnb7\") pod \"ovn-controller-mmhrv-config-qb4xc\" (UID: \"54ea403b-dcc6-4ba9-a133-659e534c43be\") " pod="openstack/ovn-controller-mmhrv-config-qb4xc" Dec 02 09:43:11 crc kubenswrapper[4781]: I1202 09:43:11.420808 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/54ea403b-dcc6-4ba9-a133-659e534c43be-var-log-ovn\") pod \"ovn-controller-mmhrv-config-qb4xc\" (UID: \"54ea403b-dcc6-4ba9-a133-659e534c43be\") " pod="openstack/ovn-controller-mmhrv-config-qb4xc" Dec 02 09:43:11 crc kubenswrapper[4781]: I1202 09:43:11.420855 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/54ea403b-dcc6-4ba9-a133-659e534c43be-var-run-ovn\") pod \"ovn-controller-mmhrv-config-qb4xc\" (UID: \"54ea403b-dcc6-4ba9-a133-659e534c43be\") " pod="openstack/ovn-controller-mmhrv-config-qb4xc" Dec 02 09:43:11 crc kubenswrapper[4781]: I1202 09:43:11.420880 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/54ea403b-dcc6-4ba9-a133-659e534c43be-var-run\") pod 
\"ovn-controller-mmhrv-config-qb4xc\" (UID: \"54ea403b-dcc6-4ba9-a133-659e534c43be\") " pod="openstack/ovn-controller-mmhrv-config-qb4xc" Dec 02 09:43:11 crc kubenswrapper[4781]: I1202 09:43:11.523842 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/54ea403b-dcc6-4ba9-a133-659e534c43be-var-log-ovn\") pod \"ovn-controller-mmhrv-config-qb4xc\" (UID: \"54ea403b-dcc6-4ba9-a133-659e534c43be\") " pod="openstack/ovn-controller-mmhrv-config-qb4xc" Dec 02 09:43:11 crc kubenswrapper[4781]: I1202 09:43:11.523898 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/54ea403b-dcc6-4ba9-a133-659e534c43be-var-run-ovn\") pod \"ovn-controller-mmhrv-config-qb4xc\" (UID: \"54ea403b-dcc6-4ba9-a133-659e534c43be\") " pod="openstack/ovn-controller-mmhrv-config-qb4xc" Dec 02 09:43:11 crc kubenswrapper[4781]: I1202 09:43:11.523953 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/54ea403b-dcc6-4ba9-a133-659e534c43be-var-run\") pod \"ovn-controller-mmhrv-config-qb4xc\" (UID: \"54ea403b-dcc6-4ba9-a133-659e534c43be\") " pod="openstack/ovn-controller-mmhrv-config-qb4xc" Dec 02 09:43:11 crc kubenswrapper[4781]: I1202 09:43:11.524031 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/54ea403b-dcc6-4ba9-a133-659e534c43be-scripts\") pod \"ovn-controller-mmhrv-config-qb4xc\" (UID: \"54ea403b-dcc6-4ba9-a133-659e534c43be\") " pod="openstack/ovn-controller-mmhrv-config-qb4xc" Dec 02 09:43:11 crc kubenswrapper[4781]: I1202 09:43:11.524254 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/54ea403b-dcc6-4ba9-a133-659e534c43be-var-log-ovn\") pod \"ovn-controller-mmhrv-config-qb4xc\" (UID: \"54ea403b-dcc6-4ba9-a133-659e534c43be\") " pod="openstack/ovn-controller-mmhrv-config-qb4xc" Dec 02 09:43:11 crc kubenswrapper[4781]: I1202 09:43:11.524313 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/54ea403b-dcc6-4ba9-a133-659e534c43be-var-run\") pod \"ovn-controller-mmhrv-config-qb4xc\" (UID: \"54ea403b-dcc6-4ba9-a133-659e534c43be\") " pod="openstack/ovn-controller-mmhrv-config-qb4xc" Dec 02 09:43:11 crc kubenswrapper[4781]: I1202 09:43:11.524990 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/54ea403b-dcc6-4ba9-a133-659e534c43be-additional-scripts\") pod \"ovn-controller-mmhrv-config-qb4xc\" (UID: \"54ea403b-dcc6-4ba9-a133-659e534c43be\") " pod="openstack/ovn-controller-mmhrv-config-qb4xc" Dec 02 09:43:11 crc kubenswrapper[4781]: I1202 09:43:11.525077 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/54ea403b-dcc6-4ba9-a133-659e534c43be-var-run-ovn\") pod \"ovn-controller-mmhrv-config-qb4xc\" (UID: \"54ea403b-dcc6-4ba9-a133-659e534c43be\") " pod="openstack/ovn-controller-mmhrv-config-qb4xc" Dec 02 09:43:11 crc kubenswrapper[4781]: I1202 09:43:11.526085 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/54ea403b-dcc6-4ba9-a133-659e534c43be-scripts\") pod \"ovn-controller-mmhrv-config-qb4xc\" (UID: 
\"54ea403b-dcc6-4ba9-a133-659e534c43be\") " pod="openstack/ovn-controller-mmhrv-config-qb4xc" Dec 02 09:43:11 crc kubenswrapper[4781]: I1202 09:43:11.526177 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/54ea403b-dcc6-4ba9-a133-659e534c43be-additional-scripts\") pod \"ovn-controller-mmhrv-config-qb4xc\" (UID: \"54ea403b-dcc6-4ba9-a133-659e534c43be\") " pod="openstack/ovn-controller-mmhrv-config-qb4xc" Dec 02 09:43:11 crc kubenswrapper[4781]: I1202 09:43:11.526354 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwnb7\" (UniqueName: \"kubernetes.io/projected/54ea403b-dcc6-4ba9-a133-659e534c43be-kube-api-access-kwnb7\") pod \"ovn-controller-mmhrv-config-qb4xc\" (UID: \"54ea403b-dcc6-4ba9-a133-659e534c43be\") " pod="openstack/ovn-controller-mmhrv-config-qb4xc" Dec 02 09:43:11 crc kubenswrapper[4781]: I1202 09:43:11.564481 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwnb7\" (UniqueName: \"kubernetes.io/projected/54ea403b-dcc6-4ba9-a133-659e534c43be-kube-api-access-kwnb7\") pod \"ovn-controller-mmhrv-config-qb4xc\" (UID: \"54ea403b-dcc6-4ba9-a133-659e534c43be\") " pod="openstack/ovn-controller-mmhrv-config-qb4xc" Dec 02 09:43:11 crc kubenswrapper[4781]: I1202 09:43:11.864530 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mmhrv-config-qb4xc" Dec 02 09:43:12 crc kubenswrapper[4781]: I1202 09:43:12.604651 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mmhrv-config-qb4xc"] Dec 02 09:43:12 crc kubenswrapper[4781]: W1202 09:43:12.627122 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54ea403b_dcc6_4ba9_a133_659e534c43be.slice/crio-3bd2c02b0d70459ec2a4c5e7acc6e3476796e637035c0915e9c168ba760bd930 WatchSource:0}: Error finding container 3bd2c02b0d70459ec2a4c5e7acc6e3476796e637035c0915e9c168ba760bd930: Status 404 returned error can't find the container with id 3bd2c02b0d70459ec2a4c5e7acc6e3476796e637035c0915e9c168ba760bd930 Dec 02 09:43:13 crc kubenswrapper[4781]: I1202 09:43:13.251463 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mmhrv-config-qb4xc" event={"ID":"54ea403b-dcc6-4ba9-a133-659e534c43be","Type":"ContainerStarted","Data":"210fc2f90a1c188d5fb9db839b1dec0677c6719e34d607855d194414190ddb9c"} Dec 02 09:43:13 crc kubenswrapper[4781]: I1202 09:43:13.251773 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mmhrv-config-qb4xc" event={"ID":"54ea403b-dcc6-4ba9-a133-659e534c43be","Type":"ContainerStarted","Data":"3bd2c02b0d70459ec2a4c5e7acc6e3476796e637035c0915e9c168ba760bd930"} Dec 02 09:43:13 crc kubenswrapper[4781]: I1202 09:43:13.282969 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-mmhrv-config-qb4xc" podStartSLOduration=2.28291752 podStartE2EDuration="2.28291752s" podCreationTimestamp="2025-12-02 09:43:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:43:13.273390583 +0000 UTC m=+1356.097264462" watchObservedRunningTime="2025-12-02 09:43:13.28291752 +0000 UTC m=+1356.106791409" Dec 02 09:43:14 crc kubenswrapper[4781]: I1202 09:43:14.268254 4781 generic.go:334] "Generic (PLEG): container finished" 
podID="54ea403b-dcc6-4ba9-a133-659e534c43be" containerID="210fc2f90a1c188d5fb9db839b1dec0677c6719e34d607855d194414190ddb9c" exitCode=0 Dec 02 09:43:14 crc kubenswrapper[4781]: I1202 09:43:14.268343 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mmhrv-config-qb4xc" event={"ID":"54ea403b-dcc6-4ba9-a133-659e534c43be","Type":"ContainerDied","Data":"210fc2f90a1c188d5fb9db839b1dec0677c6719e34d607855d194414190ddb9c"} Dec 02 09:43:14 crc kubenswrapper[4781]: I1202 09:43:14.295711 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d8cf953d-c4c9-457e-956c-d2942b56499b","Type":"ContainerStarted","Data":"1cf05e9f5cffcf5c79c526b045b8f8358148d79245f815d2d5a4e4159f285fbc"} Dec 02 09:43:14 crc kubenswrapper[4781]: I1202 09:43:14.295762 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d8cf953d-c4c9-457e-956c-d2942b56499b","Type":"ContainerStarted","Data":"6557b1c064575519f52a437d4bbf5fd9384f0795710ae6806166b069f48668d4"} Dec 02 09:43:15 crc kubenswrapper[4781]: I1202 09:43:15.307154 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-mmhrv" Dec 02 09:43:15 crc kubenswrapper[4781]: I1202 09:43:15.311565 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d8cf953d-c4c9-457e-956c-d2942b56499b","Type":"ContainerStarted","Data":"704a80455fb6ed76a5ef2a6a5bf899ff31b385d76f5dbd8e907b2e34e996c3a9"} Dec 02 09:43:15 crc kubenswrapper[4781]: I1202 09:43:15.311594 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d8cf953d-c4c9-457e-956c-d2942b56499b","Type":"ContainerStarted","Data":"22150e37cae9d38f9aa86bf21e98ee20ed4e3ab310d81e10af42b6aaf80d06fb"} Dec 02 09:43:15 crc kubenswrapper[4781]: I1202 09:43:15.820392 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-mmhrv-config-qb4xc" Dec 02 09:43:15 crc kubenswrapper[4781]: I1202 09:43:15.883477 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/54ea403b-dcc6-4ba9-a133-659e534c43be-scripts\") pod \"54ea403b-dcc6-4ba9-a133-659e534c43be\" (UID: \"54ea403b-dcc6-4ba9-a133-659e534c43be\") " Dec 02 09:43:15 crc kubenswrapper[4781]: I1202 09:43:15.883572 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/54ea403b-dcc6-4ba9-a133-659e534c43be-additional-scripts\") pod \"54ea403b-dcc6-4ba9-a133-659e534c43be\" (UID: \"54ea403b-dcc6-4ba9-a133-659e534c43be\") " Dec 02 09:43:15 crc kubenswrapper[4781]: I1202 09:43:15.883623 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/54ea403b-dcc6-4ba9-a133-659e534c43be-var-log-ovn\") pod \"54ea403b-dcc6-4ba9-a133-659e534c43be\" (UID: \"54ea403b-dcc6-4ba9-a133-659e534c43be\") " Dec 02 09:43:15 crc kubenswrapper[4781]: I1202 09:43:15.883658 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwnb7\" (UniqueName: \"kubernetes.io/projected/54ea403b-dcc6-4ba9-a133-659e534c43be-kube-api-access-kwnb7\") pod \"54ea403b-dcc6-4ba9-a133-659e534c43be\" (UID: \"54ea403b-dcc6-4ba9-a133-659e534c43be\") " Dec 02 09:43:15 crc kubenswrapper[4781]: I1202 09:43:15.883701 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/54ea403b-dcc6-4ba9-a133-659e534c43be-var-run-ovn\") pod \"54ea403b-dcc6-4ba9-a133-659e534c43be\" (UID: \"54ea403b-dcc6-4ba9-a133-659e534c43be\") " Dec 02 09:43:15 crc kubenswrapper[4781]: I1202 09:43:15.883736 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/54ea403b-dcc6-4ba9-a133-659e534c43be-var-run\") pod \"54ea403b-dcc6-4ba9-a133-659e534c43be\" (UID: \"54ea403b-dcc6-4ba9-a133-659e534c43be\") " Dec 02 09:43:15 crc kubenswrapper[4781]: I1202 09:43:15.884336 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/54ea403b-dcc6-4ba9-a133-659e534c43be-var-run" (OuterVolumeSpecName: "var-run") pod "54ea403b-dcc6-4ba9-a133-659e534c43be" (UID: "54ea403b-dcc6-4ba9-a133-659e534c43be"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 09:43:15 crc kubenswrapper[4781]: I1202 09:43:15.884387 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/54ea403b-dcc6-4ba9-a133-659e534c43be-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "54ea403b-dcc6-4ba9-a133-659e534c43be" (UID: "54ea403b-dcc6-4ba9-a133-659e534c43be"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 09:43:15 crc kubenswrapper[4781]: I1202 09:43:15.884798 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54ea403b-dcc6-4ba9-a133-659e534c43be-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "54ea403b-dcc6-4ba9-a133-659e534c43be" (UID: "54ea403b-dcc6-4ba9-a133-659e534c43be"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:43:15 crc kubenswrapper[4781]: I1202 09:43:15.885074 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/54ea403b-dcc6-4ba9-a133-659e534c43be-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "54ea403b-dcc6-4ba9-a133-659e534c43be" (UID: "54ea403b-dcc6-4ba9-a133-659e534c43be"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 09:43:15 crc kubenswrapper[4781]: I1202 09:43:15.885149 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54ea403b-dcc6-4ba9-a133-659e534c43be-scripts" (OuterVolumeSpecName: "scripts") pod "54ea403b-dcc6-4ba9-a133-659e534c43be" (UID: "54ea403b-dcc6-4ba9-a133-659e534c43be"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:43:15 crc kubenswrapper[4781]: I1202 09:43:15.908219 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54ea403b-dcc6-4ba9-a133-659e534c43be-kube-api-access-kwnb7" (OuterVolumeSpecName: "kube-api-access-kwnb7") pod "54ea403b-dcc6-4ba9-a133-659e534c43be" (UID: "54ea403b-dcc6-4ba9-a133-659e534c43be"). InnerVolumeSpecName "kube-api-access-kwnb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:43:15 crc kubenswrapper[4781]: I1202 09:43:15.985795 4781 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/54ea403b-dcc6-4ba9-a133-659e534c43be-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 02 09:43:15 crc kubenswrapper[4781]: I1202 09:43:15.985832 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwnb7\" (UniqueName: \"kubernetes.io/projected/54ea403b-dcc6-4ba9-a133-659e534c43be-kube-api-access-kwnb7\") on node \"crc\" DevicePath \"\"" Dec 02 09:43:15 crc kubenswrapper[4781]: I1202 09:43:15.985842 4781 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/54ea403b-dcc6-4ba9-a133-659e534c43be-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 02 09:43:15 crc kubenswrapper[4781]: I1202 09:43:15.985851 4781 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/54ea403b-dcc6-4ba9-a133-659e534c43be-var-run\") on node \"crc\" DevicePath \"\"" Dec 02 09:43:15 crc kubenswrapper[4781]: I1202 09:43:15.985859 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/54ea403b-dcc6-4ba9-a133-659e534c43be-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:43:15 crc kubenswrapper[4781]: I1202 09:43:15.985867 4781 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/54ea403b-dcc6-4ba9-a133-659e534c43be-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:43:16 crc kubenswrapper[4781]: I1202 09:43:16.352759 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mmhrv-config-qb4xc" event={"ID":"54ea403b-dcc6-4ba9-a133-659e534c43be","Type":"ContainerDied","Data":"3bd2c02b0d70459ec2a4c5e7acc6e3476796e637035c0915e9c168ba760bd930"} Dec 02 09:43:16 crc kubenswrapper[4781]: I1202 09:43:16.352809 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3bd2c02b0d70459ec2a4c5e7acc6e3476796e637035c0915e9c168ba760bd930" Dec 02 09:43:16 crc kubenswrapper[4781]: I1202 
09:43:16.352885 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mmhrv-config-qb4xc" Dec 02 09:43:16 crc kubenswrapper[4781]: I1202 09:43:16.964230 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-mmhrv-config-qb4xc"] Dec 02 09:43:16 crc kubenswrapper[4781]: I1202 09:43:16.972623 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-mmhrv-config-qb4xc"] Dec 02 09:43:17 crc kubenswrapper[4781]: I1202 09:43:17.516663 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54ea403b-dcc6-4ba9-a133-659e534c43be" path="/var/lib/kubelet/pods/54ea403b-dcc6-4ba9-a133-659e534c43be/volumes" Dec 02 09:43:20 crc kubenswrapper[4781]: I1202 09:43:20.632122 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 02 09:43:20 crc kubenswrapper[4781]: I1202 09:43:20.703115 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:43:20 crc kubenswrapper[4781]: I1202 09:43:20.924509 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-l8cvh"] Dec 02 09:43:20 crc kubenswrapper[4781]: E1202 09:43:20.924877 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54ea403b-dcc6-4ba9-a133-659e534c43be" containerName="ovn-config" Dec 02 09:43:20 crc kubenswrapper[4781]: I1202 09:43:20.924892 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="54ea403b-dcc6-4ba9-a133-659e534c43be" containerName="ovn-config" Dec 02 09:43:20 crc kubenswrapper[4781]: I1202 09:43:20.925115 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="54ea403b-dcc6-4ba9-a133-659e534c43be" containerName="ovn-config" Dec 02 09:43:20 crc kubenswrapper[4781]: I1202 09:43:20.925691 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-l8cvh" Dec 02 09:43:20 crc kubenswrapper[4781]: I1202 09:43:20.949614 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-l8cvh"] Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.040427 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-21eb-account-create-update-7w4wm"] Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.041649 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-21eb-account-create-update-7w4wm" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.051345 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.056194 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-21eb-account-create-update-7w4wm"] Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.070577 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfxfp\" (UniqueName: \"kubernetes.io/projected/d8b40bb3-6c5d-47d5-9e15-439119be130a-kube-api-access-xfxfp\") pod \"barbican-db-create-l8cvh\" (UID: \"d8b40bb3-6c5d-47d5-9e15-439119be130a\") " pod="openstack/barbican-db-create-l8cvh" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.070667 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8b40bb3-6c5d-47d5-9e15-439119be130a-operator-scripts\") pod \"barbican-db-create-l8cvh\" (UID: \"d8b40bb3-6c5d-47d5-9e15-439119be130a\") " pod="openstack/barbican-db-create-l8cvh" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.155962 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-6tfv2"] Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.157320 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-6tfv2" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.169428 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-6tfv2"] Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.173456 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c457c92b-cc50-4e99-bf06-81d1b6cf3a7c-operator-scripts\") pod \"barbican-21eb-account-create-update-7w4wm\" (UID: \"c457c92b-cc50-4e99-bf06-81d1b6cf3a7c\") " pod="openstack/barbican-21eb-account-create-update-7w4wm" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.173548 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfxfp\" (UniqueName: \"kubernetes.io/projected/d8b40bb3-6c5d-47d5-9e15-439119be130a-kube-api-access-xfxfp\") pod \"barbican-db-create-l8cvh\" (UID: \"d8b40bb3-6c5d-47d5-9e15-439119be130a\") " pod="openstack/barbican-db-create-l8cvh" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.173580 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngvm5\" (UniqueName: \"kubernetes.io/projected/c457c92b-cc50-4e99-bf06-81d1b6cf3a7c-kube-api-access-ngvm5\") pod \"barbican-21eb-account-create-update-7w4wm\" (UID: \"c457c92b-cc50-4e99-bf06-81d1b6cf3a7c\") " pod="openstack/barbican-21eb-account-create-update-7w4wm" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.173605 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8b40bb3-6c5d-47d5-9e15-439119be130a-operator-scripts\") pod \"barbican-db-create-l8cvh\" (UID: \"d8b40bb3-6c5d-47d5-9e15-439119be130a\") " pod="openstack/barbican-db-create-l8cvh" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.174362 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/d8b40bb3-6c5d-47d5-9e15-439119be130a-operator-scripts\") pod \"barbican-db-create-l8cvh\" (UID: \"d8b40bb3-6c5d-47d5-9e15-439119be130a\") " pod="openstack/barbican-db-create-l8cvh" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.223152 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfxfp\" (UniqueName: \"kubernetes.io/projected/d8b40bb3-6c5d-47d5-9e15-439119be130a-kube-api-access-xfxfp\") pod \"barbican-db-create-l8cvh\" (UID: \"d8b40bb3-6c5d-47d5-9e15-439119be130a\") " pod="openstack/barbican-db-create-l8cvh" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.248536 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-l8cvh" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.280553 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c457c92b-cc50-4e99-bf06-81d1b6cf3a7c-operator-scripts\") pod \"barbican-21eb-account-create-update-7w4wm\" (UID: \"c457c92b-cc50-4e99-bf06-81d1b6cf3a7c\") " pod="openstack/barbican-21eb-account-create-update-7w4wm" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.280676 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8118db55-54ba-4cf2-b80d-266872f87896-operator-scripts\") pod \"cinder-db-create-6tfv2\" (UID: \"8118db55-54ba-4cf2-b80d-266872f87896\") " pod="openstack/cinder-db-create-6tfv2" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.280730 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngvm5\" (UniqueName: \"kubernetes.io/projected/c457c92b-cc50-4e99-bf06-81d1b6cf3a7c-kube-api-access-ngvm5\") pod \"barbican-21eb-account-create-update-7w4wm\" (UID: \"c457c92b-cc50-4e99-bf06-81d1b6cf3a7c\") " pod="openstack/barbican-21eb-account-create-update-7w4wm" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.280784 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhq5k\" (UniqueName: \"kubernetes.io/projected/8118db55-54ba-4cf2-b80d-266872f87896-kube-api-access-xhq5k\") pod \"cinder-db-create-6tfv2\" (UID: \"8118db55-54ba-4cf2-b80d-266872f87896\") " pod="openstack/cinder-db-create-6tfv2" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.280940 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-5mr6j"] Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.281970 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-5mr6j" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.288165 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.288509 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-q5lzd" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.290939 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.291974 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.294879 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c457c92b-cc50-4e99-bf06-81d1b6cf3a7c-operator-scripts\") pod \"barbican-21eb-account-create-update-7w4wm\" (UID: \"c457c92b-cc50-4e99-bf06-81d1b6cf3a7c\") " pod="openstack/barbican-21eb-account-create-update-7w4wm" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.308900 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngvm5\" (UniqueName: \"kubernetes.io/projected/c457c92b-cc50-4e99-bf06-81d1b6cf3a7c-kube-api-access-ngvm5\") pod \"barbican-21eb-account-create-update-7w4wm\" (UID: \"c457c92b-cc50-4e99-bf06-81d1b6cf3a7c\") " pod="openstack/barbican-21eb-account-create-update-7w4wm" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.316108 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-5mr6j"] Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.327265 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-78xmv"] Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.328276 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-78xmv" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.367535 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-78xmv"] Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.371701 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-21eb-account-create-update-7w4wm" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.381730 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fckts\" (UniqueName: \"kubernetes.io/projected/2075ad57-12d9-47c4-80ba-5bf9e1bce693-kube-api-access-fckts\") pod \"keystone-db-sync-5mr6j\" (UID: \"2075ad57-12d9-47c4-80ba-5bf9e1bce693\") " pod="openstack/keystone-db-sync-5mr6j" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.381842 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8118db55-54ba-4cf2-b80d-266872f87896-operator-scripts\") pod \"cinder-db-create-6tfv2\" (UID: \"8118db55-54ba-4cf2-b80d-266872f87896\") " pod="openstack/cinder-db-create-6tfv2" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.381903 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhq5k\" (UniqueName: \"kubernetes.io/projected/8118db55-54ba-4cf2-b80d-266872f87896-kube-api-access-xhq5k\") pod \"cinder-db-create-6tfv2\" (UID: \"8118db55-54ba-4cf2-b80d-266872f87896\") " pod="openstack/cinder-db-create-6tfv2" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.381966 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2075ad57-12d9-47c4-80ba-5bf9e1bce693-combined-ca-bundle\") pod \"keystone-db-sync-5mr6j\" (UID: \"2075ad57-12d9-47c4-80ba-5bf9e1bce693\") " pod="openstack/keystone-db-sync-5mr6j" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.382293 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2075ad57-12d9-47c4-80ba-5bf9e1bce693-config-data\") pod \"keystone-db-sync-5mr6j\" (UID: \"2075ad57-12d9-47c4-80ba-5bf9e1bce693\") " pod="openstack/keystone-db-sync-5mr6j" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.382572 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8118db55-54ba-4cf2-b80d-266872f87896-operator-scripts\") pod \"cinder-db-create-6tfv2\" (UID: \"8118db55-54ba-4cf2-b80d-266872f87896\") " pod="openstack/cinder-db-create-6tfv2" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.385149 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-3506-account-create-update-qrvct"] Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.386604 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-3506-account-create-update-qrvct" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.393242 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.401668 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhq5k\" (UniqueName: \"kubernetes.io/projected/8118db55-54ba-4cf2-b80d-266872f87896-kube-api-access-xhq5k\") pod \"cinder-db-create-6tfv2\" (UID: \"8118db55-54ba-4cf2-b80d-266872f87896\") " pod="openstack/cinder-db-create-6tfv2" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.402696 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-3506-account-create-update-qrvct"] Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.451392 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-a8e2-account-create-update-g5tjv"] Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.452629 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a8e2-account-create-update-g5tjv" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.454743 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.461015 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a8e2-account-create-update-g5tjv"] Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.484386 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fckts\" (UniqueName: \"kubernetes.io/projected/2075ad57-12d9-47c4-80ba-5bf9e1bce693-kube-api-access-fckts\") pod \"keystone-db-sync-5mr6j\" (UID: \"2075ad57-12d9-47c4-80ba-5bf9e1bce693\") " pod="openstack/keystone-db-sync-5mr6j" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.484423 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7ff12ef-2f31-42e6-b5e0-8b3172ac738b-operator-scripts\") pod \"neutron-3506-account-create-update-qrvct\" (UID: \"f7ff12ef-2f31-42e6-b5e0-8b3172ac738b\") " pod="openstack/neutron-3506-account-create-update-qrvct" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.484448 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/158568be-9a60-4a86-b480-e5c90939ed09-operator-scripts\") pod \"neutron-db-create-78xmv\" (UID: \"158568be-9a60-4a86-b480-e5c90939ed09\") " pod="openstack/neutron-db-create-78xmv" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.484504 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtjtq\" (UniqueName: \"kubernetes.io/projected/158568be-9a60-4a86-b480-e5c90939ed09-kube-api-access-dtjtq\") pod \"neutron-db-create-78xmv\" (UID: \"158568be-9a60-4a86-b480-e5c90939ed09\") " pod="openstack/neutron-db-create-78xmv" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.484551 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2075ad57-12d9-47c4-80ba-5bf9e1bce693-combined-ca-bundle\") pod \"keystone-db-sync-5mr6j\" (UID: \"2075ad57-12d9-47c4-80ba-5bf9e1bce693\") " pod="openstack/keystone-db-sync-5mr6j" Dec 02 09:43:21 crc 
kubenswrapper[4781]: I1202 09:43:21.484572 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2075ad57-12d9-47c4-80ba-5bf9e1bce693-config-data\") pod \"keystone-db-sync-5mr6j\" (UID: \"2075ad57-12d9-47c4-80ba-5bf9e1bce693\") " pod="openstack/keystone-db-sync-5mr6j" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.484601 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5tld\" (UniqueName: \"kubernetes.io/projected/f7ff12ef-2f31-42e6-b5e0-8b3172ac738b-kube-api-access-d5tld\") pod \"neutron-3506-account-create-update-qrvct\" (UID: \"f7ff12ef-2f31-42e6-b5e0-8b3172ac738b\") " pod="openstack/neutron-3506-account-create-update-qrvct" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.488351 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2075ad57-12d9-47c4-80ba-5bf9e1bce693-combined-ca-bundle\") pod \"keystone-db-sync-5mr6j\" (UID: \"2075ad57-12d9-47c4-80ba-5bf9e1bce693\") " pod="openstack/keystone-db-sync-5mr6j" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.500438 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-6tfv2" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.503591 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2075ad57-12d9-47c4-80ba-5bf9e1bce693-config-data\") pod \"keystone-db-sync-5mr6j\" (UID: \"2075ad57-12d9-47c4-80ba-5bf9e1bce693\") " pod="openstack/keystone-db-sync-5mr6j" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.525726 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fckts\" (UniqueName: \"kubernetes.io/projected/2075ad57-12d9-47c4-80ba-5bf9e1bce693-kube-api-access-fckts\") pod \"keystone-db-sync-5mr6j\" (UID: \"2075ad57-12d9-47c4-80ba-5bf9e1bce693\") " pod="openstack/keystone-db-sync-5mr6j" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.587971 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5tld\" (UniqueName: \"kubernetes.io/projected/f7ff12ef-2f31-42e6-b5e0-8b3172ac738b-kube-api-access-d5tld\") pod \"neutron-3506-account-create-update-qrvct\" (UID: \"f7ff12ef-2f31-42e6-b5e0-8b3172ac738b\") " pod="openstack/neutron-3506-account-create-update-qrvct" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.588776 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae522302-ae88-485a-a767-d6a6c6bf5205-operator-scripts\") pod \"cinder-a8e2-account-create-update-g5tjv\" (UID: \"ae522302-ae88-485a-a767-d6a6c6bf5205\") " pod="openstack/cinder-a8e2-account-create-update-g5tjv" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.588808 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckdz4\" (UniqueName: \"kubernetes.io/projected/ae522302-ae88-485a-a767-d6a6c6bf5205-kube-api-access-ckdz4\") pod \"cinder-a8e2-account-create-update-g5tjv\" (UID: \"ae522302-ae88-485a-a767-d6a6c6bf5205\") " pod="openstack/cinder-a8e2-account-create-update-g5tjv" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.588874 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7ff12ef-2f31-42e6-b5e0-8b3172ac738b-operator-scripts\") pod \"neutron-3506-account-create-update-qrvct\" (UID: \"f7ff12ef-2f31-42e6-b5e0-8b3172ac738b\") " pod="openstack/neutron-3506-account-create-update-qrvct" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.588915 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/158568be-9a60-4a86-b480-e5c90939ed09-operator-scripts\") pod \"neutron-db-create-78xmv\" (UID: \"158568be-9a60-4a86-b480-e5c90939ed09\") " pod="openstack/neutron-db-create-78xmv" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.589010 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtjtq\" (UniqueName: \"kubernetes.io/projected/158568be-9a60-4a86-b480-e5c90939ed09-kube-api-access-dtjtq\") pod \"neutron-db-create-78xmv\" (UID: \"158568be-9a60-4a86-b480-e5c90939ed09\") " pod="openstack/neutron-db-create-78xmv" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.589443 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7ff12ef-2f31-42e6-b5e0-8b3172ac738b-operator-scripts\") pod \"neutron-3506-account-create-update-qrvct\" (UID: \"f7ff12ef-2f31-42e6-b5e0-8b3172ac738b\") " pod="openstack/neutron-3506-account-create-update-qrvct" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.589959 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/158568be-9a60-4a86-b480-e5c90939ed09-operator-scripts\") pod \"neutron-db-create-78xmv\" (UID: \"158568be-9a60-4a86-b480-e5c90939ed09\") " pod="openstack/neutron-db-create-78xmv" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.603854 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtjtq\" (UniqueName: \"kubernetes.io/projected/158568be-9a60-4a86-b480-e5c90939ed09-kube-api-access-dtjtq\") pod \"neutron-db-create-78xmv\" (UID: \"158568be-9a60-4a86-b480-e5c90939ed09\") " pod="openstack/neutron-db-create-78xmv" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.604087 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5tld\" (UniqueName: \"kubernetes.io/projected/f7ff12ef-2f31-42e6-b5e0-8b3172ac738b-kube-api-access-d5tld\") pod \"neutron-3506-account-create-update-qrvct\" (UID: \"f7ff12ef-2f31-42e6-b5e0-8b3172ac738b\") " pod="openstack/neutron-3506-account-create-update-qrvct" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.614959 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-5mr6j" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.662689 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-78xmv" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.690703 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae522302-ae88-485a-a767-d6a6c6bf5205-operator-scripts\") pod \"cinder-a8e2-account-create-update-g5tjv\" (UID: \"ae522302-ae88-485a-a767-d6a6c6bf5205\") " pod="openstack/cinder-a8e2-account-create-update-g5tjv" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.690760 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckdz4\" (UniqueName: \"kubernetes.io/projected/ae522302-ae88-485a-a767-d6a6c6bf5205-kube-api-access-ckdz4\") pod \"cinder-a8e2-account-create-update-g5tjv\" (UID: \"ae522302-ae88-485a-a767-d6a6c6bf5205\") " pod="openstack/cinder-a8e2-account-create-update-g5tjv" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.691650 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae522302-ae88-485a-a767-d6a6c6bf5205-operator-scripts\") pod \"cinder-a8e2-account-create-update-g5tjv\" (UID: \"ae522302-ae88-485a-a767-d6a6c6bf5205\") " pod="openstack/cinder-a8e2-account-create-update-g5tjv" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.712347 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckdz4\" (UniqueName: \"kubernetes.io/projected/ae522302-ae88-485a-a767-d6a6c6bf5205-kube-api-access-ckdz4\") pod \"cinder-a8e2-account-create-update-g5tjv\" (UID: \"ae522302-ae88-485a-a767-d6a6c6bf5205\") " pod="openstack/cinder-a8e2-account-create-update-g5tjv" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.772635 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-3506-account-create-update-qrvct" Dec 02 09:43:21 crc kubenswrapper[4781]: I1202 09:43:21.813027 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-a8e2-account-create-update-g5tjv" Dec 02 09:43:26 crc kubenswrapper[4781]: E1202 09:43:26.792514 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Dec 02 09:43:26 crc kubenswrapper[4781]: E1202 09:43:26.793184 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zflkj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-t6bqc_openstack(ea3e7049-869e-4822-a11e-9eb2df6e3eeb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 09:43:26 crc kubenswrapper[4781]: E1202 09:43:26.794680 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-t6bqc" podUID="ea3e7049-869e-4822-a11e-9eb2df6e3eeb" Dec 02 09:43:26 crc kubenswrapper[4781]: E1202 09:43:26.934755 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-swift-object:current-podified" Dec 02 09:43:26 crc kubenswrapper[4781]: E1202 09:43:26.935095 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:object-server,Image:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,Command:[/usr/bin/swift-object-server /etc/swift/object-server.conf.d -v],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:object,HostPort:0,ContainerPort:6200,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5b7h56h9dh94h67bh697h95h55hbh555h556h675h5fdh57dh579h5fbh64fh5c9h687hb6h678h5d4h549h54h98h8ch564h5bh5bch55dhc8hf8q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:swift,ReadOnly:false,MountPath:/srv/node/pv,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-swift,ReadOnly:false,MountPath:/etc/swift,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cache,ReadOnly:false,MountPath:/var/cache/swift,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:lock,ReadOnly:false,MountPath:/var/lock,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-99krc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42445,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-storage-0_openstack(d8cf953d-c4c9-457e-956c-d2942b56499b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 09:43:27 crc kubenswrapper[4781]: I1202 09:43:27.352079 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-5mr6j"] Dec 02 09:43:27 crc kubenswrapper[4781]: E1202 09:43:27.526972 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-t6bqc" podUID="ea3e7049-869e-4822-a11e-9eb2df6e3eeb" Dec 02 09:43:27 crc kubenswrapper[4781]: I1202 09:43:27.588014 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-5mr6j" event={"ID":"2075ad57-12d9-47c4-80ba-5bf9e1bce693","Type":"ContainerStarted","Data":"e1adfc3059cf325130aa3a214ccc3edb5d273cdee9c25f1dccc5cfc9e863a082"} Dec 02 09:43:27 crc kubenswrapper[4781]: E1202 09:43:27.634370 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"object-server\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"object-replicator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"object-auditor\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"object-updater\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"rsync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"swift-recon-cron\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\"]" pod="openstack/swift-storage-0" podUID="d8cf953d-c4c9-457e-956c-d2942b56499b" Dec 02 09:43:27 crc kubenswrapper[4781]: I1202 09:43:27.916214 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-21eb-account-create-update-7w4wm"] Dec 02 09:43:27 crc kubenswrapper[4781]: W1202 09:43:27.919943 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc457c92b_cc50_4e99_bf06_81d1b6cf3a7c.slice/crio-37868771e3ffe099212a5344c0b2900d259e87517b1b67c2051daf6e0375caa8 WatchSource:0}: Error finding container 37868771e3ffe099212a5344c0b2900d259e87517b1b67c2051daf6e0375caa8: Status 404 returned error can't find the container with id 37868771e3ffe099212a5344c0b2900d259e87517b1b67c2051daf6e0375caa8 Dec 02 09:43:28 crc kubenswrapper[4781]: I1202 09:43:28.021053 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-3506-account-create-update-qrvct"] Dec 02 09:43:28 crc kubenswrapper[4781]: I1202 09:43:28.029723 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a8e2-account-create-update-g5tjv"] Dec 02 09:43:28 crc kubenswrapper[4781]: W1202 09:43:28.031209 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae522302_ae88_485a_a767_d6a6c6bf5205.slice/crio-807289327bf2619ea733025947105abb79197b373158f5544635763f6f19a8c8 WatchSource:0}: Error finding container 807289327bf2619ea733025947105abb79197b373158f5544635763f6f19a8c8: Status 404 returned error can't find the container with id 807289327bf2619ea733025947105abb79197b373158f5544635763f6f19a8c8 Dec 02 09:43:28 crc kubenswrapper[4781]: I1202 09:43:28.039275 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-78xmv"] Dec 02 09:43:28 crc kubenswrapper[4781]: I1202 09:43:28.145041 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-6tfv2"] Dec 02 09:43:28 crc kubenswrapper[4781]: I1202 09:43:28.155222 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-l8cvh"] Dec 02 09:43:28 crc kubenswrapper[4781]: W1202 09:43:28.163740 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8118db55_54ba_4cf2_b80d_266872f87896.slice/crio-1c48f5df3123c4dae8f38dcca96cd232a12b282f2df17ae94179b3ab45951b9b WatchSource:0}: Error finding container 1c48f5df3123c4dae8f38dcca96cd232a12b282f2df17ae94179b3ab45951b9b: Status 404 returned error can't find the container with id 1c48f5df3123c4dae8f38dcca96cd232a12b282f2df17ae94179b3ab45951b9b Dec 02 09:43:28 crc kubenswrapper[4781]: W1202 09:43:28.167168 4781 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8b40bb3_6c5d_47d5_9e15_439119be130a.slice/crio-51959b9bcaa0a0c9709c8833956afd3a08352d5e3cf887ee6d3e541c0b3eb697 WatchSource:0}: Error finding container 51959b9bcaa0a0c9709c8833956afd3a08352d5e3cf887ee6d3e541c0b3eb697: Status 404 returned error can't find the container with id 51959b9bcaa0a0c9709c8833956afd3a08352d5e3cf887ee6d3e541c0b3eb697 Dec 02 09:43:28 crc kubenswrapper[4781]: I1202 09:43:28.541849 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-l8cvh" event={"ID":"d8b40bb3-6c5d-47d5-9e15-439119be130a","Type":"ContainerStarted","Data":"e0ea675c144f0a00e40bfdbd2b47f08de472673f29409e4eef5c3833990d0dfa"} Dec 02 09:43:28 crc kubenswrapper[4781]: I1202 09:43:28.541909 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-l8cvh" event={"ID":"d8b40bb3-6c5d-47d5-9e15-439119be130a","Type":"ContainerStarted","Data":"51959b9bcaa0a0c9709c8833956afd3a08352d5e3cf887ee6d3e541c0b3eb697"} Dec 02 09:43:28 crc kubenswrapper[4781]: I1202 09:43:28.546960 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a8e2-account-create-update-g5tjv" event={"ID":"ae522302-ae88-485a-a767-d6a6c6bf5205","Type":"ContainerStarted","Data":"562daa81b295a8b189f4611d5790fad69d2756ac33da3345e4cd29fdf13d4a85"} Dec 02 09:43:28 crc kubenswrapper[4781]: I1202 09:43:28.547003 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a8e2-account-create-update-g5tjv" event={"ID":"ae522302-ae88-485a-a767-d6a6c6bf5205","Type":"ContainerStarted","Data":"807289327bf2619ea733025947105abb79197b373158f5544635763f6f19a8c8"} Dec 02 09:43:28 crc kubenswrapper[4781]: I1202 09:43:28.562437 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-21eb-account-create-update-7w4wm" event={"ID":"c457c92b-cc50-4e99-bf06-81d1b6cf3a7c","Type":"ContainerStarted","Data":"fa1c488c26c5961e60ec3cafe487ff7fdc7054a9dda307611a836a3ba3a14ad6"} Dec 02 09:43:28 crc kubenswrapper[4781]: I1202 09:43:28.562536 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-21eb-account-create-update-7w4wm" event={"ID":"c457c92b-cc50-4e99-bf06-81d1b6cf3a7c","Type":"ContainerStarted","Data":"37868771e3ffe099212a5344c0b2900d259e87517b1b67c2051daf6e0375caa8"} Dec 02 09:43:28 crc kubenswrapper[4781]: I1202 09:43:28.567362 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-3506-account-create-update-qrvct" event={"ID":"f7ff12ef-2f31-42e6-b5e0-8b3172ac738b","Type":"ContainerStarted","Data":"908f120c3659c413a9d78d9d012afda775dc13af53b354e176171cfbbf0b676d"} Dec 02 09:43:28 crc kubenswrapper[4781]: I1202 09:43:28.567413 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-3506-account-create-update-qrvct" event={"ID":"f7ff12ef-2f31-42e6-b5e0-8b3172ac738b","Type":"ContainerStarted","Data":"bc448207c7188927ee72c8aefb0207ab9032d3e8951803a8782961158633627d"} Dec 02 09:43:28 crc kubenswrapper[4781]: I1202 09:43:28.568202 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-l8cvh" podStartSLOduration=8.568179729 podStartE2EDuration="8.568179729s" podCreationTimestamp="2025-12-02 09:43:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:43:28.560627549 +0000 UTC m=+1371.384501428" watchObservedRunningTime="2025-12-02 09:43:28.568179729 +0000 UTC 
m=+1371.392053628" Dec 02 09:43:28 crc kubenswrapper[4781]: I1202 09:43:28.570243 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6tfv2" event={"ID":"8118db55-54ba-4cf2-b80d-266872f87896","Type":"ContainerStarted","Data":"a6c37db8446be81eea490f38cf259d36dfe3264288300d2f3007d323fd5cc8b5"} Dec 02 09:43:28 crc kubenswrapper[4781]: I1202 09:43:28.570269 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6tfv2" event={"ID":"8118db55-54ba-4cf2-b80d-266872f87896","Type":"ContainerStarted","Data":"1c48f5df3123c4dae8f38dcca96cd232a12b282f2df17ae94179b3ab45951b9b"} Dec 02 09:43:28 crc kubenswrapper[4781]: I1202 09:43:28.586321 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-78xmv" event={"ID":"158568be-9a60-4a86-b480-e5c90939ed09","Type":"ContainerStarted","Data":"033c113454daf9fbb63a0f625966cc476d0d7c0541f94bd77b4ce6c94921b97c"} Dec 02 09:43:28 crc kubenswrapper[4781]: I1202 09:43:28.586404 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-78xmv" event={"ID":"158568be-9a60-4a86-b480-e5c90939ed09","Type":"ContainerStarted","Data":"b6b40497cdb771fc339a499ca40c600771e662088b7bcc1dcf0d3c748244a136"} Dec 02 09:43:28 crc kubenswrapper[4781]: I1202 09:43:28.592105 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d8cf953d-c4c9-457e-956c-d2942b56499b","Type":"ContainerStarted","Data":"c28ccfd8134df4e7f007228a5c86a395b430b9471c1aa010d648c14a6a7604f2"} Dec 02 09:43:28 crc kubenswrapper[4781]: I1202 09:43:28.595579 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-3506-account-create-update-qrvct" podStartSLOduration=7.595561198 podStartE2EDuration="7.595561198s" podCreationTimestamp="2025-12-02 09:43:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:43:28.595023243 +0000 UTC m=+1371.418897122" watchObservedRunningTime="2025-12-02 09:43:28.595561198 +0000 UTC m=+1371.419435077" Dec 02 09:43:28 crc kubenswrapper[4781]: E1202 09:43:28.597829 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"object-server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"object-replicator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"object-auditor\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"object-updater\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"rsync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"swift-recon-cron\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\"]" pod="openstack/swift-storage-0" podUID="d8cf953d-c4c9-457e-956c-d2942b56499b" Dec 02 09:43:28 crc kubenswrapper[4781]: I1202 09:43:28.599281 4781 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/cinder-a8e2-account-create-update-g5tjv" podStartSLOduration=7.599259696 podStartE2EDuration="7.599259696s" podCreationTimestamp="2025-12-02 09:43:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:43:28.579951703 +0000 UTC m=+1371.403825592" watchObservedRunningTime="2025-12-02 09:43:28.599259696 +0000 UTC m=+1371.423133575" Dec 02 09:43:28 crc kubenswrapper[4781]: I1202 09:43:28.617141 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-21eb-account-create-update-7w4wm" podStartSLOduration=7.617121601 podStartE2EDuration="7.617121601s" podCreationTimestamp="2025-12-02 09:43:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:43:28.609477338 +0000 UTC m=+1371.433351217" watchObservedRunningTime="2025-12-02 09:43:28.617121601 +0000 UTC m=+1371.440995480" Dec 02 09:43:28 crc kubenswrapper[4781]: I1202 09:43:28.632622 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-6tfv2" podStartSLOduration=7.632600493 podStartE2EDuration="7.632600493s" podCreationTimestamp="2025-12-02 09:43:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:43:28.620909502 +0000 UTC m=+1371.444783381" watchObservedRunningTime="2025-12-02 09:43:28.632600493 +0000 UTC m=+1371.456474372" Dec 02 09:43:28 crc kubenswrapper[4781]: I1202 09:43:28.677052 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-78xmv" podStartSLOduration=7.677032915 podStartE2EDuration="7.677032915s" podCreationTimestamp="2025-12-02 09:43:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:43:28.676570603 +0000 UTC m=+1371.500444492" watchObservedRunningTime="2025-12-02 09:43:28.677032915 +0000 UTC m=+1371.500906794" Dec 02 09:43:29 crc kubenswrapper[4781]: I1202 09:43:29.626149 4781 generic.go:334] "Generic (PLEG): container finished" podID="8118db55-54ba-4cf2-b80d-266872f87896" containerID="a6c37db8446be81eea490f38cf259d36dfe3264288300d2f3007d323fd5cc8b5" exitCode=0 Dec 02 09:43:29 crc kubenswrapper[4781]: I1202 09:43:29.626444 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6tfv2" event={"ID":"8118db55-54ba-4cf2-b80d-266872f87896","Type":"ContainerDied","Data":"a6c37db8446be81eea490f38cf259d36dfe3264288300d2f3007d323fd5cc8b5"} Dec 02 09:43:29 crc kubenswrapper[4781]: I1202 09:43:29.644586 4781 generic.go:334] "Generic (PLEG): container finished" podID="158568be-9a60-4a86-b480-e5c90939ed09" containerID="033c113454daf9fbb63a0f625966cc476d0d7c0541f94bd77b4ce6c94921b97c" exitCode=0 Dec 02 09:43:29 crc kubenswrapper[4781]: I1202 09:43:29.644667 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-78xmv" event={"ID":"158568be-9a60-4a86-b480-e5c90939ed09","Type":"ContainerDied","Data":"033c113454daf9fbb63a0f625966cc476d0d7c0541f94bd77b4ce6c94921b97c"} Dec 02 09:43:29 crc kubenswrapper[4781]: I1202 09:43:29.648021 4781 generic.go:334] "Generic (PLEG): container finished" podID="d8b40bb3-6c5d-47d5-9e15-439119be130a" 
containerID="e0ea675c144f0a00e40bfdbd2b47f08de472673f29409e4eef5c3833990d0dfa" exitCode=0 Dec 02 09:43:29 crc kubenswrapper[4781]: I1202 09:43:29.648099 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-l8cvh" event={"ID":"d8b40bb3-6c5d-47d5-9e15-439119be130a","Type":"ContainerDied","Data":"e0ea675c144f0a00e40bfdbd2b47f08de472673f29409e4eef5c3833990d0dfa"} Dec 02 09:43:29 crc kubenswrapper[4781]: I1202 09:43:29.661060 4781 generic.go:334] "Generic (PLEG): container finished" podID="ae522302-ae88-485a-a767-d6a6c6bf5205" containerID="562daa81b295a8b189f4611d5790fad69d2756ac33da3345e4cd29fdf13d4a85" exitCode=0 Dec 02 09:43:29 crc kubenswrapper[4781]: I1202 09:43:29.661147 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a8e2-account-create-update-g5tjv" event={"ID":"ae522302-ae88-485a-a767-d6a6c6bf5205","Type":"ContainerDied","Data":"562daa81b295a8b189f4611d5790fad69d2756ac33da3345e4cd29fdf13d4a85"} Dec 02 09:43:29 crc kubenswrapper[4781]: I1202 09:43:29.667336 4781 generic.go:334] "Generic (PLEG): container finished" podID="c457c92b-cc50-4e99-bf06-81d1b6cf3a7c" containerID="fa1c488c26c5961e60ec3cafe487ff7fdc7054a9dda307611a836a3ba3a14ad6" exitCode=0 Dec 02 09:43:29 crc kubenswrapper[4781]: I1202 09:43:29.667405 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-21eb-account-create-update-7w4wm" event={"ID":"c457c92b-cc50-4e99-bf06-81d1b6cf3a7c","Type":"ContainerDied","Data":"fa1c488c26c5961e60ec3cafe487ff7fdc7054a9dda307611a836a3ba3a14ad6"} Dec 02 09:43:29 crc kubenswrapper[4781]: I1202 09:43:29.669727 4781 generic.go:334] "Generic (PLEG): container finished" podID="f7ff12ef-2f31-42e6-b5e0-8b3172ac738b" containerID="908f120c3659c413a9d78d9d012afda775dc13af53b354e176171cfbbf0b676d" exitCode=0 Dec 02 09:43:29 crc kubenswrapper[4781]: I1202 09:43:29.669766 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-3506-account-create-update-qrvct" event={"ID":"f7ff12ef-2f31-42e6-b5e0-8b3172ac738b","Type":"ContainerDied","Data":"908f120c3659c413a9d78d9d012afda775dc13af53b354e176171cfbbf0b676d"} Dec 02 09:43:29 crc kubenswrapper[4781]: E1202 09:43:29.684079 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"object-server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"object-replicator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"object-auditor\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"object-updater\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"rsync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"swift-recon-cron\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\"]" pod="openstack/swift-storage-0" podUID="d8cf953d-c4c9-457e-956c-d2942b56499b" Dec 02 09:43:30 crc kubenswrapper[4781]: I1202 09:43:30.412577 4781 patch_prober.go:28] interesting 
pod/machine-config-daemon-pzntm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 09:43:30 crc kubenswrapper[4781]: I1202 09:43:30.412670 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.012097 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-21eb-account-create-update-7w4wm" Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.116659 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-3506-account-create-update-qrvct" Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.125764 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-l8cvh" Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.161743 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-78xmv" Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.185849 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvm5\" (UniqueName: \"kubernetes.io/projected/c457c92b-cc50-4e99-bf06-81d1b6cf3a7c-kube-api-access-ngvm5\") pod \"c457c92b-cc50-4e99-bf06-81d1b6cf3a7c\" (UID: \"c457c92b-cc50-4e99-bf06-81d1b6cf3a7c\") " Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.186121 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c457c92b-cc50-4e99-bf06-81d1b6cf3a7c-operator-scripts\") pod \"c457c92b-cc50-4e99-bf06-81d1b6cf3a7c\" (UID: \"c457c92b-cc50-4e99-bf06-81d1b6cf3a7c\") " Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.203574 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c457c92b-cc50-4e99-bf06-81d1b6cf3a7c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c457c92b-cc50-4e99-bf06-81d1b6cf3a7c" (UID: "c457c92b-cc50-4e99-bf06-81d1b6cf3a7c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.208170 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c457c92b-cc50-4e99-bf06-81d1b6cf3a7c-kube-api-access-ngvm5" (OuterVolumeSpecName: "kube-api-access-ngvm5") pod "c457c92b-cc50-4e99-bf06-81d1b6cf3a7c" (UID: "c457c92b-cc50-4e99-bf06-81d1b6cf3a7c"). InnerVolumeSpecName "kube-api-access-ngvm5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.237416 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-6tfv2" Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.248472 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-a8e2-account-create-update-g5tjv" Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.287393 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtjtq\" (UniqueName: \"kubernetes.io/projected/158568be-9a60-4a86-b480-e5c90939ed09-kube-api-access-dtjtq\") pod \"158568be-9a60-4a86-b480-e5c90939ed09\" (UID: \"158568be-9a60-4a86-b480-e5c90939ed09\") " Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.287435 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7ff12ef-2f31-42e6-b5e0-8b3172ac738b-operator-scripts\") pod \"f7ff12ef-2f31-42e6-b5e0-8b3172ac738b\" (UID: \"f7ff12ef-2f31-42e6-b5e0-8b3172ac738b\") " Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.287459 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5tld\" (UniqueName: \"kubernetes.io/projected/f7ff12ef-2f31-42e6-b5e0-8b3172ac738b-kube-api-access-d5tld\") pod \"f7ff12ef-2f31-42e6-b5e0-8b3172ac738b\" (UID: \"f7ff12ef-2f31-42e6-b5e0-8b3172ac738b\") " Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.287550 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/158568be-9a60-4a86-b480-e5c90939ed09-operator-scripts\") pod \"158568be-9a60-4a86-b480-e5c90939ed09\" (UID: \"158568be-9a60-4a86-b480-e5c90939ed09\") " Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.287617 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8b40bb3-6c5d-47d5-9e15-439119be130a-operator-scripts\") pod \"d8b40bb3-6c5d-47d5-9e15-439119be130a\" (UID: \"d8b40bb3-6c5d-47d5-9e15-439119be130a\") " Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.287661 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfxfp\" (UniqueName: \"kubernetes.io/projected/d8b40bb3-6c5d-47d5-9e15-439119be130a-kube-api-access-xfxfp\") pod \"d8b40bb3-6c5d-47d5-9e15-439119be130a\" (UID: \"d8b40bb3-6c5d-47d5-9e15-439119be130a\") " Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.287957 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvm5\" (UniqueName: \"kubernetes.io/projected/c457c92b-cc50-4e99-bf06-81d1b6cf3a7c-kube-api-access-ngvm5\") on node \"crc\" DevicePath \"\"" Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.287972 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c457c92b-cc50-4e99-bf06-81d1b6cf3a7c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.292859 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/158568be-9a60-4a86-b480-e5c90939ed09-kube-api-access-dtjtq" (OuterVolumeSpecName: "kube-api-access-dtjtq") pod "158568be-9a60-4a86-b480-e5c90939ed09" (UID: "158568be-9a60-4a86-b480-e5c90939ed09"). InnerVolumeSpecName "kube-api-access-dtjtq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.293484 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/158568be-9a60-4a86-b480-e5c90939ed09-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "158568be-9a60-4a86-b480-e5c90939ed09" (UID: "158568be-9a60-4a86-b480-e5c90939ed09"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.294037 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8b40bb3-6c5d-47d5-9e15-439119be130a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d8b40bb3-6c5d-47d5-9e15-439119be130a" (UID: "d8b40bb3-6c5d-47d5-9e15-439119be130a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.294527 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7ff12ef-2f31-42e6-b5e0-8b3172ac738b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f7ff12ef-2f31-42e6-b5e0-8b3172ac738b" (UID: "f7ff12ef-2f31-42e6-b5e0-8b3172ac738b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.310933 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8b40bb3-6c5d-47d5-9e15-439119be130a-kube-api-access-xfxfp" (OuterVolumeSpecName: "kube-api-access-xfxfp") pod "d8b40bb3-6c5d-47d5-9e15-439119be130a" (UID: "d8b40bb3-6c5d-47d5-9e15-439119be130a"). InnerVolumeSpecName "kube-api-access-xfxfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.317879 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7ff12ef-2f31-42e6-b5e0-8b3172ac738b-kube-api-access-d5tld" (OuterVolumeSpecName: "kube-api-access-d5tld") pod "f7ff12ef-2f31-42e6-b5e0-8b3172ac738b" (UID: "f7ff12ef-2f31-42e6-b5e0-8b3172ac738b"). InnerVolumeSpecName "kube-api-access-d5tld". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.389633 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae522302-ae88-485a-a767-d6a6c6bf5205-operator-scripts\") pod \"ae522302-ae88-485a-a767-d6a6c6bf5205\" (UID: \"ae522302-ae88-485a-a767-d6a6c6bf5205\") " Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.389735 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckdz4\" (UniqueName: \"kubernetes.io/projected/ae522302-ae88-485a-a767-d6a6c6bf5205-kube-api-access-ckdz4\") pod \"ae522302-ae88-485a-a767-d6a6c6bf5205\" (UID: \"ae522302-ae88-485a-a767-d6a6c6bf5205\") " Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.389762 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhq5k\" (UniqueName: \"kubernetes.io/projected/8118db55-54ba-4cf2-b80d-266872f87896-kube-api-access-xhq5k\") pod \"8118db55-54ba-4cf2-b80d-266872f87896\" (UID: \"8118db55-54ba-4cf2-b80d-266872f87896\") " Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.390569 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8118db55-54ba-4cf2-b80d-266872f87896-operator-scripts\") pod \"8118db55-54ba-4cf2-b80d-266872f87896\" (UID: \"8118db55-54ba-4cf2-b80d-266872f87896\") " Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.391160 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/158568be-9a60-4a86-b480-e5c90939ed09-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.391189 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8b40bb3-6c5d-47d5-9e15-439119be130a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.391203 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfxfp\" (UniqueName: \"kubernetes.io/projected/d8b40bb3-6c5d-47d5-9e15-439119be130a-kube-api-access-xfxfp\") on node \"crc\" DevicePath \"\"" Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.391225 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtjtq\" (UniqueName: \"kubernetes.io/projected/158568be-9a60-4a86-b480-e5c90939ed09-kube-api-access-dtjtq\") on node \"crc\" DevicePath \"\"" Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.391237 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7ff12ef-2f31-42e6-b5e0-8b3172ac738b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.391248 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5tld\" (UniqueName: \"kubernetes.io/projected/f7ff12ef-2f31-42e6-b5e0-8b3172ac738b-kube-api-access-d5tld\") on node \"crc\" DevicePath \"\"" Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.392139 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8118db55-54ba-4cf2-b80d-266872f87896-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8118db55-54ba-4cf2-b80d-266872f87896" (UID: "8118db55-54ba-4cf2-b80d-266872f87896"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.394965 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae522302-ae88-485a-a767-d6a6c6bf5205-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ae522302-ae88-485a-a767-d6a6c6bf5205" (UID: "ae522302-ae88-485a-a767-d6a6c6bf5205"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.395990 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8118db55-54ba-4cf2-b80d-266872f87896-kube-api-access-xhq5k" (OuterVolumeSpecName: "kube-api-access-xhq5k") pod "8118db55-54ba-4cf2-b80d-266872f87896" (UID: "8118db55-54ba-4cf2-b80d-266872f87896"). InnerVolumeSpecName "kube-api-access-xhq5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.404355 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae522302-ae88-485a-a767-d6a6c6bf5205-kube-api-access-ckdz4" (OuterVolumeSpecName: "kube-api-access-ckdz4") pod "ae522302-ae88-485a-a767-d6a6c6bf5205" (UID: "ae522302-ae88-485a-a767-d6a6c6bf5205"). InnerVolumeSpecName "kube-api-access-ckdz4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.493843 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckdz4\" (UniqueName: \"kubernetes.io/projected/ae522302-ae88-485a-a767-d6a6c6bf5205-kube-api-access-ckdz4\") on node \"crc\" DevicePath \"\"" Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.494416 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhq5k\" (UniqueName: \"kubernetes.io/projected/8118db55-54ba-4cf2-b80d-266872f87896-kube-api-access-xhq5k\") on node \"crc\" DevicePath \"\"" Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.494430 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8118db55-54ba-4cf2-b80d-266872f87896-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.494440 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae522302-ae88-485a-a767-d6a6c6bf5205-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.724272 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-78xmv" event={"ID":"158568be-9a60-4a86-b480-e5c90939ed09","Type":"ContainerDied","Data":"b6b40497cdb771fc339a499ca40c600771e662088b7bcc1dcf0d3c748244a136"} Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.724342 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6b40497cdb771fc339a499ca40c600771e662088b7bcc1dcf0d3c748244a136" Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.724282 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-78xmv" Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.726257 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-l8cvh" event={"ID":"d8b40bb3-6c5d-47d5-9e15-439119be130a","Type":"ContainerDied","Data":"51959b9bcaa0a0c9709c8833956afd3a08352d5e3cf887ee6d3e541c0b3eb697"} Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.726387 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51959b9bcaa0a0c9709c8833956afd3a08352d5e3cf887ee6d3e541c0b3eb697" Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.726342 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-l8cvh" Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.741067 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-5mr6j" event={"ID":"2075ad57-12d9-47c4-80ba-5bf9e1bce693","Type":"ContainerStarted","Data":"ef77710f70f761d7af1694d2e262bc01b7d43819a9686336064d20e7b0d27d2a"} Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.743384 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a8e2-account-create-update-g5tjv" event={"ID":"ae522302-ae88-485a-a767-d6a6c6bf5205","Type":"ContainerDied","Data":"807289327bf2619ea733025947105abb79197b373158f5544635763f6f19a8c8"} Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.743433 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="807289327bf2619ea733025947105abb79197b373158f5544635763f6f19a8c8" Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.743398 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a8e2-account-create-update-g5tjv" Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.746694 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-21eb-account-create-update-7w4wm" event={"ID":"c457c92b-cc50-4e99-bf06-81d1b6cf3a7c","Type":"ContainerDied","Data":"37868771e3ffe099212a5344c0b2900d259e87517b1b67c2051daf6e0375caa8"} Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.746734 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37868771e3ffe099212a5344c0b2900d259e87517b1b67c2051daf6e0375caa8" Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.746705 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-21eb-account-create-update-7w4wm" Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.748836 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-3506-account-create-update-qrvct" Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.748830 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-3506-account-create-update-qrvct" event={"ID":"f7ff12ef-2f31-42e6-b5e0-8b3172ac738b","Type":"ContainerDied","Data":"bc448207c7188927ee72c8aefb0207ab9032d3e8951803a8782961158633627d"} Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.749102 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc448207c7188927ee72c8aefb0207ab9032d3e8951803a8782961158633627d" Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.751988 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6tfv2" event={"ID":"8118db55-54ba-4cf2-b80d-266872f87896","Type":"ContainerDied","Data":"1c48f5df3123c4dae8f38dcca96cd232a12b282f2df17ae94179b3ab45951b9b"} Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.752020 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c48f5df3123c4dae8f38dcca96cd232a12b282f2df17ae94179b3ab45951b9b" Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.752080 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-6tfv2" Dec 02 09:43:34 crc kubenswrapper[4781]: I1202 09:43:34.766061 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-5mr6j" podStartSLOduration=7.144557372 podStartE2EDuration="13.766037714s" podCreationTimestamp="2025-12-02 09:43:21 +0000 UTC" firstStartedPulling="2025-12-02 09:43:27.383083588 +0000 UTC m=+1370.206957467" lastFinishedPulling="2025-12-02 09:43:34.00456393 +0000 UTC m=+1376.828437809" observedRunningTime="2025-12-02 09:43:34.762755456 +0000 UTC m=+1377.586629335" watchObservedRunningTime="2025-12-02 09:43:34.766037714 +0000 UTC m=+1377.589911593" Dec 02 09:43:41 crc kubenswrapper[4781]: I1202 09:43:41.811735 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-t6bqc" event={"ID":"ea3e7049-869e-4822-a11e-9eb2df6e3eeb","Type":"ContainerStarted","Data":"4edc7af81473d46dc2e001614f6757ba9772e247b0e410a508c19e71f50f260e"} Dec 02 09:43:41 crc kubenswrapper[4781]: I1202 09:43:41.813454 4781 generic.go:334] "Generic (PLEG): container finished" podID="2075ad57-12d9-47c4-80ba-5bf9e1bce693" containerID="ef77710f70f761d7af1694d2e262bc01b7d43819a9686336064d20e7b0d27d2a" exitCode=0 Dec 02 09:43:41 crc kubenswrapper[4781]: I1202 09:43:41.813480 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-5mr6j" event={"ID":"2075ad57-12d9-47c4-80ba-5bf9e1bce693","Type":"ContainerDied","Data":"ef77710f70f761d7af1694d2e262bc01b7d43819a9686336064d20e7b0d27d2a"} Dec 02 09:43:41 crc kubenswrapper[4781]: I1202 09:43:41.837290 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-t6bqc" podStartSLOduration=1.978594132 podStartE2EDuration="42.837269829s" podCreationTimestamp="2025-12-02 09:42:59 +0000 UTC" firstStartedPulling="2025-12-02 09:43:00.195852312 +0000 UTC m=+1343.019726191" lastFinishedPulling="2025-12-02 09:43:41.054528009 +0000 UTC m=+1383.878401888" observedRunningTime="2025-12-02 09:43:41.831236628 +0000 UTC m=+1384.655110507" watchObservedRunningTime="2025-12-02 09:43:41.837269829 +0000 UTC m=+1384.661143708" Dec 02 09:43:42 crc kubenswrapper[4781]: I1202 09:43:42.830973 4781 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/swift-storage-0" event={"ID":"d8cf953d-c4c9-457e-956c-d2942b56499b","Type":"ContainerStarted","Data":"aa7667db7018fe4749bf5dd3d01fd5f70d743cfa152f69d7f452d50b775f7907"} Dec 02 09:43:42 crc kubenswrapper[4781]: I1202 09:43:42.831310 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d8cf953d-c4c9-457e-956c-d2942b56499b","Type":"ContainerStarted","Data":"377dce0319f17d08cdb9c782100f507846f470d4f3352f19a96436e2c13dae1f"} Dec 02 09:43:43 crc kubenswrapper[4781]: I1202 09:43:43.132022 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-5mr6j" Dec 02 09:43:43 crc kubenswrapper[4781]: I1202 09:43:43.234114 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fckts\" (UniqueName: \"kubernetes.io/projected/2075ad57-12d9-47c4-80ba-5bf9e1bce693-kube-api-access-fckts\") pod \"2075ad57-12d9-47c4-80ba-5bf9e1bce693\" (UID: \"2075ad57-12d9-47c4-80ba-5bf9e1bce693\") " Dec 02 09:43:43 crc kubenswrapper[4781]: I1202 09:43:43.234219 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2075ad57-12d9-47c4-80ba-5bf9e1bce693-config-data\") pod \"2075ad57-12d9-47c4-80ba-5bf9e1bce693\" (UID: \"2075ad57-12d9-47c4-80ba-5bf9e1bce693\") " Dec 02 09:43:43 crc kubenswrapper[4781]: I1202 09:43:43.234281 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2075ad57-12d9-47c4-80ba-5bf9e1bce693-combined-ca-bundle\") pod \"2075ad57-12d9-47c4-80ba-5bf9e1bce693\" (UID: \"2075ad57-12d9-47c4-80ba-5bf9e1bce693\") " Dec 02 09:43:43 crc kubenswrapper[4781]: I1202 09:43:43.239350 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2075ad57-12d9-47c4-80ba-5bf9e1bce693-kube-api-access-fckts" (OuterVolumeSpecName: "kube-api-access-fckts") pod "2075ad57-12d9-47c4-80ba-5bf9e1bce693" (UID: "2075ad57-12d9-47c4-80ba-5bf9e1bce693"). InnerVolumeSpecName "kube-api-access-fckts". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:43:43 crc kubenswrapper[4781]: I1202 09:43:43.259775 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2075ad57-12d9-47c4-80ba-5bf9e1bce693-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2075ad57-12d9-47c4-80ba-5bf9e1bce693" (UID: "2075ad57-12d9-47c4-80ba-5bf9e1bce693"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:43:43 crc kubenswrapper[4781]: I1202 09:43:43.283069 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2075ad57-12d9-47c4-80ba-5bf9e1bce693-config-data" (OuterVolumeSpecName: "config-data") pod "2075ad57-12d9-47c4-80ba-5bf9e1bce693" (UID: "2075ad57-12d9-47c4-80ba-5bf9e1bce693"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:43:43 crc kubenswrapper[4781]: I1202 09:43:43.335873 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fckts\" (UniqueName: \"kubernetes.io/projected/2075ad57-12d9-47c4-80ba-5bf9e1bce693-kube-api-access-fckts\") on node \"crc\" DevicePath \"\"" Dec 02 09:43:43 crc kubenswrapper[4781]: I1202 09:43:43.335934 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2075ad57-12d9-47c4-80ba-5bf9e1bce693-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:43:43 crc kubenswrapper[4781]: I1202 09:43:43.335950 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2075ad57-12d9-47c4-80ba-5bf9e1bce693-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:43:43 crc kubenswrapper[4781]: I1202 09:43:43.851591 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-5mr6j" event={"ID":"2075ad57-12d9-47c4-80ba-5bf9e1bce693","Type":"ContainerDied","Data":"e1adfc3059cf325130aa3a214ccc3edb5d273cdee9c25f1dccc5cfc9e863a082"} Dec 02 09:43:43 crc kubenswrapper[4781]: I1202 09:43:43.851880 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1adfc3059cf325130aa3a214ccc3edb5d273cdee9c25f1dccc5cfc9e863a082" Dec 02 09:43:43 crc kubenswrapper[4781]: I1202 09:43:43.851960 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-5mr6j" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.050503 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-hp26d"] Dec 02 09:43:44 crc kubenswrapper[4781]: E1202 09:43:44.050902 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8b40bb3-6c5d-47d5-9e15-439119be130a" containerName="mariadb-database-create" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.050946 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8b40bb3-6c5d-47d5-9e15-439119be130a" containerName="mariadb-database-create" Dec 02 09:43:44 crc kubenswrapper[4781]: E1202 09:43:44.050969 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="158568be-9a60-4a86-b480-e5c90939ed09" containerName="mariadb-database-create" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.050977 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="158568be-9a60-4a86-b480-e5c90939ed09" containerName="mariadb-database-create" Dec 02 09:43:44 crc kubenswrapper[4781]: E1202 09:43:44.051000 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8118db55-54ba-4cf2-b80d-266872f87896" containerName="mariadb-database-create" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.051009 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="8118db55-54ba-4cf2-b80d-266872f87896" containerName="mariadb-database-create" Dec 02 09:43:44 crc kubenswrapper[4781]: E1202 09:43:44.051023 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7ff12ef-2f31-42e6-b5e0-8b3172ac738b" containerName="mariadb-account-create-update" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.051030 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7ff12ef-2f31-42e6-b5e0-8b3172ac738b" containerName="mariadb-account-create-update" Dec 02 09:43:44 crc kubenswrapper[4781]: E1202 09:43:44.051046 4781 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ae522302-ae88-485a-a767-d6a6c6bf5205" containerName="mariadb-account-create-update" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.051053 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae522302-ae88-485a-a767-d6a6c6bf5205" containerName="mariadb-account-create-update" Dec 02 09:43:44 crc kubenswrapper[4781]: E1202 09:43:44.051071 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c457c92b-cc50-4e99-bf06-81d1b6cf3a7c" containerName="mariadb-account-create-update" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.051079 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c457c92b-cc50-4e99-bf06-81d1b6cf3a7c" containerName="mariadb-account-create-update" Dec 02 09:43:44 crc kubenswrapper[4781]: E1202 09:43:44.051095 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2075ad57-12d9-47c4-80ba-5bf9e1bce693" containerName="keystone-db-sync" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.051104 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2075ad57-12d9-47c4-80ba-5bf9e1bce693" containerName="keystone-db-sync" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.051297 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8b40bb3-6c5d-47d5-9e15-439119be130a" containerName="mariadb-database-create" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.051314 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="c457c92b-cc50-4e99-bf06-81d1b6cf3a7c" containerName="mariadb-account-create-update" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.051329 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="8118db55-54ba-4cf2-b80d-266872f87896" containerName="mariadb-database-create" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.051339 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae522302-ae88-485a-a767-d6a6c6bf5205" containerName="mariadb-account-create-update" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.051359 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="158568be-9a60-4a86-b480-e5c90939ed09" containerName="mariadb-database-create" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.051367 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="2075ad57-12d9-47c4-80ba-5bf9e1bce693" containerName="keystone-db-sync" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.051379 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7ff12ef-2f31-42e6-b5e0-8b3172ac738b" containerName="mariadb-account-create-update" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.052019 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-hp26d" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.057601 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-q5lzd" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.063404 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.063607 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.063706 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.063838 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.107070 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-hp26d"] Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.148215 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2nsb\" (UniqueName: \"kubernetes.io/projected/ce4667b5-2519-4d80-9357-c54ce473d4ec-kube-api-access-c2nsb\") pod \"keystone-bootstrap-hp26d\" (UID: \"ce4667b5-2519-4d80-9357-c54ce473d4ec\") " pod="openstack/keystone-bootstrap-hp26d" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.148288 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce4667b5-2519-4d80-9357-c54ce473d4ec-config-data\") pod \"keystone-bootstrap-hp26d\" (UID: \"ce4667b5-2519-4d80-9357-c54ce473d4ec\") " pod="openstack/keystone-bootstrap-hp26d" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.148373 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ce4667b5-2519-4d80-9357-c54ce473d4ec-fernet-keys\") pod \"keystone-bootstrap-hp26d\" (UID: \"ce4667b5-2519-4d80-9357-c54ce473d4ec\") " pod="openstack/keystone-bootstrap-hp26d" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.148421 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-s662m"] Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.148466 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce4667b5-2519-4d80-9357-c54ce473d4ec-scripts\") pod \"keystone-bootstrap-hp26d\" (UID: \"ce4667b5-2519-4d80-9357-c54ce473d4ec\") " pod="openstack/keystone-bootstrap-hp26d" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.148646 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce4667b5-2519-4d80-9357-c54ce473d4ec-combined-ca-bundle\") pod \"keystone-bootstrap-hp26d\" (UID: \"ce4667b5-2519-4d80-9357-c54ce473d4ec\") " pod="openstack/keystone-bootstrap-hp26d" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.148775 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ce4667b5-2519-4d80-9357-c54ce473d4ec-credential-keys\") pod \"keystone-bootstrap-hp26d\" (UID: \"ce4667b5-2519-4d80-9357-c54ce473d4ec\") " 
pod="openstack/keystone-bootstrap-hp26d" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.153355 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f877ddd87-s662m" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.267872 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e2bb3c1-1183-46ac-bb83-c552bfb7a874-dns-svc\") pod \"dnsmasq-dns-f877ddd87-s662m\" (UID: \"6e2bb3c1-1183-46ac-bb83-c552bfb7a874\") " pod="openstack/dnsmasq-dns-f877ddd87-s662m" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.267939 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce4667b5-2519-4d80-9357-c54ce473d4ec-combined-ca-bundle\") pod \"keystone-bootstrap-hp26d\" (UID: \"ce4667b5-2519-4d80-9357-c54ce473d4ec\") " pod="openstack/keystone-bootstrap-hp26d" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.267969 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l85b\" (UniqueName: \"kubernetes.io/projected/6e2bb3c1-1183-46ac-bb83-c552bfb7a874-kube-api-access-9l85b\") pod \"dnsmasq-dns-f877ddd87-s662m\" (UID: \"6e2bb3c1-1183-46ac-bb83-c552bfb7a874\") " pod="openstack/dnsmasq-dns-f877ddd87-s662m" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.268019 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e2bb3c1-1183-46ac-bb83-c552bfb7a874-ovsdbserver-nb\") pod \"dnsmasq-dns-f877ddd87-s662m\" (UID: \"6e2bb3c1-1183-46ac-bb83-c552bfb7a874\") " pod="openstack/dnsmasq-dns-f877ddd87-s662m" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.268042 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ce4667b5-2519-4d80-9357-c54ce473d4ec-credential-keys\") pod \"keystone-bootstrap-hp26d\" (UID: \"ce4667b5-2519-4d80-9357-c54ce473d4ec\") " pod="openstack/keystone-bootstrap-hp26d" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.268105 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e2bb3c1-1183-46ac-bb83-c552bfb7a874-config\") pod \"dnsmasq-dns-f877ddd87-s662m\" (UID: \"6e2bb3c1-1183-46ac-bb83-c552bfb7a874\") " pod="openstack/dnsmasq-dns-f877ddd87-s662m" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.268133 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2nsb\" (UniqueName: \"kubernetes.io/projected/ce4667b5-2519-4d80-9357-c54ce473d4ec-kube-api-access-c2nsb\") pod \"keystone-bootstrap-hp26d\" (UID: \"ce4667b5-2519-4d80-9357-c54ce473d4ec\") " pod="openstack/keystone-bootstrap-hp26d" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.268152 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce4667b5-2519-4d80-9357-c54ce473d4ec-config-data\") pod \"keystone-bootstrap-hp26d\" (UID: \"ce4667b5-2519-4d80-9357-c54ce473d4ec\") " pod="openstack/keystone-bootstrap-hp26d" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.268176 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/ce4667b5-2519-4d80-9357-c54ce473d4ec-fernet-keys\") pod \"keystone-bootstrap-hp26d\" (UID: \"ce4667b5-2519-4d80-9357-c54ce473d4ec\") " pod="openstack/keystone-bootstrap-hp26d" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.268195 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce4667b5-2519-4d80-9357-c54ce473d4ec-scripts\") pod \"keystone-bootstrap-hp26d\" (UID: \"ce4667b5-2519-4d80-9357-c54ce473d4ec\") " pod="openstack/keystone-bootstrap-hp26d" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.268223 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e2bb3c1-1183-46ac-bb83-c552bfb7a874-ovsdbserver-sb\") pod \"dnsmasq-dns-f877ddd87-s662m\" (UID: \"6e2bb3c1-1183-46ac-bb83-c552bfb7a874\") " pod="openstack/dnsmasq-dns-f877ddd87-s662m" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.276638 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-s662m"] Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.277537 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce4667b5-2519-4d80-9357-c54ce473d4ec-config-data\") pod \"keystone-bootstrap-hp26d\" (UID: \"ce4667b5-2519-4d80-9357-c54ce473d4ec\") " pod="openstack/keystone-bootstrap-hp26d" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.285774 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce4667b5-2519-4d80-9357-c54ce473d4ec-scripts\") pod \"keystone-bootstrap-hp26d\" (UID: \"ce4667b5-2519-4d80-9357-c54ce473d4ec\") " pod="openstack/keystone-bootstrap-hp26d" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.295797 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce4667b5-2519-4d80-9357-c54ce473d4ec-combined-ca-bundle\") pod \"keystone-bootstrap-hp26d\" (UID: \"ce4667b5-2519-4d80-9357-c54ce473d4ec\") " pod="openstack/keystone-bootstrap-hp26d" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.297130 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ce4667b5-2519-4d80-9357-c54ce473d4ec-fernet-keys\") pod \"keystone-bootstrap-hp26d\" (UID: \"ce4667b5-2519-4d80-9357-c54ce473d4ec\") " pod="openstack/keystone-bootstrap-hp26d" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.317856 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-55c985789-sc66d"] Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.334391 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-55c985789-sc66d" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.346626 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ce4667b5-2519-4d80-9357-c54ce473d4ec-credential-keys\") pod \"keystone-bootstrap-hp26d\" (UID: \"ce4667b5-2519-4d80-9357-c54ce473d4ec\") " pod="openstack/keystone-bootstrap-hp26d" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.347083 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2nsb\" (UniqueName: \"kubernetes.io/projected/ce4667b5-2519-4d80-9357-c54ce473d4ec-kube-api-access-c2nsb\") pod \"keystone-bootstrap-hp26d\" (UID: \"ce4667b5-2519-4d80-9357-c54ce473d4ec\") " pod="openstack/keystone-bootstrap-hp26d" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.363457 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-55c985789-sc66d"] Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.367020 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.367226 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.367445 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.367688 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-xst5d" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.377463 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xv8c\" (UniqueName: \"kubernetes.io/projected/8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151-kube-api-access-4xv8c\") pod \"horizon-55c985789-sc66d\" (UID: \"8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151\") " pod="openstack/horizon-55c985789-sc66d" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.377677 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e2bb3c1-1183-46ac-bb83-c552bfb7a874-ovsdbserver-nb\") pod \"dnsmasq-dns-f877ddd87-s662m\" (UID: \"6e2bb3c1-1183-46ac-bb83-c552bfb7a874\") " pod="openstack/dnsmasq-dns-f877ddd87-s662m" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.377830 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151-scripts\") pod \"horizon-55c985789-sc66d\" (UID: \"8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151\") " pod="openstack/horizon-55c985789-sc66d" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.377949 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151-config-data\") pod \"horizon-55c985789-sc66d\" (UID: \"8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151\") " pod="openstack/horizon-55c985789-sc66d" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.378083 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e2bb3c1-1183-46ac-bb83-c552bfb7a874-config\") pod \"dnsmasq-dns-f877ddd87-s662m\" (UID: \"6e2bb3c1-1183-46ac-bb83-c552bfb7a874\") " 
pod="openstack/dnsmasq-dns-f877ddd87-s662m" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.378224 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e2bb3c1-1183-46ac-bb83-c552bfb7a874-ovsdbserver-sb\") pod \"dnsmasq-dns-f877ddd87-s662m\" (UID: \"6e2bb3c1-1183-46ac-bb83-c552bfb7a874\") " pod="openstack/dnsmasq-dns-f877ddd87-s662m" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.378386 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e2bb3c1-1183-46ac-bb83-c552bfb7a874-dns-svc\") pod \"dnsmasq-dns-f877ddd87-s662m\" (UID: \"6e2bb3c1-1183-46ac-bb83-c552bfb7a874\") " pod="openstack/dnsmasq-dns-f877ddd87-s662m" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.378489 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9l85b\" (UniqueName: \"kubernetes.io/projected/6e2bb3c1-1183-46ac-bb83-c552bfb7a874-kube-api-access-9l85b\") pod \"dnsmasq-dns-f877ddd87-s662m\" (UID: \"6e2bb3c1-1183-46ac-bb83-c552bfb7a874\") " pod="openstack/dnsmasq-dns-f877ddd87-s662m" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.378599 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151-logs\") pod \"horizon-55c985789-sc66d\" (UID: \"8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151\") " pod="openstack/horizon-55c985789-sc66d" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.378703 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151-horizon-secret-key\") pod \"horizon-55c985789-sc66d\" (UID: \"8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151\") " pod="openstack/horizon-55c985789-sc66d" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.380733 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e2bb3c1-1183-46ac-bb83-c552bfb7a874-config\") pod \"dnsmasq-dns-f877ddd87-s662m\" (UID: \"6e2bb3c1-1183-46ac-bb83-c552bfb7a874\") " pod="openstack/dnsmasq-dns-f877ddd87-s662m" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.386565 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e2bb3c1-1183-46ac-bb83-c552bfb7a874-ovsdbserver-nb\") pod \"dnsmasq-dns-f877ddd87-s662m\" (UID: \"6e2bb3c1-1183-46ac-bb83-c552bfb7a874\") " pod="openstack/dnsmasq-dns-f877ddd87-s662m" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.386992 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e2bb3c1-1183-46ac-bb83-c552bfb7a874-ovsdbserver-sb\") pod \"dnsmasq-dns-f877ddd87-s662m\" (UID: \"6e2bb3c1-1183-46ac-bb83-c552bfb7a874\") " pod="openstack/dnsmasq-dns-f877ddd87-s662m" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.387507 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-hp26d" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.409435 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e2bb3c1-1183-46ac-bb83-c552bfb7a874-dns-svc\") pod \"dnsmasq-dns-f877ddd87-s662m\" (UID: \"6e2bb3c1-1183-46ac-bb83-c552bfb7a874\") " pod="openstack/dnsmasq-dns-f877ddd87-s662m" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.450009 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-mfnct"] Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.451308 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-mfnct" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.456522 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l85b\" (UniqueName: \"kubernetes.io/projected/6e2bb3c1-1183-46ac-bb83-c552bfb7a874-kube-api-access-9l85b\") pod \"dnsmasq-dns-f877ddd87-s662m\" (UID: \"6e2bb3c1-1183-46ac-bb83-c552bfb7a874\") " pod="openstack/dnsmasq-dns-f877ddd87-s662m" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.476831 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.477029 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-ln2qp" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.477151 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.486402 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151-scripts\") pod \"horizon-55c985789-sc66d\" (UID: \"8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151\") " pod="openstack/horizon-55c985789-sc66d" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.486449 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151-config-data\") pod \"horizon-55c985789-sc66d\" (UID: \"8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151\") " pod="openstack/horizon-55c985789-sc66d" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.486625 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8e968b8-6eb0-41d7-beed-8b2bf7006359-combined-ca-bundle\") pod \"neutron-db-sync-mfnct\" (UID: \"c8e968b8-6eb0-41d7-beed-8b2bf7006359\") " pod="openstack/neutron-db-sync-mfnct" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.486654 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c8e968b8-6eb0-41d7-beed-8b2bf7006359-config\") pod \"neutron-db-sync-mfnct\" (UID: \"c8e968b8-6eb0-41d7-beed-8b2bf7006359\") " pod="openstack/neutron-db-sync-mfnct" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.486679 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151-logs\") pod \"horizon-55c985789-sc66d\" (UID: \"8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151\") " pod="openstack/horizon-55c985789-sc66d" Dec 02 09:43:44 crc 
kubenswrapper[4781]: I1202 09:43:44.486703 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151-horizon-secret-key\") pod \"horizon-55c985789-sc66d\" (UID: \"8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151\") " pod="openstack/horizon-55c985789-sc66d"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.486766 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xv8c\" (UniqueName: \"kubernetes.io/projected/8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151-kube-api-access-4xv8c\") pod \"horizon-55c985789-sc66d\" (UID: \"8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151\") " pod="openstack/horizon-55c985789-sc66d"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.486795 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgk85\" (UniqueName: \"kubernetes.io/projected/c8e968b8-6eb0-41d7-beed-8b2bf7006359-kube-api-access-wgk85\") pod \"neutron-db-sync-mfnct\" (UID: \"c8e968b8-6eb0-41d7-beed-8b2bf7006359\") " pod="openstack/neutron-db-sync-mfnct"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.487586 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151-scripts\") pod \"horizon-55c985789-sc66d\" (UID: \"8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151\") " pod="openstack/horizon-55c985789-sc66d"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.488501 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151-config-data\") pod \"horizon-55c985789-sc66d\" (UID: \"8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151\") " pod="openstack/horizon-55c985789-sc66d"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.489282 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151-logs\") pod \"horizon-55c985789-sc66d\" (UID: \"8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151\") " pod="openstack/horizon-55c985789-sc66d"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.497595 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151-horizon-secret-key\") pod \"horizon-55c985789-sc66d\" (UID: \"8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151\") " pod="openstack/horizon-55c985789-sc66d"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.516621 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f877ddd87-s662m"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.522491 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-mfnct"]
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.547552 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xv8c\" (UniqueName: \"kubernetes.io/projected/8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151-kube-api-access-4xv8c\") pod \"horizon-55c985789-sc66d\" (UID: \"8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151\") " pod="openstack/horizon-55c985789-sc66d"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.584682 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-m7kpn"]
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.587689 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-m7kpn"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.593131 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgk85\" (UniqueName: \"kubernetes.io/projected/c8e968b8-6eb0-41d7-beed-8b2bf7006359-kube-api-access-wgk85\") pod \"neutron-db-sync-mfnct\" (UID: \"c8e968b8-6eb0-41d7-beed-8b2bf7006359\") " pod="openstack/neutron-db-sync-mfnct"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.593414 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8e968b8-6eb0-41d7-beed-8b2bf7006359-combined-ca-bundle\") pod \"neutron-db-sync-mfnct\" (UID: \"c8e968b8-6eb0-41d7-beed-8b2bf7006359\") " pod="openstack/neutron-db-sync-mfnct"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.593479 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c8e968b8-6eb0-41d7-beed-8b2bf7006359-config\") pod \"neutron-db-sync-mfnct\" (UID: \"c8e968b8-6eb0-41d7-beed-8b2bf7006359\") " pod="openstack/neutron-db-sync-mfnct"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.598023 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.598243 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.598392 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-8p898"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.609727 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c8e968b8-6eb0-41d7-beed-8b2bf7006359-config\") pod \"neutron-db-sync-mfnct\" (UID: \"c8e968b8-6eb0-41d7-beed-8b2bf7006359\") " pod="openstack/neutron-db-sync-mfnct"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.614309 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.614904 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8e968b8-6eb0-41d7-beed-8b2bf7006359-combined-ca-bundle\") pod \"neutron-db-sync-mfnct\" (UID: \"c8e968b8-6eb0-41d7-beed-8b2bf7006359\") " pod="openstack/neutron-db-sync-mfnct"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.620743 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.635192 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.635840 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.653853 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-s662m"]
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.663022 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgk85\" (UniqueName: \"kubernetes.io/projected/c8e968b8-6eb0-41d7-beed-8b2bf7006359-kube-api-access-wgk85\") pod \"neutron-db-sync-mfnct\" (UID: \"c8e968b8-6eb0-41d7-beed-8b2bf7006359\") " pod="openstack/neutron-db-sync-mfnct"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.677458 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-m7kpn"]
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.689576 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.708914 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c98b4bc6-c086-4663-b794-92a36b0da2ba-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c98b4bc6-c086-4663-b794-92a36b0da2ba\") " pod="openstack/ceilometer-0"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.708979 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c98b4bc6-c086-4663-b794-92a36b0da2ba-config-data\") pod \"ceilometer-0\" (UID: \"c98b4bc6-c086-4663-b794-92a36b0da2ba\") " pod="openstack/ceilometer-0"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.709037 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c98b4bc6-c086-4663-b794-92a36b0da2ba-run-httpd\") pod \"ceilometer-0\" (UID: \"c98b4bc6-c086-4663-b794-92a36b0da2ba\") " pod="openstack/ceilometer-0"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.709068 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c98b4bc6-c086-4663-b794-92a36b0da2ba-scripts\") pod \"ceilometer-0\" (UID: \"c98b4bc6-c086-4663-b794-92a36b0da2ba\") " pod="openstack/ceilometer-0"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.709098 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8tv4\" (UniqueName: \"kubernetes.io/projected/c98b4bc6-c086-4663-b794-92a36b0da2ba-kube-api-access-f8tv4\") pod \"ceilometer-0\" (UID: \"c98b4bc6-c086-4663-b794-92a36b0da2ba\") " pod="openstack/ceilometer-0"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.709173 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c98b4bc6-c086-4663-b794-92a36b0da2ba-log-httpd\") pod \"ceilometer-0\" (UID: \"c98b4bc6-c086-4663-b794-92a36b0da2ba\") " pod="openstack/ceilometer-0"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.709199 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af2e4c8d-f431-487f-8b70-a2b6e6ee6000-config-data\") pod \"placement-db-sync-m7kpn\" (UID: \"af2e4c8d-f431-487f-8b70-a2b6e6ee6000\") " pod="openstack/placement-db-sync-m7kpn"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.709217 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af2e4c8d-f431-487f-8b70-a2b6e6ee6000-combined-ca-bundle\") pod \"placement-db-sync-m7kpn\" (UID: \"af2e4c8d-f431-487f-8b70-a2b6e6ee6000\") " pod="openstack/placement-db-sync-m7kpn"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.709270 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4hqp\" (UniqueName: \"kubernetes.io/projected/af2e4c8d-f431-487f-8b70-a2b6e6ee6000-kube-api-access-j4hqp\") pod \"placement-db-sync-m7kpn\" (UID: \"af2e4c8d-f431-487f-8b70-a2b6e6ee6000\") " pod="openstack/placement-db-sync-m7kpn"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.709350 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af2e4c8d-f431-487f-8b70-a2b6e6ee6000-scripts\") pod \"placement-db-sync-m7kpn\" (UID: \"af2e4c8d-f431-487f-8b70-a2b6e6ee6000\") " pod="openstack/placement-db-sync-m7kpn"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.709390 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c98b4bc6-c086-4663-b794-92a36b0da2ba-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c98b4bc6-c086-4663-b794-92a36b0da2ba\") " pod="openstack/ceilometer-0"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.709465 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af2e4c8d-f431-487f-8b70-a2b6e6ee6000-logs\") pod \"placement-db-sync-m7kpn\" (UID: \"af2e4c8d-f431-487f-8b70-a2b6e6ee6000\") " pod="openstack/placement-db-sync-m7kpn"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.753887 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-bq87j"]
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.756197 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-bq87j"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.766507 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.767759 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-bq87j"]
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.776341 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.778002 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-w4npc"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.795361 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-55c985789-sc66d"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.814216 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9418aedd-84eb-4ff9-89f8-831695e5471e-scripts\") pod \"cinder-db-sync-bq87j\" (UID: \"9418aedd-84eb-4ff9-89f8-831695e5471e\") " pod="openstack/cinder-db-sync-bq87j"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.814296 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9418aedd-84eb-4ff9-89f8-831695e5471e-etc-machine-id\") pod \"cinder-db-sync-bq87j\" (UID: \"9418aedd-84eb-4ff9-89f8-831695e5471e\") " pod="openstack/cinder-db-sync-bq87j"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.814348 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4xvv\" (UniqueName: \"kubernetes.io/projected/9418aedd-84eb-4ff9-89f8-831695e5471e-kube-api-access-h4xvv\") pod \"cinder-db-sync-bq87j\" (UID: \"9418aedd-84eb-4ff9-89f8-831695e5471e\") " pod="openstack/cinder-db-sync-bq87j"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.814378 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c98b4bc6-c086-4663-b794-92a36b0da2ba-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c98b4bc6-c086-4663-b794-92a36b0da2ba\") " pod="openstack/ceilometer-0"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.814393 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c98b4bc6-c086-4663-b794-92a36b0da2ba-config-data\") pod \"ceilometer-0\" (UID: \"c98b4bc6-c086-4663-b794-92a36b0da2ba\") " pod="openstack/ceilometer-0"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.814409 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9418aedd-84eb-4ff9-89f8-831695e5471e-config-data\") pod \"cinder-db-sync-bq87j\" (UID: \"9418aedd-84eb-4ff9-89f8-831695e5471e\") " pod="openstack/cinder-db-sync-bq87j"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.814455 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c98b4bc6-c086-4663-b794-92a36b0da2ba-run-httpd\") pod \"ceilometer-0\" (UID: \"c98b4bc6-c086-4663-b794-92a36b0da2ba\") " pod="openstack/ceilometer-0"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.814472 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c98b4bc6-c086-4663-b794-92a36b0da2ba-scripts\") pod \"ceilometer-0\" (UID: \"c98b4bc6-c086-4663-b794-92a36b0da2ba\") " pod="openstack/ceilometer-0"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.814502 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8tv4\" (UniqueName: \"kubernetes.io/projected/c98b4bc6-c086-4663-b794-92a36b0da2ba-kube-api-access-f8tv4\") pod \"ceilometer-0\" (UID: \"c98b4bc6-c086-4663-b794-92a36b0da2ba\") " pod="openstack/ceilometer-0"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.814536 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9418aedd-84eb-4ff9-89f8-831695e5471e-combined-ca-bundle\") pod \"cinder-db-sync-bq87j\" (UID: \"9418aedd-84eb-4ff9-89f8-831695e5471e\") " pod="openstack/cinder-db-sync-bq87j"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.814587 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c98b4bc6-c086-4663-b794-92a36b0da2ba-log-httpd\") pod \"ceilometer-0\" (UID: \"c98b4bc6-c086-4663-b794-92a36b0da2ba\") " pod="openstack/ceilometer-0"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.814603 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af2e4c8d-f431-487f-8b70-a2b6e6ee6000-config-data\") pod \"placement-db-sync-m7kpn\" (UID: \"af2e4c8d-f431-487f-8b70-a2b6e6ee6000\") " pod="openstack/placement-db-sync-m7kpn"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.814619 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af2e4c8d-f431-487f-8b70-a2b6e6ee6000-combined-ca-bundle\") pod \"placement-db-sync-m7kpn\" (UID: \"af2e4c8d-f431-487f-8b70-a2b6e6ee6000\") " pod="openstack/placement-db-sync-m7kpn"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.814652 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9418aedd-84eb-4ff9-89f8-831695e5471e-db-sync-config-data\") pod \"cinder-db-sync-bq87j\" (UID: \"9418aedd-84eb-4ff9-89f8-831695e5471e\") " pod="openstack/cinder-db-sync-bq87j"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.814671 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4hqp\" (UniqueName: \"kubernetes.io/projected/af2e4c8d-f431-487f-8b70-a2b6e6ee6000-kube-api-access-j4hqp\") pod \"placement-db-sync-m7kpn\" (UID: \"af2e4c8d-f431-487f-8b70-a2b6e6ee6000\") " pod="openstack/placement-db-sync-m7kpn"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.814710 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af2e4c8d-f431-487f-8b70-a2b6e6ee6000-scripts\") pod \"placement-db-sync-m7kpn\" (UID: \"af2e4c8d-f431-487f-8b70-a2b6e6ee6000\") " pod="openstack/placement-db-sync-m7kpn"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.814746 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c98b4bc6-c086-4663-b794-92a36b0da2ba-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c98b4bc6-c086-4663-b794-92a36b0da2ba\") " pod="openstack/ceilometer-0"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.814781 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af2e4c8d-f431-487f-8b70-a2b6e6ee6000-logs\") pod \"placement-db-sync-m7kpn\" (UID: \"af2e4c8d-f431-487f-8b70-a2b6e6ee6000\") " pod="openstack/placement-db-sync-m7kpn"
Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.815308 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af2e4c8d-f431-487f-8b70-a2b6e6ee6000-logs\") pod \"placement-db-sync-m7kpn\" (UID: \"af2e4c8d-f431-487f-8b70-a2b6e6ee6000\") " pod="openstack/placement-db-sync-m7kpn"
pod="openstack/placement-db-sync-m7kpn" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.816093 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c98b4bc6-c086-4663-b794-92a36b0da2ba-log-httpd\") pod \"ceilometer-0\" (UID: \"c98b4bc6-c086-4663-b794-92a36b0da2ba\") " pod="openstack/ceilometer-0" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.821486 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c98b4bc6-c086-4663-b794-92a36b0da2ba-config-data\") pod \"ceilometer-0\" (UID: \"c98b4bc6-c086-4663-b794-92a36b0da2ba\") " pod="openstack/ceilometer-0" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.822573 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c98b4bc6-c086-4663-b794-92a36b0da2ba-run-httpd\") pod \"ceilometer-0\" (UID: \"c98b4bc6-c086-4663-b794-92a36b0da2ba\") " pod="openstack/ceilometer-0" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.826565 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-6cdh4"] Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.828098 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-mfnct" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.836659 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af2e4c8d-f431-487f-8b70-a2b6e6ee6000-scripts\") pod \"placement-db-sync-m7kpn\" (UID: \"af2e4c8d-f431-487f-8b70-a2b6e6ee6000\") " pod="openstack/placement-db-sync-m7kpn" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.837762 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c98b4bc6-c086-4663-b794-92a36b0da2ba-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c98b4bc6-c086-4663-b794-92a36b0da2ba\") " pod="openstack/ceilometer-0" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.841345 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68dcc9cf6f-6cdh4" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.843214 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c98b4bc6-c086-4663-b794-92a36b0da2ba-scripts\") pod \"ceilometer-0\" (UID: \"c98b4bc6-c086-4663-b794-92a36b0da2ba\") " pod="openstack/ceilometer-0" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.843845 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af2e4c8d-f431-487f-8b70-a2b6e6ee6000-config-data\") pod \"placement-db-sync-m7kpn\" (UID: \"af2e4c8d-f431-487f-8b70-a2b6e6ee6000\") " pod="openstack/placement-db-sync-m7kpn" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.850398 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c98b4bc6-c086-4663-b794-92a36b0da2ba-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c98b4bc6-c086-4663-b794-92a36b0da2ba\") " pod="openstack/ceilometer-0" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.850466 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-6cdh4"] Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.865916 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af2e4c8d-f431-487f-8b70-a2b6e6ee6000-combined-ca-bundle\") pod \"placement-db-sync-m7kpn\" (UID: \"af2e4c8d-f431-487f-8b70-a2b6e6ee6000\") " pod="openstack/placement-db-sync-m7kpn" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.875793 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-br5j5"] Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.877479 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-br5j5" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.882830 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-br5j5"] Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.885690 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-8phcb" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.885894 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.897411 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8tv4\" (UniqueName: \"kubernetes.io/projected/c98b4bc6-c086-4663-b794-92a36b0da2ba-kube-api-access-f8tv4\") pod \"ceilometer-0\" (UID: \"c98b4bc6-c086-4663-b794-92a36b0da2ba\") " pod="openstack/ceilometer-0" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.912580 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4hqp\" (UniqueName: \"kubernetes.io/projected/af2e4c8d-f431-487f-8b70-a2b6e6ee6000-kube-api-access-j4hqp\") pod \"placement-db-sync-m7kpn\" (UID: \"af2e4c8d-f431-487f-8b70-a2b6e6ee6000\") " pod="openstack/placement-db-sync-m7kpn" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.915883 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9418aedd-84eb-4ff9-89f8-831695e5471e-db-sync-config-data\") pod \"cinder-db-sync-bq87j\" (UID: \"9418aedd-84eb-4ff9-89f8-831695e5471e\") " pod="openstack/cinder-db-sync-bq87j" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.915979 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39e827e8-c246-4099-8dd5-edede022aa47-ovsdbserver-sb\") pod \"dnsmasq-dns-68dcc9cf6f-6cdh4\" (UID: \"39e827e8-c246-4099-8dd5-edede022aa47\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-6cdh4" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.916017 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rq2b\" (UniqueName: \"kubernetes.io/projected/39e827e8-c246-4099-8dd5-edede022aa47-kube-api-access-2rq2b\") pod \"dnsmasq-dns-68dcc9cf6f-6cdh4\" (UID: \"39e827e8-c246-4099-8dd5-edede022aa47\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-6cdh4" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.916036 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39e827e8-c246-4099-8dd5-edede022aa47-dns-svc\") pod \"dnsmasq-dns-68dcc9cf6f-6cdh4\" (UID: \"39e827e8-c246-4099-8dd5-edede022aa47\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-6cdh4" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.916102 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9418aedd-84eb-4ff9-89f8-831695e5471e-scripts\") pod \"cinder-db-sync-bq87j\" (UID: \"9418aedd-84eb-4ff9-89f8-831695e5471e\") " pod="openstack/cinder-db-sync-bq87j" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.916134 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39e827e8-c246-4099-8dd5-edede022aa47-config\") pod 
\"dnsmasq-dns-68dcc9cf6f-6cdh4\" (UID: \"39e827e8-c246-4099-8dd5-edede022aa47\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-6cdh4" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.916158 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9418aedd-84eb-4ff9-89f8-831695e5471e-etc-machine-id\") pod \"cinder-db-sync-bq87j\" (UID: \"9418aedd-84eb-4ff9-89f8-831695e5471e\") " pod="openstack/cinder-db-sync-bq87j" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.916179 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39e827e8-c246-4099-8dd5-edede022aa47-ovsdbserver-nb\") pod \"dnsmasq-dns-68dcc9cf6f-6cdh4\" (UID: \"39e827e8-c246-4099-8dd5-edede022aa47\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-6cdh4" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.916225 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4xvv\" (UniqueName: \"kubernetes.io/projected/9418aedd-84eb-4ff9-89f8-831695e5471e-kube-api-access-h4xvv\") pod \"cinder-db-sync-bq87j\" (UID: \"9418aedd-84eb-4ff9-89f8-831695e5471e\") " pod="openstack/cinder-db-sync-bq87j" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.916250 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9418aedd-84eb-4ff9-89f8-831695e5471e-config-data\") pod \"cinder-db-sync-bq87j\" (UID: \"9418aedd-84eb-4ff9-89f8-831695e5471e\") " pod="openstack/cinder-db-sync-bq87j" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.916304 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9418aedd-84eb-4ff9-89f8-831695e5471e-combined-ca-bundle\") pod \"cinder-db-sync-bq87j\" (UID: \"9418aedd-84eb-4ff9-89f8-831695e5471e\") " pod="openstack/cinder-db-sync-bq87j" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.916345 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-78d5dd47fc-gnfbl"] Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.917764 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-78d5dd47fc-gnfbl" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.918848 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9418aedd-84eb-4ff9-89f8-831695e5471e-etc-machine-id\") pod \"cinder-db-sync-bq87j\" (UID: \"9418aedd-84eb-4ff9-89f8-831695e5471e\") " pod="openstack/cinder-db-sync-bq87j" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.937970 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9418aedd-84eb-4ff9-89f8-831695e5471e-config-data\") pod \"cinder-db-sync-bq87j\" (UID: \"9418aedd-84eb-4ff9-89f8-831695e5471e\") " pod="openstack/cinder-db-sync-bq87j" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.952867 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9418aedd-84eb-4ff9-89f8-831695e5471e-db-sync-config-data\") pod \"cinder-db-sync-bq87j\" (UID: \"9418aedd-84eb-4ff9-89f8-831695e5471e\") " pod="openstack/cinder-db-sync-bq87j" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.960328 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9418aedd-84eb-4ff9-89f8-831695e5471e-combined-ca-bundle\") pod \"cinder-db-sync-bq87j\" (UID: \"9418aedd-84eb-4ff9-89f8-831695e5471e\") " pod="openstack/cinder-db-sync-bq87j" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.960417 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-78d5dd47fc-gnfbl"] Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.961128 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9418aedd-84eb-4ff9-89f8-831695e5471e-scripts\") pod \"cinder-db-sync-bq87j\" (UID: \"9418aedd-84eb-4ff9-89f8-831695e5471e\") " pod="openstack/cinder-db-sync-bq87j" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.969978 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4xvv\" (UniqueName: \"kubernetes.io/projected/9418aedd-84eb-4ff9-89f8-831695e5471e-kube-api-access-h4xvv\") pod \"cinder-db-sync-bq87j\" (UID: \"9418aedd-84eb-4ff9-89f8-831695e5471e\") " pod="openstack/cinder-db-sync-bq87j" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.972472 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d8cf953d-c4c9-457e-956c-d2942b56499b","Type":"ContainerStarted","Data":"ae0f405f0b88b1d04190824b24da91a40aa4b40dfe4d3036d7f2a9df85d74b16"} Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.972517 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d8cf953d-c4c9-457e-956c-d2942b56499b","Type":"ContainerStarted","Data":"1a30b47a7a84ab36668318e378b603f6930f8ee2c0261f3b06a6701053f9d5ea"} Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.972526 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d8cf953d-c4c9-457e-956c-d2942b56499b","Type":"ContainerStarted","Data":"7b9aa93f3b288e381acce304748105196ddd6a110319c85f073897eefc722899"} Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.979194 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-m7kpn" Dec 02 09:43:44 crc kubenswrapper[4781]: I1202 09:43:44.993606 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 09:43:45 crc kubenswrapper[4781]: I1202 09:43:45.024477 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39e827e8-c246-4099-8dd5-edede022aa47-config\") pod \"dnsmasq-dns-68dcc9cf6f-6cdh4\" (UID: \"39e827e8-c246-4099-8dd5-edede022aa47\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-6cdh4" Dec 02 09:43:45 crc kubenswrapper[4781]: I1202 09:43:45.024533 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e769104b-928b-4610-851f-edf65d3071be-scripts\") pod \"horizon-78d5dd47fc-gnfbl\" (UID: \"e769104b-928b-4610-851f-edf65d3071be\") " pod="openstack/horizon-78d5dd47fc-gnfbl" Dec 02 09:43:45 crc kubenswrapper[4781]: I1202 09:43:45.024568 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39e827e8-c246-4099-8dd5-edede022aa47-ovsdbserver-nb\") pod \"dnsmasq-dns-68dcc9cf6f-6cdh4\" (UID: \"39e827e8-c246-4099-8dd5-edede022aa47\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-6cdh4" Dec 02 09:43:45 crc kubenswrapper[4781]: I1202 09:43:45.024612 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq9zt\" (UniqueName: \"kubernetes.io/projected/51f721e2-e8fd-4ae1-89d9-fe7272e8246e-kube-api-access-vq9zt\") pod \"barbican-db-sync-br5j5\" (UID: \"51f721e2-e8fd-4ae1-89d9-fe7272e8246e\") " pod="openstack/barbican-db-sync-br5j5" Dec 02 09:43:45 crc kubenswrapper[4781]: I1202 09:43:45.024651 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr88t\" (UniqueName: \"kubernetes.io/projected/e769104b-928b-4610-851f-edf65d3071be-kube-api-access-vr88t\") pod \"horizon-78d5dd47fc-gnfbl\" (UID: \"e769104b-928b-4610-851f-edf65d3071be\") " pod="openstack/horizon-78d5dd47fc-gnfbl" Dec 02 09:43:45 crc kubenswrapper[4781]: I1202 09:43:45.024789 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/51f721e2-e8fd-4ae1-89d9-fe7272e8246e-db-sync-config-data\") pod \"barbican-db-sync-br5j5\" (UID: \"51f721e2-e8fd-4ae1-89d9-fe7272e8246e\") " pod="openstack/barbican-db-sync-br5j5" Dec 02 09:43:45 crc kubenswrapper[4781]: I1202 09:43:45.024880 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39e827e8-c246-4099-8dd5-edede022aa47-ovsdbserver-sb\") pod \"dnsmasq-dns-68dcc9cf6f-6cdh4\" (UID: \"39e827e8-c246-4099-8dd5-edede022aa47\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-6cdh4" Dec 02 09:43:45 crc kubenswrapper[4781]: I1202 09:43:45.025014 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e769104b-928b-4610-851f-edf65d3071be-config-data\") pod \"horizon-78d5dd47fc-gnfbl\" (UID: \"e769104b-928b-4610-851f-edf65d3071be\") " pod="openstack/horizon-78d5dd47fc-gnfbl" Dec 02 09:43:45 crc kubenswrapper[4781]: I1202 09:43:45.025046 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-2rq2b\" (UniqueName: \"kubernetes.io/projected/39e827e8-c246-4099-8dd5-edede022aa47-kube-api-access-2rq2b\") pod \"dnsmasq-dns-68dcc9cf6f-6cdh4\" (UID: \"39e827e8-c246-4099-8dd5-edede022aa47\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-6cdh4" Dec 02 09:43:45 crc kubenswrapper[4781]: I1202 09:43:45.025068 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39e827e8-c246-4099-8dd5-edede022aa47-dns-svc\") pod \"dnsmasq-dns-68dcc9cf6f-6cdh4\" (UID: \"39e827e8-c246-4099-8dd5-edede022aa47\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-6cdh4" Dec 02 09:43:45 crc kubenswrapper[4781]: I1202 09:43:45.025090 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51f721e2-e8fd-4ae1-89d9-fe7272e8246e-combined-ca-bundle\") pod \"barbican-db-sync-br5j5\" (UID: \"51f721e2-e8fd-4ae1-89d9-fe7272e8246e\") " pod="openstack/barbican-db-sync-br5j5" Dec 02 09:43:45 crc kubenswrapper[4781]: I1202 09:43:45.025149 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e769104b-928b-4610-851f-edf65d3071be-horizon-secret-key\") pod \"horizon-78d5dd47fc-gnfbl\" (UID: \"e769104b-928b-4610-851f-edf65d3071be\") " pod="openstack/horizon-78d5dd47fc-gnfbl" Dec 02 09:43:45 crc kubenswrapper[4781]: I1202 09:43:45.025194 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e769104b-928b-4610-851f-edf65d3071be-logs\") pod \"horizon-78d5dd47fc-gnfbl\" (UID: \"e769104b-928b-4610-851f-edf65d3071be\") " pod="openstack/horizon-78d5dd47fc-gnfbl" Dec 02 09:43:45 crc kubenswrapper[4781]: I1202 09:43:45.026283 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39e827e8-c246-4099-8dd5-edede022aa47-config\") pod \"dnsmasq-dns-68dcc9cf6f-6cdh4\" (UID: \"39e827e8-c246-4099-8dd5-edede022aa47\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-6cdh4" Dec 02 09:43:45 crc kubenswrapper[4781]: I1202 09:43:45.026910 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39e827e8-c246-4099-8dd5-edede022aa47-ovsdbserver-nb\") pod \"dnsmasq-dns-68dcc9cf6f-6cdh4\" (UID: \"39e827e8-c246-4099-8dd5-edede022aa47\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-6cdh4" Dec 02 09:43:45 crc kubenswrapper[4781]: I1202 09:43:45.029785 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39e827e8-c246-4099-8dd5-edede022aa47-dns-svc\") pod \"dnsmasq-dns-68dcc9cf6f-6cdh4\" (UID: \"39e827e8-c246-4099-8dd5-edede022aa47\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-6cdh4" Dec 02 09:43:45 crc kubenswrapper[4781]: I1202 09:43:45.029824 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39e827e8-c246-4099-8dd5-edede022aa47-ovsdbserver-sb\") pod \"dnsmasq-dns-68dcc9cf6f-6cdh4\" (UID: \"39e827e8-c246-4099-8dd5-edede022aa47\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-6cdh4" Dec 02 09:43:45 crc kubenswrapper[4781]: I1202 09:43:45.057794 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rq2b\" (UniqueName: 
\"kubernetes.io/projected/39e827e8-c246-4099-8dd5-edede022aa47-kube-api-access-2rq2b\") pod \"dnsmasq-dns-68dcc9cf6f-6cdh4\" (UID: \"39e827e8-c246-4099-8dd5-edede022aa47\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-6cdh4" Dec 02 09:43:45 crc kubenswrapper[4781]: I1202 09:43:45.101115 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-bq87j" Dec 02 09:43:45 crc kubenswrapper[4781]: I1202 09:43:45.129331 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e769104b-928b-4610-851f-edf65d3071be-horizon-secret-key\") pod \"horizon-78d5dd47fc-gnfbl\" (UID: \"e769104b-928b-4610-851f-edf65d3071be\") " pod="openstack/horizon-78d5dd47fc-gnfbl" Dec 02 09:43:45 crc kubenswrapper[4781]: I1202 09:43:45.129378 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e769104b-928b-4610-851f-edf65d3071be-logs\") pod \"horizon-78d5dd47fc-gnfbl\" (UID: \"e769104b-928b-4610-851f-edf65d3071be\") " pod="openstack/horizon-78d5dd47fc-gnfbl" Dec 02 09:43:45 crc kubenswrapper[4781]: I1202 09:43:45.129417 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e769104b-928b-4610-851f-edf65d3071be-scripts\") pod \"horizon-78d5dd47fc-gnfbl\" (UID: \"e769104b-928b-4610-851f-edf65d3071be\") " pod="openstack/horizon-78d5dd47fc-gnfbl" Dec 02 09:43:45 crc kubenswrapper[4781]: I1202 09:43:45.129450 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq9zt\" (UniqueName: \"kubernetes.io/projected/51f721e2-e8fd-4ae1-89d9-fe7272e8246e-kube-api-access-vq9zt\") pod \"barbican-db-sync-br5j5\" (UID: \"51f721e2-e8fd-4ae1-89d9-fe7272e8246e\") " pod="openstack/barbican-db-sync-br5j5" Dec 02 09:43:45 crc kubenswrapper[4781]: I1202 09:43:45.129473 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vr88t\" (UniqueName: \"kubernetes.io/projected/e769104b-928b-4610-851f-edf65d3071be-kube-api-access-vr88t\") pod \"horizon-78d5dd47fc-gnfbl\" (UID: \"e769104b-928b-4610-851f-edf65d3071be\") " pod="openstack/horizon-78d5dd47fc-gnfbl" Dec 02 09:43:45 crc kubenswrapper[4781]: I1202 09:43:45.129524 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/51f721e2-e8fd-4ae1-89d9-fe7272e8246e-db-sync-config-data\") pod \"barbican-db-sync-br5j5\" (UID: \"51f721e2-e8fd-4ae1-89d9-fe7272e8246e\") " pod="openstack/barbican-db-sync-br5j5" Dec 02 09:43:45 crc kubenswrapper[4781]: I1202 09:43:45.129570 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e769104b-928b-4610-851f-edf65d3071be-config-data\") pod \"horizon-78d5dd47fc-gnfbl\" (UID: \"e769104b-928b-4610-851f-edf65d3071be\") " pod="openstack/horizon-78d5dd47fc-gnfbl" Dec 02 09:43:45 crc kubenswrapper[4781]: I1202 09:43:45.129589 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51f721e2-e8fd-4ae1-89d9-fe7272e8246e-combined-ca-bundle\") pod \"barbican-db-sync-br5j5\" (UID: \"51f721e2-e8fd-4ae1-89d9-fe7272e8246e\") " pod="openstack/barbican-db-sync-br5j5" Dec 02 09:43:45 crc kubenswrapper[4781]: I1202 09:43:45.136607 4781 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e769104b-928b-4610-851f-edf65d3071be-scripts\") pod \"horizon-78d5dd47fc-gnfbl\" (UID: \"e769104b-928b-4610-851f-edf65d3071be\") " pod="openstack/horizon-78d5dd47fc-gnfbl" Dec 02 09:43:45 crc kubenswrapper[4781]: I1202 09:43:45.136909 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e769104b-928b-4610-851f-edf65d3071be-config-data\") pod \"horizon-78d5dd47fc-gnfbl\" (UID: \"e769104b-928b-4610-851f-edf65d3071be\") " pod="openstack/horizon-78d5dd47fc-gnfbl" Dec 02 09:43:45 crc kubenswrapper[4781]: I1202 09:43:45.137157 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e769104b-928b-4610-851f-edf65d3071be-logs\") pod \"horizon-78d5dd47fc-gnfbl\" (UID: \"e769104b-928b-4610-851f-edf65d3071be\") " pod="openstack/horizon-78d5dd47fc-gnfbl" Dec 02 09:43:45 crc kubenswrapper[4781]: I1202 09:43:45.140720 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51f721e2-e8fd-4ae1-89d9-fe7272e8246e-combined-ca-bundle\") pod \"barbican-db-sync-br5j5\" (UID: \"51f721e2-e8fd-4ae1-89d9-fe7272e8246e\") " pod="openstack/barbican-db-sync-br5j5" Dec 02 09:43:45 crc kubenswrapper[4781]: I1202 09:43:45.144672 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/51f721e2-e8fd-4ae1-89d9-fe7272e8246e-db-sync-config-data\") pod \"barbican-db-sync-br5j5\" (UID: \"51f721e2-e8fd-4ae1-89d9-fe7272e8246e\") " pod="openstack/barbican-db-sync-br5j5" Dec 02 09:43:45 crc kubenswrapper[4781]: I1202 09:43:45.145465 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e769104b-928b-4610-851f-edf65d3071be-horizon-secret-key\") pod \"horizon-78d5dd47fc-gnfbl\" (UID: \"e769104b-928b-4610-851f-edf65d3071be\") " pod="openstack/horizon-78d5dd47fc-gnfbl" Dec 02 09:43:45 crc kubenswrapper[4781]: I1202 09:43:45.167703 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq9zt\" (UniqueName: \"kubernetes.io/projected/51f721e2-e8fd-4ae1-89d9-fe7272e8246e-kube-api-access-vq9zt\") pod \"barbican-db-sync-br5j5\" (UID: \"51f721e2-e8fd-4ae1-89d9-fe7272e8246e\") " pod="openstack/barbican-db-sync-br5j5" Dec 02 09:43:45 crc kubenswrapper[4781]: I1202 09:43:45.174389 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr88t\" (UniqueName: \"kubernetes.io/projected/e769104b-928b-4610-851f-edf65d3071be-kube-api-access-vr88t\") pod \"horizon-78d5dd47fc-gnfbl\" (UID: \"e769104b-928b-4610-851f-edf65d3071be\") " pod="openstack/horizon-78d5dd47fc-gnfbl" Dec 02 09:43:45 crc kubenswrapper[4781]: I1202 09:43:45.195306 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68dcc9cf6f-6cdh4" Dec 02 09:43:45 crc kubenswrapper[4781]: I1202 09:43:45.220249 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-br5j5" Dec 02 09:43:45 crc kubenswrapper[4781]: I1202 09:43:45.240261 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-hp26d"] Dec 02 09:43:45 crc kubenswrapper[4781]: I1202 09:43:45.257417 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-78d5dd47fc-gnfbl" Dec 02 09:43:45 crc kubenswrapper[4781]: I1202 09:43:45.638398 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-55c985789-sc66d"] Dec 02 09:43:45 crc kubenswrapper[4781]: I1202 09:43:45.656861 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-mfnct"] Dec 02 09:43:45 crc kubenswrapper[4781]: I1202 09:43:45.748279 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-s662m"] Dec 02 09:43:45 crc kubenswrapper[4781]: I1202 09:43:45.984462 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f877ddd87-s662m" event={"ID":"6e2bb3c1-1183-46ac-bb83-c552bfb7a874","Type":"ContainerStarted","Data":"99b10a647ae83815805e13fe56edcfa81c8612be5319193b3eed84af8ef26300"} Dec 02 09:43:46 crc kubenswrapper[4781]: I1202 09:43:46.001781 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d8cf953d-c4c9-457e-956c-d2942b56499b","Type":"ContainerStarted","Data":"c6551012c41d77400a1fc59b81bd00e2d96ff9cb421a17221348fb39f783f0a4"} Dec 02 09:43:46 crc kubenswrapper[4781]: I1202 09:43:46.004671 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-mfnct" event={"ID":"c8e968b8-6eb0-41d7-beed-8b2bf7006359","Type":"ContainerStarted","Data":"450668e11f874cde123e0ecc7db9220f415c2f862de9d544964ac18f52537166"} Dec 02 09:43:46 crc kubenswrapper[4781]: I1202 09:43:46.005816 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55c985789-sc66d" event={"ID":"8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151","Type":"ContainerStarted","Data":"f5636b38736faf68c58b387d6d592b7559cb718e71dd665f2518f235c553b6ac"} Dec 02 09:43:46 crc kubenswrapper[4781]: I1202 09:43:46.007082 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hp26d" event={"ID":"ce4667b5-2519-4d80-9357-c54ce473d4ec","Type":"ContainerStarted","Data":"a4eec7dea74df0df1e8324cabc97265a7aa624283371aeb6a5533d9215c73d98"} Dec 02 09:43:46 crc kubenswrapper[4781]: I1202 09:43:46.007115 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hp26d" event={"ID":"ce4667b5-2519-4d80-9357-c54ce473d4ec","Type":"ContainerStarted","Data":"8eaeb73315413bb5d6b1e5c8dff70dc437b544f1ed33144465d27a3728e0044c"} Dec 02 09:43:46 crc kubenswrapper[4781]: I1202 09:43:46.028744 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-m7kpn"] Dec 02 09:43:46 crc kubenswrapper[4781]: I1202 09:43:46.049907 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=22.565539605 podStartE2EDuration="1m1.049885668s" podCreationTimestamp="2025-12-02 09:42:45 +0000 UTC" firstStartedPulling="2025-12-02 09:43:03.527652543 +0000 UTC m=+1346.351526422" lastFinishedPulling="2025-12-02 09:43:42.011998606 +0000 UTC m=+1384.835872485" observedRunningTime="2025-12-02 09:43:46.038937847 +0000 UTC m=+1388.862811726" watchObservedRunningTime="2025-12-02 09:43:46.049885668 +0000 UTC m=+1388.873759547" Dec 02 09:43:46 crc kubenswrapper[4781]: I1202 09:43:46.094976 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-hp26d" podStartSLOduration=2.094950357 podStartE2EDuration="2.094950357s" podCreationTimestamp="2025-12-02 09:43:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:43:46.079736582 +0000 UTC m=+1388.903610461" watchObservedRunningTime="2025-12-02 09:43:46.094950357 +0000 UTC m=+1388.918824246" Dec 02 09:43:46 crc kubenswrapper[4781]: I1202 09:43:46.248644 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-6cdh4"] Dec 02 09:43:46 crc kubenswrapper[4781]: I1202 09:43:46.277674 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 09:43:46 crc kubenswrapper[4781]: I1202 09:43:46.287669 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-bq87j"] Dec 02 09:43:46 crc kubenswrapper[4781]: I1202 09:43:46.425088 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-6cdh4"] Dec 02 09:43:46 crc kubenswrapper[4781]: I1202 09:43:46.442396 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-n7z2z"] Dec 02 09:43:46 crc kubenswrapper[4781]: I1202 09:43:46.443942 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-n7z2z" Dec 02 09:43:46 crc kubenswrapper[4781]: I1202 09:43:46.447331 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 02 09:43:46 crc kubenswrapper[4781]: I1202 09:43:46.496974 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-n7z2z"] Dec 02 09:43:46 crc kubenswrapper[4781]: I1202 09:43:46.529419 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-br5j5"] Dec 02 09:43:46 crc kubenswrapper[4781]: W1202 09:43:46.531168 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51f721e2_e8fd_4ae1_89d9_fe7272e8246e.slice/crio-a15abac7b995faeb5523e72dc55c2a15e6a85dcbb152674154c9d0429a767f15 WatchSource:0}: Error finding container a15abac7b995faeb5523e72dc55c2a15e6a85dcbb152674154c9d0429a767f15: Status 404 returned error can't find the container with id a15abac7b995faeb5523e72dc55c2a15e6a85dcbb152674154c9d0429a767f15 Dec 02 09:43:46 crc kubenswrapper[4781]: I1202 09:43:46.557549 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-78d5dd47fc-gnfbl"] Dec 02 09:43:46 crc kubenswrapper[4781]: I1202 09:43:46.584973 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3fc9b2b-26bf-4f7d-b052-c976f8584c43-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-n7z2z\" (UID: \"a3fc9b2b-26bf-4f7d-b052-c976f8584c43\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-n7z2z" Dec 02 09:43:46 crc kubenswrapper[4781]: I1202 09:43:46.585082 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpt2f\" (UniqueName: \"kubernetes.io/projected/a3fc9b2b-26bf-4f7d-b052-c976f8584c43-kube-api-access-dpt2f\") pod \"dnsmasq-dns-58dd9ff6bc-n7z2z\" (UID: \"a3fc9b2b-26bf-4f7d-b052-c976f8584c43\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-n7z2z" Dec 02 09:43:46 crc kubenswrapper[4781]: I1202 09:43:46.585122 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3fc9b2b-26bf-4f7d-b052-c976f8584c43-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-n7z2z\" (UID: 
\"a3fc9b2b-26bf-4f7d-b052-c976f8584c43\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-n7z2z" Dec 02 09:43:46 crc kubenswrapper[4781]: I1202 09:43:46.585149 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3fc9b2b-26bf-4f7d-b052-c976f8584c43-config\") pod \"dnsmasq-dns-58dd9ff6bc-n7z2z\" (UID: \"a3fc9b2b-26bf-4f7d-b052-c976f8584c43\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-n7z2z" Dec 02 09:43:46 crc kubenswrapper[4781]: I1202 09:43:46.585205 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a3fc9b2b-26bf-4f7d-b052-c976f8584c43-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-n7z2z\" (UID: \"a3fc9b2b-26bf-4f7d-b052-c976f8584c43\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-n7z2z" Dec 02 09:43:46 crc kubenswrapper[4781]: I1202 09:43:46.585247 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3fc9b2b-26bf-4f7d-b052-c976f8584c43-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-n7z2z\" (UID: \"a3fc9b2b-26bf-4f7d-b052-c976f8584c43\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-n7z2z" Dec 02 09:43:46 crc kubenswrapper[4781]: I1202 09:43:46.686782 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3fc9b2b-26bf-4f7d-b052-c976f8584c43-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-n7z2z\" (UID: \"a3fc9b2b-26bf-4f7d-b052-c976f8584c43\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-n7z2z" Dec 02 09:43:46 crc kubenswrapper[4781]: I1202 09:43:46.686894 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpt2f\" (UniqueName: \"kubernetes.io/projected/a3fc9b2b-26bf-4f7d-b052-c976f8584c43-kube-api-access-dpt2f\") pod \"dnsmasq-dns-58dd9ff6bc-n7z2z\" (UID: \"a3fc9b2b-26bf-4f7d-b052-c976f8584c43\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-n7z2z" Dec 02 09:43:46 crc kubenswrapper[4781]: I1202 09:43:46.687045 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3fc9b2b-26bf-4f7d-b052-c976f8584c43-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-n7z2z\" (UID: \"a3fc9b2b-26bf-4f7d-b052-c976f8584c43\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-n7z2z" Dec 02 09:43:46 crc kubenswrapper[4781]: I1202 09:43:46.687086 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3fc9b2b-26bf-4f7d-b052-c976f8584c43-config\") pod \"dnsmasq-dns-58dd9ff6bc-n7z2z\" (UID: \"a3fc9b2b-26bf-4f7d-b052-c976f8584c43\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-n7z2z" Dec 02 09:43:46 crc kubenswrapper[4781]: I1202 09:43:46.687160 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a3fc9b2b-26bf-4f7d-b052-c976f8584c43-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-n7z2z\" (UID: \"a3fc9b2b-26bf-4f7d-b052-c976f8584c43\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-n7z2z" Dec 02 09:43:46 crc kubenswrapper[4781]: I1202 09:43:46.687196 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3fc9b2b-26bf-4f7d-b052-c976f8584c43-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-n7z2z\" (UID: 
\"a3fc9b2b-26bf-4f7d-b052-c976f8584c43\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-n7z2z" Dec 02 09:43:46 crc kubenswrapper[4781]: I1202 09:43:46.688372 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3fc9b2b-26bf-4f7d-b052-c976f8584c43-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-n7z2z\" (UID: \"a3fc9b2b-26bf-4f7d-b052-c976f8584c43\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-n7z2z" Dec 02 09:43:46 crc kubenswrapper[4781]: I1202 09:43:46.689354 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3fc9b2b-26bf-4f7d-b052-c976f8584c43-config\") pod \"dnsmasq-dns-58dd9ff6bc-n7z2z\" (UID: \"a3fc9b2b-26bf-4f7d-b052-c976f8584c43\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-n7z2z" Dec 02 09:43:46 crc kubenswrapper[4781]: I1202 09:43:46.689897 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3fc9b2b-26bf-4f7d-b052-c976f8584c43-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-n7z2z\" (UID: \"a3fc9b2b-26bf-4f7d-b052-c976f8584c43\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-n7z2z" Dec 02 09:43:46 crc kubenswrapper[4781]: I1202 09:43:46.690825 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a3fc9b2b-26bf-4f7d-b052-c976f8584c43-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-n7z2z\" (UID: \"a3fc9b2b-26bf-4f7d-b052-c976f8584c43\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-n7z2z" Dec 02 09:43:46 crc kubenswrapper[4781]: I1202 09:43:46.690878 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3fc9b2b-26bf-4f7d-b052-c976f8584c43-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-n7z2z\" (UID: \"a3fc9b2b-26bf-4f7d-b052-c976f8584c43\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-n7z2z" Dec 02 09:43:46 crc kubenswrapper[4781]: I1202 09:43:46.706834 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpt2f\" (UniqueName: \"kubernetes.io/projected/a3fc9b2b-26bf-4f7d-b052-c976f8584c43-kube-api-access-dpt2f\") pod \"dnsmasq-dns-58dd9ff6bc-n7z2z\" (UID: \"a3fc9b2b-26bf-4f7d-b052-c976f8584c43\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-n7z2z" Dec 02 09:43:46 crc kubenswrapper[4781]: I1202 09:43:46.822947 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-n7z2z" Dec 02 09:43:46 crc kubenswrapper[4781]: I1202 09:43:46.950431 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-55c985789-sc66d"] Dec 02 09:43:47 crc kubenswrapper[4781]: I1202 09:43:47.029770 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6c9cdcd7b5-xrb7p"] Dec 02 09:43:47 crc kubenswrapper[4781]: I1202 09:43:47.035694 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6c9cdcd7b5-xrb7p" Dec 02 09:43:47 crc kubenswrapper[4781]: I1202 09:43:47.067030 4781 generic.go:334] "Generic (PLEG): container finished" podID="6e2bb3c1-1183-46ac-bb83-c552bfb7a874" containerID="21c37a3671354db0b04fa96b32af39e649a28606029a1852971bad9f56553e92" exitCode=0 Dec 02 09:43:47 crc kubenswrapper[4781]: I1202 09:43:47.067147 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f877ddd87-s662m" event={"ID":"6e2bb3c1-1183-46ac-bb83-c552bfb7a874","Type":"ContainerDied","Data":"21c37a3671354db0b04fa96b32af39e649a28606029a1852971bad9f56553e92"} Dec 02 09:43:47 crc kubenswrapper[4781]: I1202 09:43:47.068981 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6c9cdcd7b5-xrb7p"] Dec 02 09:43:47 crc kubenswrapper[4781]: I1202 09:43:47.095604 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6914f46f-a737-4507-9293-c490d4249515-config-data\") pod \"horizon-6c9cdcd7b5-xrb7p\" (UID: \"6914f46f-a737-4507-9293-c490d4249515\") " pod="openstack/horizon-6c9cdcd7b5-xrb7p" Dec 02 09:43:47 crc kubenswrapper[4781]: I1202 09:43:47.095671 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cctf6\" (UniqueName: \"kubernetes.io/projected/6914f46f-a737-4507-9293-c490d4249515-kube-api-access-cctf6\") pod \"horizon-6c9cdcd7b5-xrb7p\" (UID: \"6914f46f-a737-4507-9293-c490d4249515\") " pod="openstack/horizon-6c9cdcd7b5-xrb7p" Dec 02 09:43:47 crc kubenswrapper[4781]: I1202 09:43:47.095726 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6914f46f-a737-4507-9293-c490d4249515-logs\") pod \"horizon-6c9cdcd7b5-xrb7p\" (UID: \"6914f46f-a737-4507-9293-c490d4249515\") " pod="openstack/horizon-6c9cdcd7b5-xrb7p" Dec 02 09:43:47 crc kubenswrapper[4781]: I1202 09:43:47.095770 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6914f46f-a737-4507-9293-c490d4249515-horizon-secret-key\") pod \"horizon-6c9cdcd7b5-xrb7p\" (UID: \"6914f46f-a737-4507-9293-c490d4249515\") " pod="openstack/horizon-6c9cdcd7b5-xrb7p" Dec 02 09:43:47 crc kubenswrapper[4781]: I1202 09:43:47.095793 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6914f46f-a737-4507-9293-c490d4249515-scripts\") pod \"horizon-6c9cdcd7b5-xrb7p\" (UID: \"6914f46f-a737-4507-9293-c490d4249515\") " pod="openstack/horizon-6c9cdcd7b5-xrb7p" Dec 02 09:43:47 crc kubenswrapper[4781]: I1202 09:43:47.105057 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-m7kpn" event={"ID":"af2e4c8d-f431-487f-8b70-a2b6e6ee6000","Type":"ContainerStarted","Data":"ad2f700b34454b5aef5f81ba32cbc70d4ba40548ab70bb0da5e5c6f8388bfdbd"} Dec 02 09:43:47 crc kubenswrapper[4781]: I1202 09:43:47.136319 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-mfnct" event={"ID":"c8e968b8-6eb0-41d7-beed-8b2bf7006359","Type":"ContainerStarted","Data":"8bafab971d14d7d06b2d33414c80695d46381f1033247ba6a70d9c9f4e3f08d9"} Dec 02 09:43:47 crc kubenswrapper[4781]: I1202 09:43:47.187220 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 09:43:47 
crc kubenswrapper[4781]: I1202 09:43:47.220350 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-br5j5" event={"ID":"51f721e2-e8fd-4ae1-89d9-fe7272e8246e","Type":"ContainerStarted","Data":"a15abac7b995faeb5523e72dc55c2a15e6a85dcbb152674154c9d0429a767f15"} Dec 02 09:43:47 crc kubenswrapper[4781]: I1202 09:43:47.227302 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6914f46f-a737-4507-9293-c490d4249515-logs\") pod \"horizon-6c9cdcd7b5-xrb7p\" (UID: \"6914f46f-a737-4507-9293-c490d4249515\") " pod="openstack/horizon-6c9cdcd7b5-xrb7p" Dec 02 09:43:47 crc kubenswrapper[4781]: I1202 09:43:47.227469 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6914f46f-a737-4507-9293-c490d4249515-horizon-secret-key\") pod \"horizon-6c9cdcd7b5-xrb7p\" (UID: \"6914f46f-a737-4507-9293-c490d4249515\") " pod="openstack/horizon-6c9cdcd7b5-xrb7p" Dec 02 09:43:47 crc kubenswrapper[4781]: I1202 09:43:47.227525 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6914f46f-a737-4507-9293-c490d4249515-scripts\") pod \"horizon-6c9cdcd7b5-xrb7p\" (UID: \"6914f46f-a737-4507-9293-c490d4249515\") " pod="openstack/horizon-6c9cdcd7b5-xrb7p" Dec 02 09:43:47 crc kubenswrapper[4781]: I1202 09:43:47.227607 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6914f46f-a737-4507-9293-c490d4249515-config-data\") pod \"horizon-6c9cdcd7b5-xrb7p\" (UID: \"6914f46f-a737-4507-9293-c490d4249515\") " pod="openstack/horizon-6c9cdcd7b5-xrb7p" Dec 02 09:43:47 crc kubenswrapper[4781]: I1202 09:43:47.227756 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6914f46f-a737-4507-9293-c490d4249515-logs\") pod \"horizon-6c9cdcd7b5-xrb7p\" (UID: \"6914f46f-a737-4507-9293-c490d4249515\") " pod="openstack/horizon-6c9cdcd7b5-xrb7p" Dec 02 09:43:47 crc kubenswrapper[4781]: I1202 09:43:47.228376 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cctf6\" (UniqueName: \"kubernetes.io/projected/6914f46f-a737-4507-9293-c490d4249515-kube-api-access-cctf6\") pod \"horizon-6c9cdcd7b5-xrb7p\" (UID: \"6914f46f-a737-4507-9293-c490d4249515\") " pod="openstack/horizon-6c9cdcd7b5-xrb7p" Dec 02 09:43:47 crc kubenswrapper[4781]: I1202 09:43:47.232019 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6914f46f-a737-4507-9293-c490d4249515-config-data\") pod \"horizon-6c9cdcd7b5-xrb7p\" (UID: \"6914f46f-a737-4507-9293-c490d4249515\") " pod="openstack/horizon-6c9cdcd7b5-xrb7p" Dec 02 09:43:47 crc kubenswrapper[4781]: I1202 09:43:47.232715 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6914f46f-a737-4507-9293-c490d4249515-scripts\") pod \"horizon-6c9cdcd7b5-xrb7p\" (UID: \"6914f46f-a737-4507-9293-c490d4249515\") " pod="openstack/horizon-6c9cdcd7b5-xrb7p" Dec 02 09:43:47 crc kubenswrapper[4781]: I1202 09:43:47.240882 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-78d5dd47fc-gnfbl" 
event={"ID":"e769104b-928b-4610-851f-edf65d3071be","Type":"ContainerStarted","Data":"e360649384178cdfefe677143ff2e32122a2085c16e2cfc87b2ba4a6618c7918"} Dec 02 09:43:47 crc kubenswrapper[4781]: I1202 09:43:47.259598 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-bq87j" event={"ID":"9418aedd-84eb-4ff9-89f8-831695e5471e","Type":"ContainerStarted","Data":"73cfe805e9a33f292e6d11a608d52bd590a3b4abc1f956f129f0e2fce57f5403"} Dec 02 09:43:47 crc kubenswrapper[4781]: I1202 09:43:47.297718 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6914f46f-a737-4507-9293-c490d4249515-horizon-secret-key\") pod \"horizon-6c9cdcd7b5-xrb7p\" (UID: \"6914f46f-a737-4507-9293-c490d4249515\") " pod="openstack/horizon-6c9cdcd7b5-xrb7p" Dec 02 09:43:47 crc kubenswrapper[4781]: I1202 09:43:47.373848 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cctf6\" (UniqueName: \"kubernetes.io/projected/6914f46f-a737-4507-9293-c490d4249515-kube-api-access-cctf6\") pod \"horizon-6c9cdcd7b5-xrb7p\" (UID: \"6914f46f-a737-4507-9293-c490d4249515\") " pod="openstack/horizon-6c9cdcd7b5-xrb7p" Dec 02 09:43:47 crc kubenswrapper[4781]: I1202 09:43:47.390559 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68dcc9cf6f-6cdh4" event={"ID":"39e827e8-c246-4099-8dd5-edede022aa47","Type":"ContainerStarted","Data":"a4ebecec2610040d7e961c7cc14386aed5ba565a8260378e39e997c25d60d188"} Dec 02 09:43:47 crc kubenswrapper[4781]: I1202 09:43:47.394333 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c98b4bc6-c086-4663-b794-92a36b0da2ba","Type":"ContainerStarted","Data":"44e12cdd3a8c0000cc4a26e70745048b4b823c0c4c8169ad69d09a356dc909a9"} Dec 02 09:43:47 crc kubenswrapper[4781]: I1202 09:43:47.402511 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6c9cdcd7b5-xrb7p" Dec 02 09:43:47 crc kubenswrapper[4781]: I1202 09:43:47.611812 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-mfnct" podStartSLOduration=3.611777132 podStartE2EDuration="3.611777132s" podCreationTimestamp="2025-12-02 09:43:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:43:47.245663584 +0000 UTC m=+1390.069537483" watchObservedRunningTime="2025-12-02 09:43:47.611777132 +0000 UTC m=+1390.435651011" Dec 02 09:43:47 crc kubenswrapper[4781]: I1202 09:43:47.988173 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f877ddd87-s662m" Dec 02 09:43:48 crc kubenswrapper[4781]: W1202 09:43:48.065166 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3fc9b2b_26bf_4f7d_b052_c976f8584c43.slice/crio-ebd4a9eae8deabbc1b5a96e0bc8a7d930c0ecdc362d3990bddbbb8bc009626ff WatchSource:0}: Error finding container ebd4a9eae8deabbc1b5a96e0bc8a7d930c0ecdc362d3990bddbbb8bc009626ff: Status 404 returned error can't find the container with id ebd4a9eae8deabbc1b5a96e0bc8a7d930c0ecdc362d3990bddbbb8bc009626ff Dec 02 09:43:48 crc kubenswrapper[4781]: I1202 09:43:48.066824 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-n7z2z"] Dec 02 09:43:48 crc kubenswrapper[4781]: I1202 09:43:48.098015 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e2bb3c1-1183-46ac-bb83-c552bfb7a874-dns-svc\") pod \"6e2bb3c1-1183-46ac-bb83-c552bfb7a874\" (UID: \"6e2bb3c1-1183-46ac-bb83-c552bfb7a874\") " Dec 02 09:43:48 crc kubenswrapper[4781]: I1202 09:43:48.098491 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e2bb3c1-1183-46ac-bb83-c552bfb7a874-config\") pod \"6e2bb3c1-1183-46ac-bb83-c552bfb7a874\" (UID: \"6e2bb3c1-1183-46ac-bb83-c552bfb7a874\") " Dec 02 09:43:48 crc kubenswrapper[4781]: I1202 09:43:48.098519 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e2bb3c1-1183-46ac-bb83-c552bfb7a874-ovsdbserver-sb\") pod \"6e2bb3c1-1183-46ac-bb83-c552bfb7a874\" (UID: \"6e2bb3c1-1183-46ac-bb83-c552bfb7a874\") " Dec 02 09:43:48 crc kubenswrapper[4781]: I1202 09:43:48.098597 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e2bb3c1-1183-46ac-bb83-c552bfb7a874-ovsdbserver-nb\") pod \"6e2bb3c1-1183-46ac-bb83-c552bfb7a874\" (UID: \"6e2bb3c1-1183-46ac-bb83-c552bfb7a874\") " Dec 02 09:43:48 crc kubenswrapper[4781]: I1202 09:43:48.098643 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9l85b\" (UniqueName: \"kubernetes.io/projected/6e2bb3c1-1183-46ac-bb83-c552bfb7a874-kube-api-access-9l85b\") pod \"6e2bb3c1-1183-46ac-bb83-c552bfb7a874\" (UID: \"6e2bb3c1-1183-46ac-bb83-c552bfb7a874\") " Dec 02 09:43:48 crc kubenswrapper[4781]: I1202 09:43:48.107741 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e2bb3c1-1183-46ac-bb83-c552bfb7a874-kube-api-access-9l85b" (OuterVolumeSpecName: "kube-api-access-9l85b") pod "6e2bb3c1-1183-46ac-bb83-c552bfb7a874" (UID: "6e2bb3c1-1183-46ac-bb83-c552bfb7a874"). InnerVolumeSpecName "kube-api-access-9l85b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:43:48 crc kubenswrapper[4781]: I1202 09:43:48.133717 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e2bb3c1-1183-46ac-bb83-c552bfb7a874-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6e2bb3c1-1183-46ac-bb83-c552bfb7a874" (UID: "6e2bb3c1-1183-46ac-bb83-c552bfb7a874"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:43:48 crc kubenswrapper[4781]: I1202 09:43:48.141283 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e2bb3c1-1183-46ac-bb83-c552bfb7a874-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6e2bb3c1-1183-46ac-bb83-c552bfb7a874" (UID: "6e2bb3c1-1183-46ac-bb83-c552bfb7a874"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:43:48 crc kubenswrapper[4781]: I1202 09:43:48.173709 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e2bb3c1-1183-46ac-bb83-c552bfb7a874-config" (OuterVolumeSpecName: "config") pod "6e2bb3c1-1183-46ac-bb83-c552bfb7a874" (UID: "6e2bb3c1-1183-46ac-bb83-c552bfb7a874"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:43:48 crc kubenswrapper[4781]: I1202 09:43:48.177024 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e2bb3c1-1183-46ac-bb83-c552bfb7a874-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6e2bb3c1-1183-46ac-bb83-c552bfb7a874" (UID: "6e2bb3c1-1183-46ac-bb83-c552bfb7a874"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:43:48 crc kubenswrapper[4781]: I1202 09:43:48.200728 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e2bb3c1-1183-46ac-bb83-c552bfb7a874-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:43:48 crc kubenswrapper[4781]: I1202 09:43:48.200775 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e2bb3c1-1183-46ac-bb83-c552bfb7a874-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 09:43:48 crc kubenswrapper[4781]: I1202 09:43:48.200789 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e2bb3c1-1183-46ac-bb83-c552bfb7a874-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 09:43:48 crc kubenswrapper[4781]: I1202 09:43:48.200806 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9l85b\" (UniqueName: \"kubernetes.io/projected/6e2bb3c1-1183-46ac-bb83-c552bfb7a874-kube-api-access-9l85b\") on node \"crc\" DevicePath \"\"" Dec 02 09:43:48 crc kubenswrapper[4781]: I1202 09:43:48.200820 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e2bb3c1-1183-46ac-bb83-c552bfb7a874-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 09:43:48 crc kubenswrapper[4781]: I1202 09:43:48.314521 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6c9cdcd7b5-xrb7p"] Dec 02 09:43:48 crc kubenswrapper[4781]: I1202 09:43:48.453788 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f877ddd87-s662m" event={"ID":"6e2bb3c1-1183-46ac-bb83-c552bfb7a874","Type":"ContainerDied","Data":"99b10a647ae83815805e13fe56edcfa81c8612be5319193b3eed84af8ef26300"} Dec 02 09:43:48 crc kubenswrapper[4781]: I1202 09:43:48.453837 4781 scope.go:117] "RemoveContainer" containerID="21c37a3671354db0b04fa96b32af39e649a28606029a1852971bad9f56553e92" Dec 02 09:43:48 crc kubenswrapper[4781]: I1202 09:43:48.453801 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f877ddd87-s662m" Dec 02 09:43:48 crc kubenswrapper[4781]: I1202 09:43:48.459551 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-n7z2z" event={"ID":"a3fc9b2b-26bf-4f7d-b052-c976f8584c43","Type":"ContainerStarted","Data":"ebd4a9eae8deabbc1b5a96e0bc8a7d930c0ecdc362d3990bddbbb8bc009626ff"} Dec 02 09:43:48 crc kubenswrapper[4781]: I1202 09:43:48.462317 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c9cdcd7b5-xrb7p" event={"ID":"6914f46f-a737-4507-9293-c490d4249515","Type":"ContainerStarted","Data":"4378cbb55460cd97c293658f3e9bd4d40bdf78d7230054b47c52e83e4edbb6cb"} Dec 02 09:43:48 crc kubenswrapper[4781]: I1202 09:43:48.475293 4781 generic.go:334] "Generic (PLEG): container finished" podID="39e827e8-c246-4099-8dd5-edede022aa47" containerID="7aea31a46d1d0f5eb5456795bda60769abd61bd8940e9855588119c694b5c338" exitCode=0 Dec 02 09:43:48 crc kubenswrapper[4781]: I1202 09:43:48.476646 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68dcc9cf6f-6cdh4" event={"ID":"39e827e8-c246-4099-8dd5-edede022aa47","Type":"ContainerDied","Data":"7aea31a46d1d0f5eb5456795bda60769abd61bd8940e9855588119c694b5c338"} Dec 02 09:43:48 crc kubenswrapper[4781]: I1202 09:43:48.578432 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-s662m"] Dec 02 09:43:48 crc kubenswrapper[4781]: I1202 09:43:48.591267 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-s662m"] Dec 02 09:43:48 crc kubenswrapper[4781]: I1202 09:43:48.950648 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68dcc9cf6f-6cdh4" Dec 02 09:43:49 crc kubenswrapper[4781]: I1202 09:43:49.022275 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39e827e8-c246-4099-8dd5-edede022aa47-ovsdbserver-sb\") pod \"39e827e8-c246-4099-8dd5-edede022aa47\" (UID: \"39e827e8-c246-4099-8dd5-edede022aa47\") " Dec 02 09:43:49 crc kubenswrapper[4781]: I1202 09:43:49.022407 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39e827e8-c246-4099-8dd5-edede022aa47-config\") pod \"39e827e8-c246-4099-8dd5-edede022aa47\" (UID: \"39e827e8-c246-4099-8dd5-edede022aa47\") " Dec 02 09:43:49 crc kubenswrapper[4781]: I1202 09:43:49.022450 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39e827e8-c246-4099-8dd5-edede022aa47-ovsdbserver-nb\") pod \"39e827e8-c246-4099-8dd5-edede022aa47\" (UID: \"39e827e8-c246-4099-8dd5-edede022aa47\") " Dec 02 09:43:49 crc kubenswrapper[4781]: I1202 09:43:49.022582 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39e827e8-c246-4099-8dd5-edede022aa47-dns-svc\") pod \"39e827e8-c246-4099-8dd5-edede022aa47\" (UID: \"39e827e8-c246-4099-8dd5-edede022aa47\") " Dec 02 09:43:49 crc kubenswrapper[4781]: I1202 09:43:49.022666 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rq2b\" (UniqueName: \"kubernetes.io/projected/39e827e8-c246-4099-8dd5-edede022aa47-kube-api-access-2rq2b\") pod \"39e827e8-c246-4099-8dd5-edede022aa47\" (UID: \"39e827e8-c246-4099-8dd5-edede022aa47\") " Dec 02 09:43:49 crc 
kubenswrapper[4781]: I1202 09:43:49.039393 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39e827e8-c246-4099-8dd5-edede022aa47-kube-api-access-2rq2b" (OuterVolumeSpecName: "kube-api-access-2rq2b") pod "39e827e8-c246-4099-8dd5-edede022aa47" (UID: "39e827e8-c246-4099-8dd5-edede022aa47"). InnerVolumeSpecName "kube-api-access-2rq2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:43:49 crc kubenswrapper[4781]: I1202 09:43:49.067182 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39e827e8-c246-4099-8dd5-edede022aa47-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "39e827e8-c246-4099-8dd5-edede022aa47" (UID: "39e827e8-c246-4099-8dd5-edede022aa47"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:43:49 crc kubenswrapper[4781]: I1202 09:43:49.070247 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39e827e8-c246-4099-8dd5-edede022aa47-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "39e827e8-c246-4099-8dd5-edede022aa47" (UID: "39e827e8-c246-4099-8dd5-edede022aa47"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:43:49 crc kubenswrapper[4781]: I1202 09:43:49.083642 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39e827e8-c246-4099-8dd5-edede022aa47-config" (OuterVolumeSpecName: "config") pod "39e827e8-c246-4099-8dd5-edede022aa47" (UID: "39e827e8-c246-4099-8dd5-edede022aa47"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:43:49 crc kubenswrapper[4781]: I1202 09:43:49.089810 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39e827e8-c246-4099-8dd5-edede022aa47-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "39e827e8-c246-4099-8dd5-edede022aa47" (UID: "39e827e8-c246-4099-8dd5-edede022aa47"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:43:49 crc kubenswrapper[4781]: I1202 09:43:49.133608 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39e827e8-c246-4099-8dd5-edede022aa47-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 09:43:49 crc kubenswrapper[4781]: I1202 09:43:49.133649 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rq2b\" (UniqueName: \"kubernetes.io/projected/39e827e8-c246-4099-8dd5-edede022aa47-kube-api-access-2rq2b\") on node \"crc\" DevicePath \"\"" Dec 02 09:43:49 crc kubenswrapper[4781]: I1202 09:43:49.133670 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39e827e8-c246-4099-8dd5-edede022aa47-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 09:43:49 crc kubenswrapper[4781]: I1202 09:43:49.133681 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39e827e8-c246-4099-8dd5-edede022aa47-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 09:43:49 crc kubenswrapper[4781]: I1202 09:43:49.133693 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39e827e8-c246-4099-8dd5-edede022aa47-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:43:49 crc kubenswrapper[4781]: I1202 09:43:49.495293 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68dcc9cf6f-6cdh4" Dec 02 09:43:49 crc kubenswrapper[4781]: I1202 09:43:49.495377 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68dcc9cf6f-6cdh4" event={"ID":"39e827e8-c246-4099-8dd5-edede022aa47","Type":"ContainerDied","Data":"a4ebecec2610040d7e961c7cc14386aed5ba565a8260378e39e997c25d60d188"} Dec 02 09:43:49 crc kubenswrapper[4781]: I1202 09:43:49.495428 4781 scope.go:117] "RemoveContainer" containerID="7aea31a46d1d0f5eb5456795bda60769abd61bd8940e9855588119c694b5c338" Dec 02 09:43:49 crc kubenswrapper[4781]: I1202 09:43:49.510998 4781 generic.go:334] "Generic (PLEG): container finished" podID="a3fc9b2b-26bf-4f7d-b052-c976f8584c43" containerID="009eb4629af2ac941e9f4f81e6768ce1a094bb826a60e3fd927574406f7a67be" exitCode=0 Dec 02 09:43:49 crc kubenswrapper[4781]: I1202 09:43:49.571710 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e2bb3c1-1183-46ac-bb83-c552bfb7a874" path="/var/lib/kubelet/pods/6e2bb3c1-1183-46ac-bb83-c552bfb7a874/volumes" Dec 02 09:43:49 crc kubenswrapper[4781]: I1202 09:43:49.573298 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-n7z2z" event={"ID":"a3fc9b2b-26bf-4f7d-b052-c976f8584c43","Type":"ContainerDied","Data":"009eb4629af2ac941e9f4f81e6768ce1a094bb826a60e3fd927574406f7a67be"} Dec 02 09:43:49 crc kubenswrapper[4781]: I1202 09:43:49.614533 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-6cdh4"] Dec 02 09:43:49 crc kubenswrapper[4781]: I1202 09:43:49.644719 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-6cdh4"] Dec 02 09:43:50 crc kubenswrapper[4781]: I1202 09:43:50.583820 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-n7z2z" event={"ID":"a3fc9b2b-26bf-4f7d-b052-c976f8584c43","Type":"ContainerStarted","Data":"f8112c905f7061e6694008c35ca2b50d6689e96ec0f9b105c3091f7f17b92eac"} Dec 02 09:43:50 crc 
kubenswrapper[4781]: I1202 09:43:50.584152 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58dd9ff6bc-n7z2z" Dec 02 09:43:51 crc kubenswrapper[4781]: I1202 09:43:51.513215 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39e827e8-c246-4099-8dd5-edede022aa47" path="/var/lib/kubelet/pods/39e827e8-c246-4099-8dd5-edede022aa47/volumes" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.629069 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58dd9ff6bc-n7z2z" podStartSLOduration=7.629046543 podStartE2EDuration="7.629046543s" podCreationTimestamp="2025-12-02 09:43:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:43:50.620310875 +0000 UTC m=+1393.444184774" watchObservedRunningTime="2025-12-02 09:43:53.629046543 +0000 UTC m=+1396.452920422" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.637036 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-78d5dd47fc-gnfbl"] Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.661505 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-67cff9c6-hg2l6"] Dec 02 09:43:53 crc kubenswrapper[4781]: E1202 09:43:53.661905 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39e827e8-c246-4099-8dd5-edede022aa47" containerName="init" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.661948 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="39e827e8-c246-4099-8dd5-edede022aa47" containerName="init" Dec 02 09:43:53 crc kubenswrapper[4781]: E1202 09:43:53.661985 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e2bb3c1-1183-46ac-bb83-c552bfb7a874" containerName="init" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.661994 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e2bb3c1-1183-46ac-bb83-c552bfb7a874" containerName="init" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.662220 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e2bb3c1-1183-46ac-bb83-c552bfb7a874" containerName="init" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.662241 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="39e827e8-c246-4099-8dd5-edede022aa47" containerName="init" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.664100 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-67cff9c6-hg2l6" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.667547 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.683859 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-67cff9c6-hg2l6"] Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.724681 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6c9cdcd7b5-xrb7p"] Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.739627 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6856678494-4cprv"] Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.741450 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6856678494-4cprv" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.768089 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6856678494-4cprv"] Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.849060 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p44t6\" (UniqueName: \"kubernetes.io/projected/7dcfdf79-19b4-4ea4-9b16-67d79aa7165e-kube-api-access-p44t6\") pod \"horizon-67cff9c6-hg2l6\" (UID: \"7dcfdf79-19b4-4ea4-9b16-67d79aa7165e\") " pod="openstack/horizon-67cff9c6-hg2l6" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.849137 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dcfdf79-19b4-4ea4-9b16-67d79aa7165e-horizon-tls-certs\") pod \"horizon-67cff9c6-hg2l6\" (UID: \"7dcfdf79-19b4-4ea4-9b16-67d79aa7165e\") " pod="openstack/horizon-67cff9c6-hg2l6" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.849164 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6w6k2\" (UniqueName: \"kubernetes.io/projected/226317c7-a6f4-43c5-a3df-c9cb18b3afa5-kube-api-access-6w6k2\") pod \"horizon-6856678494-4cprv\" (UID: \"226317c7-a6f4-43c5-a3df-c9cb18b3afa5\") " pod="openstack/horizon-6856678494-4cprv" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.849756 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7dcfdf79-19b4-4ea4-9b16-67d79aa7165e-scripts\") pod \"horizon-67cff9c6-hg2l6\" (UID: \"7dcfdf79-19b4-4ea4-9b16-67d79aa7165e\") " pod="openstack/horizon-67cff9c6-hg2l6" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.849829 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/226317c7-a6f4-43c5-a3df-c9cb18b3afa5-config-data\") pod \"horizon-6856678494-4cprv\" (UID: \"226317c7-a6f4-43c5-a3df-c9cb18b3afa5\") " pod="openstack/horizon-6856678494-4cprv" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.849856 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/226317c7-a6f4-43c5-a3df-c9cb18b3afa5-logs\") pod \"horizon-6856678494-4cprv\" (UID: \"226317c7-a6f4-43c5-a3df-c9cb18b3afa5\") " pod="openstack/horizon-6856678494-4cprv" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.849991 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7dcfdf79-19b4-4ea4-9b16-67d79aa7165e-logs\") pod \"horizon-67cff9c6-hg2l6\" (UID: \"7dcfdf79-19b4-4ea4-9b16-67d79aa7165e\") " pod="openstack/horizon-67cff9c6-hg2l6" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.850037 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7dcfdf79-19b4-4ea4-9b16-67d79aa7165e-config-data\") pod \"horizon-67cff9c6-hg2l6\" (UID: \"7dcfdf79-19b4-4ea4-9b16-67d79aa7165e\") " pod="openstack/horizon-67cff9c6-hg2l6" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.850074 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dcfdf79-19b4-4ea4-9b16-67d79aa7165e-combined-ca-bundle\") pod \"horizon-67cff9c6-hg2l6\" (UID: \"7dcfdf79-19b4-4ea4-9b16-67d79aa7165e\") " pod="openstack/horizon-67cff9c6-hg2l6" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.850094 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/226317c7-a6f4-43c5-a3df-c9cb18b3afa5-scripts\") pod \"horizon-6856678494-4cprv\" (UID: \"226317c7-a6f4-43c5-a3df-c9cb18b3afa5\") " pod="openstack/horizon-6856678494-4cprv" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.850118 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/226317c7-a6f4-43c5-a3df-c9cb18b3afa5-horizon-tls-certs\") pod \"horizon-6856678494-4cprv\" (UID: \"226317c7-a6f4-43c5-a3df-c9cb18b3afa5\") " pod="openstack/horizon-6856678494-4cprv" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.850249 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7dcfdf79-19b4-4ea4-9b16-67d79aa7165e-horizon-secret-key\") pod \"horizon-67cff9c6-hg2l6\" (UID: \"7dcfdf79-19b4-4ea4-9b16-67d79aa7165e\") " pod="openstack/horizon-67cff9c6-hg2l6" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.850306 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/226317c7-a6f4-43c5-a3df-c9cb18b3afa5-combined-ca-bundle\") pod \"horizon-6856678494-4cprv\" (UID: \"226317c7-a6f4-43c5-a3df-c9cb18b3afa5\") " pod="openstack/horizon-6856678494-4cprv" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.850347 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/226317c7-a6f4-43c5-a3df-c9cb18b3afa5-horizon-secret-key\") pod \"horizon-6856678494-4cprv\" (UID: \"226317c7-a6f4-43c5-a3df-c9cb18b3afa5\") " pod="openstack/horizon-6856678494-4cprv" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.951385 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7dcfdf79-19b4-4ea4-9b16-67d79aa7165e-scripts\") pod \"horizon-67cff9c6-hg2l6\" (UID: \"7dcfdf79-19b4-4ea4-9b16-67d79aa7165e\") " pod="openstack/horizon-67cff9c6-hg2l6" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.951442 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/226317c7-a6f4-43c5-a3df-c9cb18b3afa5-config-data\") pod \"horizon-6856678494-4cprv\" (UID: \"226317c7-a6f4-43c5-a3df-c9cb18b3afa5\") " pod="openstack/horizon-6856678494-4cprv" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.951483 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/226317c7-a6f4-43c5-a3df-c9cb18b3afa5-logs\") pod \"horizon-6856678494-4cprv\" (UID: \"226317c7-a6f4-43c5-a3df-c9cb18b3afa5\") " pod="openstack/horizon-6856678494-4cprv" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.951523 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/7dcfdf79-19b4-4ea4-9b16-67d79aa7165e-logs\") pod \"horizon-67cff9c6-hg2l6\" (UID: \"7dcfdf79-19b4-4ea4-9b16-67d79aa7165e\") " pod="openstack/horizon-67cff9c6-hg2l6" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.951552 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7dcfdf79-19b4-4ea4-9b16-67d79aa7165e-config-data\") pod \"horizon-67cff9c6-hg2l6\" (UID: \"7dcfdf79-19b4-4ea4-9b16-67d79aa7165e\") " pod="openstack/horizon-67cff9c6-hg2l6" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.951582 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dcfdf79-19b4-4ea4-9b16-67d79aa7165e-combined-ca-bundle\") pod \"horizon-67cff9c6-hg2l6\" (UID: \"7dcfdf79-19b4-4ea4-9b16-67d79aa7165e\") " pod="openstack/horizon-67cff9c6-hg2l6" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.951603 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/226317c7-a6f4-43c5-a3df-c9cb18b3afa5-scripts\") pod \"horizon-6856678494-4cprv\" (UID: \"226317c7-a6f4-43c5-a3df-c9cb18b3afa5\") " pod="openstack/horizon-6856678494-4cprv" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.951631 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/226317c7-a6f4-43c5-a3df-c9cb18b3afa5-horizon-tls-certs\") pod \"horizon-6856678494-4cprv\" (UID: \"226317c7-a6f4-43c5-a3df-c9cb18b3afa5\") " pod="openstack/horizon-6856678494-4cprv" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.951657 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7dcfdf79-19b4-4ea4-9b16-67d79aa7165e-horizon-secret-key\") pod \"horizon-67cff9c6-hg2l6\" (UID: \"7dcfdf79-19b4-4ea4-9b16-67d79aa7165e\") " pod="openstack/horizon-67cff9c6-hg2l6" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.951677 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/226317c7-a6f4-43c5-a3df-c9cb18b3afa5-combined-ca-bundle\") pod \"horizon-6856678494-4cprv\" (UID: \"226317c7-a6f4-43c5-a3df-c9cb18b3afa5\") " pod="openstack/horizon-6856678494-4cprv" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.951703 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/226317c7-a6f4-43c5-a3df-c9cb18b3afa5-horizon-secret-key\") pod \"horizon-6856678494-4cprv\" (UID: \"226317c7-a6f4-43c5-a3df-c9cb18b3afa5\") " pod="openstack/horizon-6856678494-4cprv" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.951791 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p44t6\" (UniqueName: \"kubernetes.io/projected/7dcfdf79-19b4-4ea4-9b16-67d79aa7165e-kube-api-access-p44t6\") pod \"horizon-67cff9c6-hg2l6\" (UID: \"7dcfdf79-19b4-4ea4-9b16-67d79aa7165e\") " pod="openstack/horizon-67cff9c6-hg2l6" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.951823 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dcfdf79-19b4-4ea4-9b16-67d79aa7165e-horizon-tls-certs\") pod \"horizon-67cff9c6-hg2l6\" (UID: 
\"7dcfdf79-19b4-4ea4-9b16-67d79aa7165e\") " pod="openstack/horizon-67cff9c6-hg2l6" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.951851 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6w6k2\" (UniqueName: \"kubernetes.io/projected/226317c7-a6f4-43c5-a3df-c9cb18b3afa5-kube-api-access-6w6k2\") pod \"horizon-6856678494-4cprv\" (UID: \"226317c7-a6f4-43c5-a3df-c9cb18b3afa5\") " pod="openstack/horizon-6856678494-4cprv" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.952159 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7dcfdf79-19b4-4ea4-9b16-67d79aa7165e-logs\") pod \"horizon-67cff9c6-hg2l6\" (UID: \"7dcfdf79-19b4-4ea4-9b16-67d79aa7165e\") " pod="openstack/horizon-67cff9c6-hg2l6" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.952327 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/226317c7-a6f4-43c5-a3df-c9cb18b3afa5-logs\") pod \"horizon-6856678494-4cprv\" (UID: \"226317c7-a6f4-43c5-a3df-c9cb18b3afa5\") " pod="openstack/horizon-6856678494-4cprv" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.952475 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7dcfdf79-19b4-4ea4-9b16-67d79aa7165e-scripts\") pod \"horizon-67cff9c6-hg2l6\" (UID: \"7dcfdf79-19b4-4ea4-9b16-67d79aa7165e\") " pod="openstack/horizon-67cff9c6-hg2l6" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.953005 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/226317c7-a6f4-43c5-a3df-c9cb18b3afa5-config-data\") pod \"horizon-6856678494-4cprv\" (UID: \"226317c7-a6f4-43c5-a3df-c9cb18b3afa5\") " pod="openstack/horizon-6856678494-4cprv" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.953455 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7dcfdf79-19b4-4ea4-9b16-67d79aa7165e-config-data\") pod \"horizon-67cff9c6-hg2l6\" (UID: \"7dcfdf79-19b4-4ea4-9b16-67d79aa7165e\") " pod="openstack/horizon-67cff9c6-hg2l6" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.953809 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/226317c7-a6f4-43c5-a3df-c9cb18b3afa5-scripts\") pod \"horizon-6856678494-4cprv\" (UID: \"226317c7-a6f4-43c5-a3df-c9cb18b3afa5\") " pod="openstack/horizon-6856678494-4cprv" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.957820 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7dcfdf79-19b4-4ea4-9b16-67d79aa7165e-horizon-secret-key\") pod \"horizon-67cff9c6-hg2l6\" (UID: \"7dcfdf79-19b4-4ea4-9b16-67d79aa7165e\") " pod="openstack/horizon-67cff9c6-hg2l6" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.958705 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dcfdf79-19b4-4ea4-9b16-67d79aa7165e-combined-ca-bundle\") pod \"horizon-67cff9c6-hg2l6\" (UID: \"7dcfdf79-19b4-4ea4-9b16-67d79aa7165e\") " pod="openstack/horizon-67cff9c6-hg2l6" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.959514 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/226317c7-a6f4-43c5-a3df-c9cb18b3afa5-horizon-secret-key\") pod \"horizon-6856678494-4cprv\" (UID: \"226317c7-a6f4-43c5-a3df-c9cb18b3afa5\") " pod="openstack/horizon-6856678494-4cprv" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.965661 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dcfdf79-19b4-4ea4-9b16-67d79aa7165e-horizon-tls-certs\") pod \"horizon-67cff9c6-hg2l6\" (UID: \"7dcfdf79-19b4-4ea4-9b16-67d79aa7165e\") " pod="openstack/horizon-67cff9c6-hg2l6" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.966663 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/226317c7-a6f4-43c5-a3df-c9cb18b3afa5-combined-ca-bundle\") pod \"horizon-6856678494-4cprv\" (UID: \"226317c7-a6f4-43c5-a3df-c9cb18b3afa5\") " pod="openstack/horizon-6856678494-4cprv" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.970188 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/226317c7-a6f4-43c5-a3df-c9cb18b3afa5-horizon-tls-certs\") pod \"horizon-6856678494-4cprv\" (UID: \"226317c7-a6f4-43c5-a3df-c9cb18b3afa5\") " pod="openstack/horizon-6856678494-4cprv" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.970367 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6w6k2\" (UniqueName: \"kubernetes.io/projected/226317c7-a6f4-43c5-a3df-c9cb18b3afa5-kube-api-access-6w6k2\") pod \"horizon-6856678494-4cprv\" (UID: \"226317c7-a6f4-43c5-a3df-c9cb18b3afa5\") " pod="openstack/horizon-6856678494-4cprv" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.973025 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p44t6\" (UniqueName: \"kubernetes.io/projected/7dcfdf79-19b4-4ea4-9b16-67d79aa7165e-kube-api-access-p44t6\") pod \"horizon-67cff9c6-hg2l6\" (UID: \"7dcfdf79-19b4-4ea4-9b16-67d79aa7165e\") " pod="openstack/horizon-67cff9c6-hg2l6" Dec 02 09:43:53 crc kubenswrapper[4781]: I1202 09:43:53.989473 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-67cff9c6-hg2l6" Dec 02 09:43:54 crc kubenswrapper[4781]: I1202 09:43:54.063657 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6856678494-4cprv" Dec 02 09:44:02 crc kubenswrapper[4781]: I1202 09:43:56.824100 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58dd9ff6bc-n7z2z" Dec 02 09:44:02 crc kubenswrapper[4781]: I1202 09:43:56.887679 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-879pw"] Dec 02 09:44:02 crc kubenswrapper[4781]: I1202 09:43:56.887935 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-879pw" podUID="6efb5424-156a-4d85-865b-f057a1bdf098" containerName="dnsmasq-dns" containerID="cri-o://10dae2396adaf19a5e67a1d6d7232adc47b131f7ec741b75de3e2361e60012ec" gracePeriod=10 Dec 02 09:44:02 crc kubenswrapper[4781]: I1202 09:43:59.714411 4781 generic.go:334] "Generic (PLEG): container finished" podID="6efb5424-156a-4d85-865b-f057a1bdf098" containerID="10dae2396adaf19a5e67a1d6d7232adc47b131f7ec741b75de3e2361e60012ec" exitCode=0 Dec 02 09:44:02 crc kubenswrapper[4781]: I1202 09:43:59.714480 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-879pw" event={"ID":"6efb5424-156a-4d85-865b-f057a1bdf098","Type":"ContainerDied","Data":"10dae2396adaf19a5e67a1d6d7232adc47b131f7ec741b75de3e2361e60012ec"} Dec 02 09:44:02 crc kubenswrapper[4781]: I1202 09:44:00.411996 4781 patch_prober.go:28] interesting pod/machine-config-daemon-pzntm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 09:44:02 crc kubenswrapper[4781]: I1202 09:44:00.412068 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 09:44:02 crc kubenswrapper[4781]: I1202 09:44:01.185828 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-879pw" podUID="6efb5424-156a-4d85-865b-f057a1bdf098" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.114:5353: connect: connection refused" Dec 02 09:44:02 crc kubenswrapper[4781]: E1202 09:44:02.196365 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Dec 02 09:44:02 crc kubenswrapper[4781]: E1202 09:44:02.196821 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j4hqp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-m7kpn_openstack(af2e4c8d-f431-487f-8b70-a2b6e6ee6000): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 09:44:02 crc kubenswrapper[4781]: E1202 09:44:02.198240 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-m7kpn" podUID="af2e4c8d-f431-487f-8b70-a2b6e6ee6000" Dec 02 09:44:02 crc kubenswrapper[4781]: I1202 09:44:02.561835 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6856678494-4cprv"] Dec 02 09:44:02 crc kubenswrapper[4781]: I1202 09:44:02.572333 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-67cff9c6-hg2l6"] Dec 02 09:44:02 crc kubenswrapper[4781]: E1202 09:44:02.756423 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-m7kpn" podUID="af2e4c8d-f431-487f-8b70-a2b6e6ee6000" Dec 02 09:44:03 crc kubenswrapper[4781]: I1202 09:44:03.766976 4781 generic.go:334] "Generic (PLEG): container finished" podID="ce4667b5-2519-4d80-9357-c54ce473d4ec" containerID="a4eec7dea74df0df1e8324cabc97265a7aa624283371aeb6a5533d9215c73d98" exitCode=0 Dec 02 09:44:03 crc kubenswrapper[4781]: I1202 09:44:03.767077 4781 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hp26d" event={"ID":"ce4667b5-2519-4d80-9357-c54ce473d4ec","Type":"ContainerDied","Data":"a4eec7dea74df0df1e8324cabc97265a7aa624283371aeb6a5533d9215c73d98"} Dec 02 09:44:06 crc kubenswrapper[4781]: I1202 09:44:06.185015 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-879pw" podUID="6efb5424-156a-4d85-865b-f057a1bdf098" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.114:5353: connect: connection refused" Dec 02 09:44:10 crc kubenswrapper[4781]: E1202 09:44:10.533369 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 02 09:44:10 crc kubenswrapper[4781]: E1202 09:44:10.534173 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5d4h77h578h59ch695hdbh65dh588h5f7h79h79hf4h85h5bdh586h6dh9bh644hf8h98hbbh68fh696h67ch5bh84h694hdh664h56ch58dh58q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4xv8c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-55c985789-sc66d_openstack(8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 09:44:10 crc kubenswrapper[4781]: E1202 09:44:10.538531 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-55c985789-sc66d" 
podUID="8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151" Dec 02 09:44:16 crc kubenswrapper[4781]: I1202 09:44:16.185351 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-879pw" podUID="6efb5424-156a-4d85-865b-f057a1bdf098" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.114:5353: i/o timeout" Dec 02 09:44:16 crc kubenswrapper[4781]: I1202 09:44:16.186059 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-879pw" Dec 02 09:44:21 crc kubenswrapper[4781]: I1202 09:44:21.186065 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-879pw" podUID="6efb5424-156a-4d85-865b-f057a1bdf098" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.114:5353: i/o timeout" Dec 02 09:44:21 crc kubenswrapper[4781]: E1202 09:44:21.875502 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 02 09:44:21 crc kubenswrapper[4781]: E1202 09:44:21.876024 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5f5h5h597h5f4h64dh57hffh684hbdh567h5c9h5bbh5b7h65ch5d8h64dh5c7hc7h95h69h88h9chdfh6ch55dh576hf9h5ch597hdh58hd9q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vr88t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-78d5dd47fc-gnfbl_openstack(e769104b-928b-4610-851f-edf65d3071be): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 09:44:21 crc kubenswrapper[4781]: E1202 09:44:21.879132 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = 
copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-78d5dd47fc-gnfbl" podUID="e769104b-928b-4610-851f-edf65d3071be" Dec 02 09:44:26 crc kubenswrapper[4781]: E1202 09:44:26.078659 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 02 09:44:26 crc kubenswrapper[4781]: E1202 09:44:26.079635 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n59fh549h54ch58bhfbh67dh55h664h59ch6ch668h5f6hb6h565h5c9h56bh569h588h57bh5d6h57h7bh58bh8fh88hd6h556h66dh65ch8fh78h6bq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cctf6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-6c9cdcd7b5-xrb7p_openstack(6914f46f-a737-4507-9293-c490d4249515): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 09:44:26 crc kubenswrapper[4781]: E1202 09:44:26.083290 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-6c9cdcd7b5-xrb7p" podUID="6914f46f-a737-4507-9293-c490d4249515" Dec 02 09:44:26 crc kubenswrapper[4781]: I1202 09:44:26.188264 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-879pw" podUID="6efb5424-156a-4d85-865b-f057a1bdf098" containerName="dnsmasq-dns" 
probeResult="failure" output="dial tcp 10.217.0.114:5353: i/o timeout" Dec 02 09:44:30 crc kubenswrapper[4781]: I1202 09:44:30.412400 4781 patch_prober.go:28] interesting pod/machine-config-daemon-pzntm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 09:44:30 crc kubenswrapper[4781]: I1202 09:44:30.412898 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 09:44:30 crc kubenswrapper[4781]: I1202 09:44:30.412973 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" Dec 02 09:44:30 crc kubenswrapper[4781]: I1202 09:44:30.413635 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"da048919704efdd3fa5ef85ad71184cb8e182089a953151e1c406db7fb59ab6e"} pod="openshift-machine-config-operator/machine-config-daemon-pzntm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 09:44:30 crc kubenswrapper[4781]: I1202 09:44:30.413689 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" containerID="cri-o://da048919704efdd3fa5ef85ad71184cb8e182089a953151e1c406db7fb59ab6e" gracePeriod=600 Dec 02 09:44:31 crc kubenswrapper[4781]: I1202 09:44:31.189780 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-879pw" podUID="6efb5424-156a-4d85-865b-f057a1bdf098" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.114:5353: i/o timeout" Dec 02 09:44:31 crc kubenswrapper[4781]: I1202 09:44:31.991639 4781 generic.go:334] "Generic (PLEG): container finished" podID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerID="da048919704efdd3fa5ef85ad71184cb8e182089a953151e1c406db7fb59ab6e" exitCode=0 Dec 02 09:44:31 crc kubenswrapper[4781]: I1202 09:44:31.991680 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" event={"ID":"e10258da-dad3-4df8-82c2-9d9438493a3d","Type":"ContainerDied","Data":"da048919704efdd3fa5ef85ad71184cb8e182089a953151e1c406db7fb59ab6e"} Dec 02 09:44:31 crc kubenswrapper[4781]: I1202 09:44:31.994250 4781 scope.go:117] "RemoveContainer" containerID="065de47b4d4ea37363c4ad12a4f2a35f47f5c31ffb6b9b5188b40ed91f236ce5" Dec 02 09:44:33 crc kubenswrapper[4781]: W1202 09:44:33.366023 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod226317c7_a6f4_43c5_a3df_c9cb18b3afa5.slice/crio-b43c6c11df5c25a70a232fa720d3a5c591abe0f55054a8e363efc5482ea97f2d WatchSource:0}: Error finding container b43c6c11df5c25a70a232fa720d3a5c591abe0f55054a8e363efc5482ea97f2d: Status 404 returned error can't find the container with id b43c6c11df5c25a70a232fa720d3a5c591abe0f55054a8e363efc5482ea97f2d Dec 02 09:44:33 crc kubenswrapper[4781]: W1202 09:44:33.371683 4781 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7dcfdf79_19b4_4ea4_9b16_67d79aa7165e.slice/crio-1ebbd24bae131584f2350f9b6736510ed7051f24c06f451acfb3476d5c35fd15 WatchSource:0}: Error finding container 1ebbd24bae131584f2350f9b6736510ed7051f24c06f451acfb3476d5c35fd15: Status 404 returned error can't find the container with id 1ebbd24bae131584f2350f9b6736510ed7051f24c06f451acfb3476d5c35fd15 Dec 02 09:44:33 crc kubenswrapper[4781]: I1202 09:44:33.372154 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 09:44:33 crc kubenswrapper[4781]: I1202 09:44:33.467207 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-hp26d" Dec 02 09:44:33 crc kubenswrapper[4781]: I1202 09:44:33.476293 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-55c985789-sc66d" Dec 02 09:44:33 crc kubenswrapper[4781]: I1202 09:44:33.657692 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ce4667b5-2519-4d80-9357-c54ce473d4ec-fernet-keys\") pod \"ce4667b5-2519-4d80-9357-c54ce473d4ec\" (UID: \"ce4667b5-2519-4d80-9357-c54ce473d4ec\") " Dec 02 09:44:33 crc kubenswrapper[4781]: I1202 09:44:33.657765 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151-config-data\") pod \"8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151\" (UID: \"8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151\") " Dec 02 09:44:33 crc kubenswrapper[4781]: I1202 09:44:33.657805 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xv8c\" (UniqueName: \"kubernetes.io/projected/8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151-kube-api-access-4xv8c\") pod \"8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151\" (UID: \"8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151\") " Dec 02 09:44:33 crc kubenswrapper[4781]: I1202 09:44:33.657864 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce4667b5-2519-4d80-9357-c54ce473d4ec-scripts\") pod \"ce4667b5-2519-4d80-9357-c54ce473d4ec\" (UID: \"ce4667b5-2519-4d80-9357-c54ce473d4ec\") " Dec 02 09:44:33 crc kubenswrapper[4781]: I1202 09:44:33.657941 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2nsb\" (UniqueName: \"kubernetes.io/projected/ce4667b5-2519-4d80-9357-c54ce473d4ec-kube-api-access-c2nsb\") pod \"ce4667b5-2519-4d80-9357-c54ce473d4ec\" (UID: \"ce4667b5-2519-4d80-9357-c54ce473d4ec\") " Dec 02 09:44:33 crc kubenswrapper[4781]: I1202 09:44:33.657969 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151-logs\") pod \"8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151\" (UID: \"8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151\") " Dec 02 09:44:33 crc kubenswrapper[4781]: I1202 09:44:33.657996 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ce4667b5-2519-4d80-9357-c54ce473d4ec-credential-keys\") pod \"ce4667b5-2519-4d80-9357-c54ce473d4ec\" (UID: \"ce4667b5-2519-4d80-9357-c54ce473d4ec\") " Dec 02 09:44:33 crc kubenswrapper[4781]: I1202 09:44:33.658044 4781 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce4667b5-2519-4d80-9357-c54ce473d4ec-combined-ca-bundle\") pod \"ce4667b5-2519-4d80-9357-c54ce473d4ec\" (UID: \"ce4667b5-2519-4d80-9357-c54ce473d4ec\") " Dec 02 09:44:33 crc kubenswrapper[4781]: I1202 09:44:33.658073 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce4667b5-2519-4d80-9357-c54ce473d4ec-config-data\") pod \"ce4667b5-2519-4d80-9357-c54ce473d4ec\" (UID: \"ce4667b5-2519-4d80-9357-c54ce473d4ec\") " Dec 02 09:44:33 crc kubenswrapper[4781]: I1202 09:44:33.658119 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151-horizon-secret-key\") pod \"8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151\" (UID: \"8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151\") " Dec 02 09:44:33 crc kubenswrapper[4781]: I1202 09:44:33.658143 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151-scripts\") pod \"8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151\" (UID: \"8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151\") " Dec 02 09:44:33 crc kubenswrapper[4781]: I1202 09:44:33.659960 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151-logs" (OuterVolumeSpecName: "logs") pod "8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151" (UID: "8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:44:33 crc kubenswrapper[4781]: I1202 09:44:33.660127 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151-config-data" (OuterVolumeSpecName: "config-data") pod "8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151" (UID: "8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:44:33 crc kubenswrapper[4781]: I1202 09:44:33.661506 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151-scripts" (OuterVolumeSpecName: "scripts") pod "8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151" (UID: "8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:44:33 crc kubenswrapper[4781]: I1202 09:44:33.664557 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce4667b5-2519-4d80-9357-c54ce473d4ec-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ce4667b5-2519-4d80-9357-c54ce473d4ec" (UID: "ce4667b5-2519-4d80-9357-c54ce473d4ec"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:44:33 crc kubenswrapper[4781]: I1202 09:44:33.664660 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce4667b5-2519-4d80-9357-c54ce473d4ec-scripts" (OuterVolumeSpecName: "scripts") pod "ce4667b5-2519-4d80-9357-c54ce473d4ec" (UID: "ce4667b5-2519-4d80-9357-c54ce473d4ec"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:44:33 crc kubenswrapper[4781]: I1202 09:44:33.665142 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce4667b5-2519-4d80-9357-c54ce473d4ec-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "ce4667b5-2519-4d80-9357-c54ce473d4ec" (UID: "ce4667b5-2519-4d80-9357-c54ce473d4ec"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:44:33 crc kubenswrapper[4781]: I1202 09:44:33.665611 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151" (UID: "8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:44:33 crc kubenswrapper[4781]: I1202 09:44:33.667229 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151-kube-api-access-4xv8c" (OuterVolumeSpecName: "kube-api-access-4xv8c") pod "8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151" (UID: "8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151"). InnerVolumeSpecName "kube-api-access-4xv8c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:44:33 crc kubenswrapper[4781]: I1202 09:44:33.670286 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce4667b5-2519-4d80-9357-c54ce473d4ec-kube-api-access-c2nsb" (OuterVolumeSpecName: "kube-api-access-c2nsb") pod "ce4667b5-2519-4d80-9357-c54ce473d4ec" (UID: "ce4667b5-2519-4d80-9357-c54ce473d4ec"). InnerVolumeSpecName "kube-api-access-c2nsb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:44:33 crc kubenswrapper[4781]: I1202 09:44:33.688308 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce4667b5-2519-4d80-9357-c54ce473d4ec-config-data" (OuterVolumeSpecName: "config-data") pod "ce4667b5-2519-4d80-9357-c54ce473d4ec" (UID: "ce4667b5-2519-4d80-9357-c54ce473d4ec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:44:33 crc kubenswrapper[4781]: I1202 09:44:33.702822 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce4667b5-2519-4d80-9357-c54ce473d4ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce4667b5-2519-4d80-9357-c54ce473d4ec" (UID: "ce4667b5-2519-4d80-9357-c54ce473d4ec"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:44:33 crc kubenswrapper[4781]: I1202 09:44:33.760873 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2nsb\" (UniqueName: \"kubernetes.io/projected/ce4667b5-2519-4d80-9357-c54ce473d4ec-kube-api-access-c2nsb\") on node \"crc\" DevicePath \"\"" Dec 02 09:44:33 crc kubenswrapper[4781]: I1202 09:44:33.761086 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151-logs\") on node \"crc\" DevicePath \"\"" Dec 02 09:44:33 crc kubenswrapper[4781]: I1202 09:44:33.761194 4781 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ce4667b5-2519-4d80-9357-c54ce473d4ec-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 02 09:44:33 crc kubenswrapper[4781]: I1202 09:44:33.761264 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce4667b5-2519-4d80-9357-c54ce473d4ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:44:33 crc kubenswrapper[4781]: I1202 09:44:33.761326 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce4667b5-2519-4d80-9357-c54ce473d4ec-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:44:33 crc kubenswrapper[4781]: I1202 09:44:33.761387 4781 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 02 09:44:33 crc kubenswrapper[4781]: I1202 09:44:33.761449 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:44:33 crc kubenswrapper[4781]: I1202 09:44:33.761509 4781 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ce4667b5-2519-4d80-9357-c54ce473d4ec-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 02 09:44:33 crc kubenswrapper[4781]: I1202 09:44:33.761574 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:44:33 crc kubenswrapper[4781]: I1202 09:44:33.761721 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xv8c\" (UniqueName: \"kubernetes.io/projected/8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151-kube-api-access-4xv8c\") on node \"crc\" DevicePath \"\"" Dec 02 09:44:33 crc kubenswrapper[4781]: I1202 09:44:33.761776 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce4667b5-2519-4d80-9357-c54ce473d4ec-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:44:34 crc kubenswrapper[4781]: I1202 09:44:34.011590 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6856678494-4cprv" event={"ID":"226317c7-a6f4-43c5-a3df-c9cb18b3afa5","Type":"ContainerStarted","Data":"b43c6c11df5c25a70a232fa720d3a5c591abe0f55054a8e363efc5482ea97f2d"} Dec 02 09:44:34 crc kubenswrapper[4781]: I1202 09:44:34.012910 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55c985789-sc66d" 
event={"ID":"8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151","Type":"ContainerDied","Data":"f5636b38736faf68c58b387d6d592b7559cb718e71dd665f2518f235c553b6ac"} Dec 02 09:44:34 crc kubenswrapper[4781]: I1202 09:44:34.012964 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-55c985789-sc66d" Dec 02 09:44:34 crc kubenswrapper[4781]: I1202 09:44:34.019680 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67cff9c6-hg2l6" event={"ID":"7dcfdf79-19b4-4ea4-9b16-67d79aa7165e","Type":"ContainerStarted","Data":"1ebbd24bae131584f2350f9b6736510ed7051f24c06f451acfb3476d5c35fd15"} Dec 02 09:44:34 crc kubenswrapper[4781]: I1202 09:44:34.021527 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hp26d" event={"ID":"ce4667b5-2519-4d80-9357-c54ce473d4ec","Type":"ContainerDied","Data":"8eaeb73315413bb5d6b1e5c8dff70dc437b544f1ed33144465d27a3728e0044c"} Dec 02 09:44:34 crc kubenswrapper[4781]: I1202 09:44:34.021551 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8eaeb73315413bb5d6b1e5c8dff70dc437b544f1ed33144465d27a3728e0044c" Dec 02 09:44:34 crc kubenswrapper[4781]: I1202 09:44:34.021584 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-hp26d" Dec 02 09:44:34 crc kubenswrapper[4781]: I1202 09:44:34.076061 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-55c985789-sc66d"] Dec 02 09:44:34 crc kubenswrapper[4781]: I1202 09:44:34.086863 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-55c985789-sc66d"] Dec 02 09:44:34 crc kubenswrapper[4781]: I1202 09:44:34.543073 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-hp26d"] Dec 02 09:44:34 crc kubenswrapper[4781]: I1202 09:44:34.550387 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-hp26d"] Dec 02 09:44:34 crc kubenswrapper[4781]: I1202 09:44:34.658075 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-sqn2b"] Dec 02 09:44:34 crc kubenswrapper[4781]: E1202 09:44:34.658554 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce4667b5-2519-4d80-9357-c54ce473d4ec" containerName="keystone-bootstrap" Dec 02 09:44:34 crc kubenswrapper[4781]: I1202 09:44:34.658579 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce4667b5-2519-4d80-9357-c54ce473d4ec" containerName="keystone-bootstrap" Dec 02 09:44:34 crc kubenswrapper[4781]: I1202 09:44:34.658795 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce4667b5-2519-4d80-9357-c54ce473d4ec" containerName="keystone-bootstrap" Dec 02 09:44:34 crc kubenswrapper[4781]: I1202 09:44:34.659591 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-sqn2b" Dec 02 09:44:34 crc kubenswrapper[4781]: I1202 09:44:34.661910 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 02 09:44:34 crc kubenswrapper[4781]: I1202 09:44:34.665509 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-q5lzd" Dec 02 09:44:34 crc kubenswrapper[4781]: I1202 09:44:34.665706 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 02 09:44:34 crc kubenswrapper[4781]: I1202 09:44:34.665880 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 02 09:44:34 crc kubenswrapper[4781]: I1202 09:44:34.666048 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 02 09:44:34 crc kubenswrapper[4781]: I1202 09:44:34.675731 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-sqn2b"] Dec 02 09:44:34 crc kubenswrapper[4781]: I1202 09:44:34.778740 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/855c82ea-ccc1-4981-b82d-b2d9aac387a6-credential-keys\") pod \"keystone-bootstrap-sqn2b\" (UID: \"855c82ea-ccc1-4981-b82d-b2d9aac387a6\") " pod="openstack/keystone-bootstrap-sqn2b" Dec 02 09:44:34 crc kubenswrapper[4781]: I1202 09:44:34.778812 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/855c82ea-ccc1-4981-b82d-b2d9aac387a6-fernet-keys\") pod \"keystone-bootstrap-sqn2b\" (UID: \"855c82ea-ccc1-4981-b82d-b2d9aac387a6\") " pod="openstack/keystone-bootstrap-sqn2b" Dec 02 09:44:34 crc kubenswrapper[4781]: I1202 09:44:34.778855 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69b6s\" (UniqueName: \"kubernetes.io/projected/855c82ea-ccc1-4981-b82d-b2d9aac387a6-kube-api-access-69b6s\") pod \"keystone-bootstrap-sqn2b\" (UID: \"855c82ea-ccc1-4981-b82d-b2d9aac387a6\") " pod="openstack/keystone-bootstrap-sqn2b" Dec 02 09:44:34 crc kubenswrapper[4781]: I1202 09:44:34.778983 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/855c82ea-ccc1-4981-b82d-b2d9aac387a6-config-data\") pod \"keystone-bootstrap-sqn2b\" (UID: \"855c82ea-ccc1-4981-b82d-b2d9aac387a6\") " pod="openstack/keystone-bootstrap-sqn2b" Dec 02 09:44:34 crc kubenswrapper[4781]: I1202 09:44:34.779089 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/855c82ea-ccc1-4981-b82d-b2d9aac387a6-combined-ca-bundle\") pod \"keystone-bootstrap-sqn2b\" (UID: \"855c82ea-ccc1-4981-b82d-b2d9aac387a6\") " pod="openstack/keystone-bootstrap-sqn2b" Dec 02 09:44:34 crc kubenswrapper[4781]: I1202 09:44:34.779120 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/855c82ea-ccc1-4981-b82d-b2d9aac387a6-scripts\") pod \"keystone-bootstrap-sqn2b\" (UID: \"855c82ea-ccc1-4981-b82d-b2d9aac387a6\") " pod="openstack/keystone-bootstrap-sqn2b" Dec 02 09:44:34 crc kubenswrapper[4781]: I1202 09:44:34.882614 4781 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/855c82ea-ccc1-4981-b82d-b2d9aac387a6-config-data\") pod \"keystone-bootstrap-sqn2b\" (UID: \"855c82ea-ccc1-4981-b82d-b2d9aac387a6\") " pod="openstack/keystone-bootstrap-sqn2b" Dec 02 09:44:34 crc kubenswrapper[4781]: I1202 09:44:34.882767 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/855c82ea-ccc1-4981-b82d-b2d9aac387a6-combined-ca-bundle\") pod \"keystone-bootstrap-sqn2b\" (UID: \"855c82ea-ccc1-4981-b82d-b2d9aac387a6\") " pod="openstack/keystone-bootstrap-sqn2b" Dec 02 09:44:34 crc kubenswrapper[4781]: I1202 09:44:34.882802 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/855c82ea-ccc1-4981-b82d-b2d9aac387a6-scripts\") pod \"keystone-bootstrap-sqn2b\" (UID: \"855c82ea-ccc1-4981-b82d-b2d9aac387a6\") " pod="openstack/keystone-bootstrap-sqn2b" Dec 02 09:44:34 crc kubenswrapper[4781]: I1202 09:44:34.882835 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/855c82ea-ccc1-4981-b82d-b2d9aac387a6-credential-keys\") pod \"keystone-bootstrap-sqn2b\" (UID: \"855c82ea-ccc1-4981-b82d-b2d9aac387a6\") " pod="openstack/keystone-bootstrap-sqn2b" Dec 02 09:44:34 crc kubenswrapper[4781]: I1202 09:44:34.882866 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/855c82ea-ccc1-4981-b82d-b2d9aac387a6-fernet-keys\") pod \"keystone-bootstrap-sqn2b\" (UID: \"855c82ea-ccc1-4981-b82d-b2d9aac387a6\") " pod="openstack/keystone-bootstrap-sqn2b" Dec 02 09:44:34 crc kubenswrapper[4781]: I1202 09:44:34.882909 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69b6s\" (UniqueName: \"kubernetes.io/projected/855c82ea-ccc1-4981-b82d-b2d9aac387a6-kube-api-access-69b6s\") pod \"keystone-bootstrap-sqn2b\" (UID: \"855c82ea-ccc1-4981-b82d-b2d9aac387a6\") " pod="openstack/keystone-bootstrap-sqn2b" Dec 02 09:44:34 crc kubenswrapper[4781]: I1202 09:44:34.889336 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/855c82ea-ccc1-4981-b82d-b2d9aac387a6-credential-keys\") pod \"keystone-bootstrap-sqn2b\" (UID: \"855c82ea-ccc1-4981-b82d-b2d9aac387a6\") " pod="openstack/keystone-bootstrap-sqn2b" Dec 02 09:44:34 crc kubenswrapper[4781]: I1202 09:44:34.889663 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/855c82ea-ccc1-4981-b82d-b2d9aac387a6-scripts\") pod \"keystone-bootstrap-sqn2b\" (UID: \"855c82ea-ccc1-4981-b82d-b2d9aac387a6\") " pod="openstack/keystone-bootstrap-sqn2b" Dec 02 09:44:34 crc kubenswrapper[4781]: I1202 09:44:34.889719 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/855c82ea-ccc1-4981-b82d-b2d9aac387a6-fernet-keys\") pod \"keystone-bootstrap-sqn2b\" (UID: \"855c82ea-ccc1-4981-b82d-b2d9aac387a6\") " pod="openstack/keystone-bootstrap-sqn2b" Dec 02 09:44:34 crc kubenswrapper[4781]: I1202 09:44:34.889719 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/855c82ea-ccc1-4981-b82d-b2d9aac387a6-config-data\") pod \"keystone-bootstrap-sqn2b\" (UID: \"855c82ea-ccc1-4981-b82d-b2d9aac387a6\") " 
pod="openstack/keystone-bootstrap-sqn2b" Dec 02 09:44:34 crc kubenswrapper[4781]: I1202 09:44:34.901640 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/855c82ea-ccc1-4981-b82d-b2d9aac387a6-combined-ca-bundle\") pod \"keystone-bootstrap-sqn2b\" (UID: \"855c82ea-ccc1-4981-b82d-b2d9aac387a6\") " pod="openstack/keystone-bootstrap-sqn2b" Dec 02 09:44:34 crc kubenswrapper[4781]: I1202 09:44:34.903536 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69b6s\" (UniqueName: \"kubernetes.io/projected/855c82ea-ccc1-4981-b82d-b2d9aac387a6-kube-api-access-69b6s\") pod \"keystone-bootstrap-sqn2b\" (UID: \"855c82ea-ccc1-4981-b82d-b2d9aac387a6\") " pod="openstack/keystone-bootstrap-sqn2b" Dec 02 09:44:35 crc kubenswrapper[4781]: I1202 09:44:35.021427 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-sqn2b" Dec 02 09:44:35 crc kubenswrapper[4781]: I1202 09:44:35.520955 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151" path="/var/lib/kubelet/pods/8cc4edc4-a0d8-4f65-b3aa-d1ef237e6151/volumes" Dec 02 09:44:35 crc kubenswrapper[4781]: I1202 09:44:35.521426 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce4667b5-2519-4d80-9357-c54ce473d4ec" path="/var/lib/kubelet/pods/ce4667b5-2519-4d80-9357-c54ce473d4ec/volumes" Dec 02 09:44:35 crc kubenswrapper[4781]: E1202 09:44:35.639979 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Dec 02 09:44:35 crc kubenswrapper[4781]: E1202 09:44:35.640272 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h4xvv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-bq87j_openstack(9418aedd-84eb-4ff9-89f8-831695e5471e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 09:44:35 crc kubenswrapper[4781]: E1202 09:44:35.641492 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-bq87j" podUID="9418aedd-84eb-4ff9-89f8-831695e5471e" Dec 02 09:44:36 crc kubenswrapper[4781]: E1202 09:44:36.037045 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-bq87j" podUID="9418aedd-84eb-4ff9-89f8-831695e5471e" Dec 02 09:44:36 crc kubenswrapper[4781]: I1202 09:44:36.191083 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-879pw" podUID="6efb5424-156a-4d85-865b-f057a1bdf098" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.114:5353: i/o timeout" Dec 02 09:44:36 crc kubenswrapper[4781]: E1202 09:44:36.194733 4781 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Dec 02 09:44:36 crc kubenswrapper[4781]: E1202 09:44:36.194938 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n8ch5b4hb6h5f4h5dch669h57h645h7ch688h8h659hd8hdch5fdh5fh594hc4h665h85h58bh586h5dfh5f4h645h68dh98h84hd7h5bbh669h5fbq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f8tv4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(c98b4bc6-c086-4663-b794-92a36b0da2ba): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 09:44:36 crc kubenswrapper[4781]: I1202 09:44:36.224862 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6c9cdcd7b5-xrb7p" Dec 02 09:44:36 crc kubenswrapper[4781]: I1202 09:44:36.236444 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-78d5dd47fc-gnfbl" Dec 02 09:44:36 crc kubenswrapper[4781]: I1202 09:44:36.306107 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6914f46f-a737-4507-9293-c490d4249515-config-data\") pod \"6914f46f-a737-4507-9293-c490d4249515\" (UID: \"6914f46f-a737-4507-9293-c490d4249515\") " Dec 02 09:44:36 crc kubenswrapper[4781]: I1202 09:44:36.306199 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6914f46f-a737-4507-9293-c490d4249515-logs\") pod \"6914f46f-a737-4507-9293-c490d4249515\" (UID: \"6914f46f-a737-4507-9293-c490d4249515\") " Dec 02 09:44:36 crc kubenswrapper[4781]: I1202 09:44:36.306230 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6914f46f-a737-4507-9293-c490d4249515-horizon-secret-key\") pod \"6914f46f-a737-4507-9293-c490d4249515\" (UID: \"6914f46f-a737-4507-9293-c490d4249515\") " Dec 02 09:44:36 crc kubenswrapper[4781]: I1202 09:44:36.306254 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6914f46f-a737-4507-9293-c490d4249515-scripts\") pod \"6914f46f-a737-4507-9293-c490d4249515\" (UID: \"6914f46f-a737-4507-9293-c490d4249515\") " Dec 02 09:44:36 crc kubenswrapper[4781]: I1202 09:44:36.306287 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cctf6\" (UniqueName: \"kubernetes.io/projected/6914f46f-a737-4507-9293-c490d4249515-kube-api-access-cctf6\") pod \"6914f46f-a737-4507-9293-c490d4249515\" (UID: \"6914f46f-a737-4507-9293-c490d4249515\") " Dec 02 09:44:36 crc kubenswrapper[4781]: I1202 09:44:36.306891 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6914f46f-a737-4507-9293-c490d4249515-config-data" (OuterVolumeSpecName: "config-data") pod "6914f46f-a737-4507-9293-c490d4249515" (UID: "6914f46f-a737-4507-9293-c490d4249515"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:44:36 crc kubenswrapper[4781]: I1202 09:44:36.306974 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6914f46f-a737-4507-9293-c490d4249515-logs" (OuterVolumeSpecName: "logs") pod "6914f46f-a737-4507-9293-c490d4249515" (UID: "6914f46f-a737-4507-9293-c490d4249515"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:44:36 crc kubenswrapper[4781]: I1202 09:44:36.307096 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6914f46f-a737-4507-9293-c490d4249515-scripts" (OuterVolumeSpecName: "scripts") pod "6914f46f-a737-4507-9293-c490d4249515" (UID: "6914f46f-a737-4507-9293-c490d4249515"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:44:36 crc kubenswrapper[4781]: I1202 09:44:36.317128 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6914f46f-a737-4507-9293-c490d4249515-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "6914f46f-a737-4507-9293-c490d4249515" (UID: "6914f46f-a737-4507-9293-c490d4249515"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:44:36 crc kubenswrapper[4781]: I1202 09:44:36.318476 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6914f46f-a737-4507-9293-c490d4249515-kube-api-access-cctf6" (OuterVolumeSpecName: "kube-api-access-cctf6") pod "6914f46f-a737-4507-9293-c490d4249515" (UID: "6914f46f-a737-4507-9293-c490d4249515"). InnerVolumeSpecName "kube-api-access-cctf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:44:36 crc kubenswrapper[4781]: I1202 09:44:36.408083 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vr88t\" (UniqueName: \"kubernetes.io/projected/e769104b-928b-4610-851f-edf65d3071be-kube-api-access-vr88t\") pod \"e769104b-928b-4610-851f-edf65d3071be\" (UID: \"e769104b-928b-4610-851f-edf65d3071be\") " Dec 02 09:44:36 crc kubenswrapper[4781]: I1202 09:44:36.408200 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e769104b-928b-4610-851f-edf65d3071be-config-data\") pod \"e769104b-928b-4610-851f-edf65d3071be\" (UID: \"e769104b-928b-4610-851f-edf65d3071be\") " Dec 02 09:44:36 crc kubenswrapper[4781]: I1202 09:44:36.408269 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e769104b-928b-4610-851f-edf65d3071be-logs\") pod \"e769104b-928b-4610-851f-edf65d3071be\" (UID: \"e769104b-928b-4610-851f-edf65d3071be\") " Dec 02 09:44:36 crc kubenswrapper[4781]: I1202 09:44:36.408364 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e769104b-928b-4610-851f-edf65d3071be-scripts\") pod \"e769104b-928b-4610-851f-edf65d3071be\" (UID: \"e769104b-928b-4610-851f-edf65d3071be\") " Dec 02 09:44:36 crc kubenswrapper[4781]: I1202 09:44:36.408441 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e769104b-928b-4610-851f-edf65d3071be-horizon-secret-key\") pod \"e769104b-928b-4610-851f-edf65d3071be\" (UID: \"e769104b-928b-4610-851f-edf65d3071be\") " Dec 02 09:44:36 crc kubenswrapper[4781]: I1202 09:44:36.408935 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6914f46f-a737-4507-9293-c490d4249515-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:44:36 crc kubenswrapper[4781]: I1202 09:44:36.408960 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6914f46f-a737-4507-9293-c490d4249515-logs\") on node \"crc\" DevicePath \"\"" Dec 02 09:44:36 crc kubenswrapper[4781]: I1202 09:44:36.408973 4781 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6914f46f-a737-4507-9293-c490d4249515-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 02 09:44:36 crc kubenswrapper[4781]: I1202 09:44:36.408988 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6914f46f-a737-4507-9293-c490d4249515-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:44:36 crc kubenswrapper[4781]: I1202 09:44:36.408998 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cctf6\" (UniqueName: \"kubernetes.io/projected/6914f46f-a737-4507-9293-c490d4249515-kube-api-access-cctf6\") on 
node \"crc\" DevicePath \"\"" Dec 02 09:44:36 crc kubenswrapper[4781]: I1202 09:44:36.409207 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e769104b-928b-4610-851f-edf65d3071be-logs" (OuterVolumeSpecName: "logs") pod "e769104b-928b-4610-851f-edf65d3071be" (UID: "e769104b-928b-4610-851f-edf65d3071be"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:44:36 crc kubenswrapper[4781]: I1202 09:44:36.409444 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e769104b-928b-4610-851f-edf65d3071be-scripts" (OuterVolumeSpecName: "scripts") pod "e769104b-928b-4610-851f-edf65d3071be" (UID: "e769104b-928b-4610-851f-edf65d3071be"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:44:36 crc kubenswrapper[4781]: I1202 09:44:36.409609 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e769104b-928b-4610-851f-edf65d3071be-config-data" (OuterVolumeSpecName: "config-data") pod "e769104b-928b-4610-851f-edf65d3071be" (UID: "e769104b-928b-4610-851f-edf65d3071be"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:44:36 crc kubenswrapper[4781]: I1202 09:44:36.412989 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e769104b-928b-4610-851f-edf65d3071be-kube-api-access-vr88t" (OuterVolumeSpecName: "kube-api-access-vr88t") pod "e769104b-928b-4610-851f-edf65d3071be" (UID: "e769104b-928b-4610-851f-edf65d3071be"). InnerVolumeSpecName "kube-api-access-vr88t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:44:36 crc kubenswrapper[4781]: I1202 09:44:36.413575 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e769104b-928b-4610-851f-edf65d3071be-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "e769104b-928b-4610-851f-edf65d3071be" (UID: "e769104b-928b-4610-851f-edf65d3071be"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:44:36 crc kubenswrapper[4781]: I1202 09:44:36.510288 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e769104b-928b-4610-851f-edf65d3071be-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:44:36 crc kubenswrapper[4781]: I1202 09:44:36.510319 4781 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e769104b-928b-4610-851f-edf65d3071be-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 02 09:44:36 crc kubenswrapper[4781]: I1202 09:44:36.510333 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vr88t\" (UniqueName: \"kubernetes.io/projected/e769104b-928b-4610-851f-edf65d3071be-kube-api-access-vr88t\") on node \"crc\" DevicePath \"\"" Dec 02 09:44:36 crc kubenswrapper[4781]: I1202 09:44:36.510345 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e769104b-928b-4610-851f-edf65d3071be-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:44:36 crc kubenswrapper[4781]: I1202 09:44:36.510356 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e769104b-928b-4610-851f-edf65d3071be-logs\") on node \"crc\" DevicePath \"\"" Dec 02 09:44:36 crc kubenswrapper[4781]: E1202 09:44:36.646978 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Dec 02 09:44:36 crc kubenswrapper[4781]: E1202 09:44:36.647129 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vq9zt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-br5j5_openstack(51f721e2-e8fd-4ae1-89d9-fe7272e8246e): ErrImagePull: rpc error: code = Canceled desc = copying 
config: context canceled" logger="UnhandledError" Dec 02 09:44:36 crc kubenswrapper[4781]: E1202 09:44:36.648288 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-br5j5" podUID="51f721e2-e8fd-4ae1-89d9-fe7272e8246e" Dec 02 09:44:36 crc kubenswrapper[4781]: I1202 09:44:36.682478 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-879pw" Dec 02 09:44:36 crc kubenswrapper[4781]: I1202 09:44:36.815066 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsl76\" (UniqueName: \"kubernetes.io/projected/6efb5424-156a-4d85-865b-f057a1bdf098-kube-api-access-qsl76\") pod \"6efb5424-156a-4d85-865b-f057a1bdf098\" (UID: \"6efb5424-156a-4d85-865b-f057a1bdf098\") " Dec 02 09:44:36 crc kubenswrapper[4781]: I1202 09:44:36.815130 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6efb5424-156a-4d85-865b-f057a1bdf098-config\") pod \"6efb5424-156a-4d85-865b-f057a1bdf098\" (UID: \"6efb5424-156a-4d85-865b-f057a1bdf098\") " Dec 02 09:44:36 crc kubenswrapper[4781]: I1202 09:44:36.815200 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6efb5424-156a-4d85-865b-f057a1bdf098-dns-svc\") pod \"6efb5424-156a-4d85-865b-f057a1bdf098\" (UID: \"6efb5424-156a-4d85-865b-f057a1bdf098\") " Dec 02 09:44:36 crc kubenswrapper[4781]: I1202 09:44:36.815260 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6efb5424-156a-4d85-865b-f057a1bdf098-ovsdbserver-nb\") pod \"6efb5424-156a-4d85-865b-f057a1bdf098\" (UID: \"6efb5424-156a-4d85-865b-f057a1bdf098\") " Dec 02 09:44:36 crc kubenswrapper[4781]: I1202 09:44:36.815317 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6efb5424-156a-4d85-865b-f057a1bdf098-ovsdbserver-sb\") pod \"6efb5424-156a-4d85-865b-f057a1bdf098\" (UID: \"6efb5424-156a-4d85-865b-f057a1bdf098\") " Dec 02 09:44:36 crc kubenswrapper[4781]: I1202 09:44:36.822239 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6efb5424-156a-4d85-865b-f057a1bdf098-kube-api-access-qsl76" (OuterVolumeSpecName: "kube-api-access-qsl76") pod "6efb5424-156a-4d85-865b-f057a1bdf098" (UID: "6efb5424-156a-4d85-865b-f057a1bdf098"). InnerVolumeSpecName "kube-api-access-qsl76". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:44:36 crc kubenswrapper[4781]: I1202 09:44:36.900226 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6efb5424-156a-4d85-865b-f057a1bdf098-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6efb5424-156a-4d85-865b-f057a1bdf098" (UID: "6efb5424-156a-4d85-865b-f057a1bdf098"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:44:36 crc kubenswrapper[4781]: I1202 09:44:36.917326 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6efb5424-156a-4d85-865b-f057a1bdf098-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6efb5424-156a-4d85-865b-f057a1bdf098" (UID: "6efb5424-156a-4d85-865b-f057a1bdf098"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:44:36 crc kubenswrapper[4781]: I1202 09:44:36.920339 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsl76\" (UniqueName: \"kubernetes.io/projected/6efb5424-156a-4d85-865b-f057a1bdf098-kube-api-access-qsl76\") on node \"crc\" DevicePath \"\"" Dec 02 09:44:36 crc kubenswrapper[4781]: I1202 09:44:36.920409 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6efb5424-156a-4d85-865b-f057a1bdf098-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 09:44:36 crc kubenswrapper[4781]: I1202 09:44:36.920424 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6efb5424-156a-4d85-865b-f057a1bdf098-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 09:44:36 crc kubenswrapper[4781]: I1202 09:44:36.928937 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6efb5424-156a-4d85-865b-f057a1bdf098-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6efb5424-156a-4d85-865b-f057a1bdf098" (UID: "6efb5424-156a-4d85-865b-f057a1bdf098"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:44:36 crc kubenswrapper[4781]: I1202 09:44:36.957762 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6efb5424-156a-4d85-865b-f057a1bdf098-config" (OuterVolumeSpecName: "config") pod "6efb5424-156a-4d85-865b-f057a1bdf098" (UID: "6efb5424-156a-4d85-865b-f057a1bdf098"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:44:37 crc kubenswrapper[4781]: I1202 09:44:37.021733 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6efb5424-156a-4d85-865b-f057a1bdf098-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:44:37 crc kubenswrapper[4781]: I1202 09:44:37.021770 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6efb5424-156a-4d85-865b-f057a1bdf098-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 09:44:37 crc kubenswrapper[4781]: I1202 09:44:37.054216 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-m7kpn" event={"ID":"af2e4c8d-f431-487f-8b70-a2b6e6ee6000","Type":"ContainerStarted","Data":"544d424dc3bd43c5b3a55a41876f880fe1ebd6d93cf6b42073aad9a19a8c465a"} Dec 02 09:44:37 crc kubenswrapper[4781]: I1202 09:44:37.061601 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c9cdcd7b5-xrb7p" event={"ID":"6914f46f-a737-4507-9293-c490d4249515","Type":"ContainerDied","Data":"4378cbb55460cd97c293658f3e9bd4d40bdf78d7230054b47c52e83e4edbb6cb"} Dec 02 09:44:37 crc kubenswrapper[4781]: I1202 09:44:37.061611 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6c9cdcd7b5-xrb7p" Dec 02 09:44:37 crc kubenswrapper[4781]: I1202 09:44:37.084063 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-m7kpn" podStartSLOduration=2.358288341 podStartE2EDuration="53.084043696s" podCreationTimestamp="2025-12-02 09:43:44 +0000 UTC" firstStartedPulling="2025-12-02 09:43:46.033761999 +0000 UTC m=+1388.857635878" lastFinishedPulling="2025-12-02 09:44:36.759517354 +0000 UTC m=+1439.583391233" observedRunningTime="2025-12-02 09:44:37.076844804 +0000 UTC m=+1439.900718683" watchObservedRunningTime="2025-12-02 09:44:37.084043696 +0000 UTC m=+1439.907917575" Dec 02 09:44:37 crc kubenswrapper[4781]: I1202 09:44:37.084112 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-78d5dd47fc-gnfbl" Dec 02 09:44:37 crc kubenswrapper[4781]: I1202 09:44:37.084114 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-78d5dd47fc-gnfbl" event={"ID":"e769104b-928b-4610-851f-edf65d3071be","Type":"ContainerDied","Data":"e360649384178cdfefe677143ff2e32122a2085c16e2cfc87b2ba4a6618c7918"} Dec 02 09:44:37 crc kubenswrapper[4781]: I1202 09:44:37.090872 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" event={"ID":"e10258da-dad3-4df8-82c2-9d9438493a3d","Type":"ContainerStarted","Data":"311057b514d104f4f4ba1bad8b86e944a9708a86065bf40bd45c3f422ad024ed"} Dec 02 09:44:37 crc kubenswrapper[4781]: I1202 09:44:37.099799 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-879pw" Dec 02 09:44:37 crc kubenswrapper[4781]: I1202 09:44:37.101598 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-879pw" event={"ID":"6efb5424-156a-4d85-865b-f057a1bdf098","Type":"ContainerDied","Data":"3c4c5cfea9d4d1f6f7c5b7ab6b1f112791b5cffaddcd437c93fe69d2509c1dc0"} Dec 02 09:44:37 crc kubenswrapper[4781]: I1202 09:44:37.101740 4781 scope.go:117] "RemoveContainer" containerID="10dae2396adaf19a5e67a1d6d7232adc47b131f7ec741b75de3e2361e60012ec" Dec 02 09:44:37 crc kubenswrapper[4781]: E1202 09:44:37.102623 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-br5j5" podUID="51f721e2-e8fd-4ae1-89d9-fe7272e8246e" Dec 02 09:44:37 crc kubenswrapper[4781]: I1202 09:44:37.145481 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6c9cdcd7b5-xrb7p"] Dec 02 09:44:37 crc kubenswrapper[4781]: I1202 09:44:37.148672 4781 scope.go:117] "RemoveContainer" containerID="8e409c4c46bd9771662c67f2104628cb7084c94e502d3be7ccde3886234402e4" Dec 02 09:44:37 crc kubenswrapper[4781]: I1202 09:44:37.173742 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6c9cdcd7b5-xrb7p"] Dec 02 09:44:37 crc kubenswrapper[4781]: I1202 09:44:37.237911 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-sqn2b"] Dec 02 09:44:37 crc kubenswrapper[4781]: I1202 09:44:37.400063 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-78d5dd47fc-gnfbl"] Dec 02 09:44:37 crc kubenswrapper[4781]: I1202 09:44:37.411141 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/horizon-78d5dd47fc-gnfbl"] Dec 02 09:44:37 crc kubenswrapper[4781]: I1202 09:44:37.423262 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-879pw"] Dec 02 09:44:37 crc kubenswrapper[4781]: I1202 09:44:37.433273 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-879pw"] Dec 02 09:44:37 crc kubenswrapper[4781]: I1202 09:44:37.550897 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6914f46f-a737-4507-9293-c490d4249515" path="/var/lib/kubelet/pods/6914f46f-a737-4507-9293-c490d4249515/volumes" Dec 02 09:44:37 crc kubenswrapper[4781]: I1202 09:44:37.551693 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6efb5424-156a-4d85-865b-f057a1bdf098" path="/var/lib/kubelet/pods/6efb5424-156a-4d85-865b-f057a1bdf098/volumes" Dec 02 09:44:37 crc kubenswrapper[4781]: I1202 09:44:37.552982 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e769104b-928b-4610-851f-edf65d3071be" path="/var/lib/kubelet/pods/e769104b-928b-4610-851f-edf65d3071be/volumes" Dec 02 09:44:38 crc kubenswrapper[4781]: I1202 09:44:38.109111 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6856678494-4cprv" event={"ID":"226317c7-a6f4-43c5-a3df-c9cb18b3afa5","Type":"ContainerStarted","Data":"05131940adb78458c8c17dda4a6656613263742a5815faf81eced06a09712afc"} Dec 02 09:44:38 crc kubenswrapper[4781]: I1202 09:44:38.109153 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6856678494-4cprv" event={"ID":"226317c7-a6f4-43c5-a3df-c9cb18b3afa5","Type":"ContainerStarted","Data":"a7976d793a9d466aaf1b258cbc9fda24e901d4b9f87465ffa27c39e8bf42a907"} Dec 02 09:44:38 crc kubenswrapper[4781]: I1202 09:44:38.111510 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67cff9c6-hg2l6" event={"ID":"7dcfdf79-19b4-4ea4-9b16-67d79aa7165e","Type":"ContainerStarted","Data":"48b12156148830c711ba4ea9c2a408455c26ae2b9baec0f6f03c6f3d2a1699e3"} Dec 02 09:44:38 crc kubenswrapper[4781]: I1202 09:44:38.111544 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67cff9c6-hg2l6" event={"ID":"7dcfdf79-19b4-4ea4-9b16-67d79aa7165e","Type":"ContainerStarted","Data":"e4f8d80de8d7fdd7aed53bd0df8818adc65f193e5a52808c1a72181a16906549"} Dec 02 09:44:38 crc kubenswrapper[4781]: I1202 09:44:38.117558 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sqn2b" event={"ID":"855c82ea-ccc1-4981-b82d-b2d9aac387a6","Type":"ContainerStarted","Data":"c3513b69cc4c77c6c72076cad2760bf16cc2fb9525a398d12abefcdfdfd44aa7"} Dec 02 09:44:38 crc kubenswrapper[4781]: I1202 09:44:38.117614 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sqn2b" event={"ID":"855c82ea-ccc1-4981-b82d-b2d9aac387a6","Type":"ContainerStarted","Data":"4646deb2fd10ccd1e2e86ecc0b17eba4d0469f3a498b8e93a1945d4caf5baea3"} Dec 02 09:44:38 crc kubenswrapper[4781]: I1202 09:44:38.131236 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6856678494-4cprv" podStartSLOduration=41.728869881 podStartE2EDuration="45.131220199s" podCreationTimestamp="2025-12-02 09:43:53 +0000 UTC" firstStartedPulling="2025-12-02 09:44:33.371748764 +0000 UTC m=+1436.195622653" lastFinishedPulling="2025-12-02 09:44:36.774099092 +0000 UTC m=+1439.597972971" observedRunningTime="2025-12-02 09:44:38.126189665 +0000 UTC m=+1440.950063564" 
watchObservedRunningTime="2025-12-02 09:44:38.131220199 +0000 UTC m=+1440.955094078" Dec 02 09:44:38 crc kubenswrapper[4781]: I1202 09:44:38.155784 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-sqn2b" podStartSLOduration=4.155762162 podStartE2EDuration="4.155762162s" podCreationTimestamp="2025-12-02 09:44:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:44:38.149206008 +0000 UTC m=+1440.973079887" watchObservedRunningTime="2025-12-02 09:44:38.155762162 +0000 UTC m=+1440.979636051" Dec 02 09:44:38 crc kubenswrapper[4781]: I1202 09:44:38.178690 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-67cff9c6-hg2l6" podStartSLOduration=41.777087176 podStartE2EDuration="45.178670482s" podCreationTimestamp="2025-12-02 09:43:53 +0000 UTC" firstStartedPulling="2025-12-02 09:44:33.376427369 +0000 UTC m=+1436.200301248" lastFinishedPulling="2025-12-02 09:44:36.778010675 +0000 UTC m=+1439.601884554" observedRunningTime="2025-12-02 09:44:38.170471963 +0000 UTC m=+1440.994345852" watchObservedRunningTime="2025-12-02 09:44:38.178670482 +0000 UTC m=+1441.002544361" Dec 02 09:44:39 crc kubenswrapper[4781]: I1202 09:44:39.136800 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c98b4bc6-c086-4663-b794-92a36b0da2ba","Type":"ContainerStarted","Data":"d23653b0777bfb8823014de1063f3473f00585ce26d88d96ba403826a2ce3ec5"} Dec 02 09:44:40 crc kubenswrapper[4781]: I1202 09:44:40.476620 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qmr7s"] Dec 02 09:44:40 crc kubenswrapper[4781]: E1202 09:44:40.477856 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6efb5424-156a-4d85-865b-f057a1bdf098" containerName="init" Dec 02 09:44:40 crc kubenswrapper[4781]: I1202 09:44:40.477886 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="6efb5424-156a-4d85-865b-f057a1bdf098" containerName="init" Dec 02 09:44:40 crc kubenswrapper[4781]: E1202 09:44:40.478014 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6efb5424-156a-4d85-865b-f057a1bdf098" containerName="dnsmasq-dns" Dec 02 09:44:40 crc kubenswrapper[4781]: I1202 09:44:40.478025 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="6efb5424-156a-4d85-865b-f057a1bdf098" containerName="dnsmasq-dns" Dec 02 09:44:40 crc kubenswrapper[4781]: I1202 09:44:40.478468 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="6efb5424-156a-4d85-865b-f057a1bdf098" containerName="dnsmasq-dns" Dec 02 09:44:40 crc kubenswrapper[4781]: I1202 09:44:40.480721 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qmr7s" Dec 02 09:44:40 crc kubenswrapper[4781]: I1202 09:44:40.488590 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qmr7s"] Dec 02 09:44:40 crc kubenswrapper[4781]: I1202 09:44:40.588542 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1554dce-519d-4004-a1a3-a1c9072dc609-catalog-content\") pod \"community-operators-qmr7s\" (UID: \"c1554dce-519d-4004-a1a3-a1c9072dc609\") " pod="openshift-marketplace/community-operators-qmr7s" Dec 02 09:44:40 crc kubenswrapper[4781]: I1202 09:44:40.588781 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1554dce-519d-4004-a1a3-a1c9072dc609-utilities\") pod \"community-operators-qmr7s\" (UID: \"c1554dce-519d-4004-a1a3-a1c9072dc609\") " pod="openshift-marketplace/community-operators-qmr7s" Dec 02 09:44:40 crc kubenswrapper[4781]: I1202 09:44:40.588809 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vwxm\" (UniqueName: \"kubernetes.io/projected/c1554dce-519d-4004-a1a3-a1c9072dc609-kube-api-access-7vwxm\") pod \"community-operators-qmr7s\" (UID: \"c1554dce-519d-4004-a1a3-a1c9072dc609\") " pod="openshift-marketplace/community-operators-qmr7s" Dec 02 09:44:40 crc kubenswrapper[4781]: I1202 09:44:40.690470 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1554dce-519d-4004-a1a3-a1c9072dc609-catalog-content\") pod \"community-operators-qmr7s\" (UID: \"c1554dce-519d-4004-a1a3-a1c9072dc609\") " pod="openshift-marketplace/community-operators-qmr7s" Dec 02 09:44:40 crc kubenswrapper[4781]: I1202 09:44:40.690529 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1554dce-519d-4004-a1a3-a1c9072dc609-utilities\") pod \"community-operators-qmr7s\" (UID: \"c1554dce-519d-4004-a1a3-a1c9072dc609\") " pod="openshift-marketplace/community-operators-qmr7s" Dec 02 09:44:40 crc kubenswrapper[4781]: I1202 09:44:40.690559 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vwxm\" (UniqueName: \"kubernetes.io/projected/c1554dce-519d-4004-a1a3-a1c9072dc609-kube-api-access-7vwxm\") pod \"community-operators-qmr7s\" (UID: \"c1554dce-519d-4004-a1a3-a1c9072dc609\") " pod="openshift-marketplace/community-operators-qmr7s" Dec 02 09:44:40 crc kubenswrapper[4781]: I1202 09:44:40.691455 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1554dce-519d-4004-a1a3-a1c9072dc609-catalog-content\") pod \"community-operators-qmr7s\" (UID: \"c1554dce-519d-4004-a1a3-a1c9072dc609\") " pod="openshift-marketplace/community-operators-qmr7s" Dec 02 09:44:40 crc kubenswrapper[4781]: I1202 09:44:40.691767 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1554dce-519d-4004-a1a3-a1c9072dc609-utilities\") pod \"community-operators-qmr7s\" (UID: \"c1554dce-519d-4004-a1a3-a1c9072dc609\") " pod="openshift-marketplace/community-operators-qmr7s" Dec 02 09:44:40 crc kubenswrapper[4781]: I1202 09:44:40.720860 4781 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-7vwxm\" (UniqueName: \"kubernetes.io/projected/c1554dce-519d-4004-a1a3-a1c9072dc609-kube-api-access-7vwxm\") pod \"community-operators-qmr7s\" (UID: \"c1554dce-519d-4004-a1a3-a1c9072dc609\") " pod="openshift-marketplace/community-operators-qmr7s" Dec 02 09:44:40 crc kubenswrapper[4781]: I1202 09:44:40.812020 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qmr7s" Dec 02 09:44:41 crc kubenswrapper[4781]: I1202 09:44:41.191573 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-879pw" podUID="6efb5424-156a-4d85-865b-f057a1bdf098" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.114:5353: i/o timeout" Dec 02 09:44:41 crc kubenswrapper[4781]: I1202 09:44:41.428478 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qmr7s"] Dec 02 09:44:43 crc kubenswrapper[4781]: I1202 09:44:43.989700 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-67cff9c6-hg2l6" Dec 02 09:44:43 crc kubenswrapper[4781]: I1202 09:44:43.990352 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-67cff9c6-hg2l6" Dec 02 09:44:44 crc kubenswrapper[4781]: I1202 09:44:44.064553 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6856678494-4cprv" Dec 02 09:44:44 crc kubenswrapper[4781]: I1202 09:44:44.064613 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6856678494-4cprv" Dec 02 09:44:44 crc kubenswrapper[4781]: I1202 09:44:44.179140 4781 generic.go:334] "Generic (PLEG): container finished" podID="ea3e7049-869e-4822-a11e-9eb2df6e3eeb" containerID="4edc7af81473d46dc2e001614f6757ba9772e247b0e410a508c19e71f50f260e" exitCode=0 Dec 02 09:44:44 crc kubenswrapper[4781]: I1202 09:44:44.179196 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-t6bqc" event={"ID":"ea3e7049-869e-4822-a11e-9eb2df6e3eeb","Type":"ContainerDied","Data":"4edc7af81473d46dc2e001614f6757ba9772e247b0e410a508c19e71f50f260e"} Dec 02 09:44:44 crc kubenswrapper[4781]: I1202 09:44:44.180892 4781 generic.go:334] "Generic (PLEG): container finished" podID="855c82ea-ccc1-4981-b82d-b2d9aac387a6" containerID="c3513b69cc4c77c6c72076cad2760bf16cc2fb9525a398d12abefcdfdfd44aa7" exitCode=0 Dec 02 09:44:44 crc kubenswrapper[4781]: I1202 09:44:44.180916 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sqn2b" event={"ID":"855c82ea-ccc1-4981-b82d-b2d9aac387a6","Type":"ContainerDied","Data":"c3513b69cc4c77c6c72076cad2760bf16cc2fb9525a398d12abefcdfdfd44aa7"} Dec 02 09:44:45 crc kubenswrapper[4781]: I1202 09:44:45.189797 4781 generic.go:334] "Generic (PLEG): container finished" podID="af2e4c8d-f431-487f-8b70-a2b6e6ee6000" containerID="544d424dc3bd43c5b3a55a41876f880fe1ebd6d93cf6b42073aad9a19a8c465a" exitCode=0 Dec 02 09:44:45 crc kubenswrapper[4781]: I1202 09:44:45.189938 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-m7kpn" event={"ID":"af2e4c8d-f431-487f-8b70-a2b6e6ee6000","Type":"ContainerDied","Data":"544d424dc3bd43c5b3a55a41876f880fe1ebd6d93cf6b42073aad9a19a8c465a"} Dec 02 09:44:45 crc kubenswrapper[4781]: I1202 09:44:45.590302 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-sqn2b" Dec 02 09:44:45 crc kubenswrapper[4781]: I1202 09:44:45.679080 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/855c82ea-ccc1-4981-b82d-b2d9aac387a6-config-data\") pod \"855c82ea-ccc1-4981-b82d-b2d9aac387a6\" (UID: \"855c82ea-ccc1-4981-b82d-b2d9aac387a6\") " Dec 02 09:44:45 crc kubenswrapper[4781]: I1202 09:44:45.679136 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/855c82ea-ccc1-4981-b82d-b2d9aac387a6-combined-ca-bundle\") pod \"855c82ea-ccc1-4981-b82d-b2d9aac387a6\" (UID: \"855c82ea-ccc1-4981-b82d-b2d9aac387a6\") " Dec 02 09:44:45 crc kubenswrapper[4781]: I1202 09:44:45.679166 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69b6s\" (UniqueName: \"kubernetes.io/projected/855c82ea-ccc1-4981-b82d-b2d9aac387a6-kube-api-access-69b6s\") pod \"855c82ea-ccc1-4981-b82d-b2d9aac387a6\" (UID: \"855c82ea-ccc1-4981-b82d-b2d9aac387a6\") " Dec 02 09:44:45 crc kubenswrapper[4781]: I1202 09:44:45.679213 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/855c82ea-ccc1-4981-b82d-b2d9aac387a6-scripts\") pod \"855c82ea-ccc1-4981-b82d-b2d9aac387a6\" (UID: \"855c82ea-ccc1-4981-b82d-b2d9aac387a6\") " Dec 02 09:44:45 crc kubenswrapper[4781]: I1202 09:44:45.679352 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/855c82ea-ccc1-4981-b82d-b2d9aac387a6-fernet-keys\") pod \"855c82ea-ccc1-4981-b82d-b2d9aac387a6\" (UID: \"855c82ea-ccc1-4981-b82d-b2d9aac387a6\") " Dec 02 09:44:45 crc kubenswrapper[4781]: I1202 09:44:45.679387 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/855c82ea-ccc1-4981-b82d-b2d9aac387a6-credential-keys\") pod \"855c82ea-ccc1-4981-b82d-b2d9aac387a6\" (UID: \"855c82ea-ccc1-4981-b82d-b2d9aac387a6\") " Dec 02 09:44:45 crc kubenswrapper[4781]: I1202 09:44:45.684906 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/855c82ea-ccc1-4981-b82d-b2d9aac387a6-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "855c82ea-ccc1-4981-b82d-b2d9aac387a6" (UID: "855c82ea-ccc1-4981-b82d-b2d9aac387a6"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:44:45 crc kubenswrapper[4781]: I1202 09:44:45.686379 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/855c82ea-ccc1-4981-b82d-b2d9aac387a6-scripts" (OuterVolumeSpecName: "scripts") pod "855c82ea-ccc1-4981-b82d-b2d9aac387a6" (UID: "855c82ea-ccc1-4981-b82d-b2d9aac387a6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:44:45 crc kubenswrapper[4781]: I1202 09:44:45.688173 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/855c82ea-ccc1-4981-b82d-b2d9aac387a6-kube-api-access-69b6s" (OuterVolumeSpecName: "kube-api-access-69b6s") pod "855c82ea-ccc1-4981-b82d-b2d9aac387a6" (UID: "855c82ea-ccc1-4981-b82d-b2d9aac387a6"). InnerVolumeSpecName "kube-api-access-69b6s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:44:45 crc kubenswrapper[4781]: I1202 09:44:45.694081 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/855c82ea-ccc1-4981-b82d-b2d9aac387a6-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "855c82ea-ccc1-4981-b82d-b2d9aac387a6" (UID: "855c82ea-ccc1-4981-b82d-b2d9aac387a6"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:44:45 crc kubenswrapper[4781]: I1202 09:44:45.704726 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/855c82ea-ccc1-4981-b82d-b2d9aac387a6-config-data" (OuterVolumeSpecName: "config-data") pod "855c82ea-ccc1-4981-b82d-b2d9aac387a6" (UID: "855c82ea-ccc1-4981-b82d-b2d9aac387a6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:44:45 crc kubenswrapper[4781]: I1202 09:44:45.713195 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/855c82ea-ccc1-4981-b82d-b2d9aac387a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "855c82ea-ccc1-4981-b82d-b2d9aac387a6" (UID: "855c82ea-ccc1-4981-b82d-b2d9aac387a6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:44:45 crc kubenswrapper[4781]: I1202 09:44:45.785664 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/855c82ea-ccc1-4981-b82d-b2d9aac387a6-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:44:45 crc kubenswrapper[4781]: I1202 09:44:45.785740 4781 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/855c82ea-ccc1-4981-b82d-b2d9aac387a6-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 02 09:44:45 crc kubenswrapper[4781]: I1202 09:44:45.785753 4781 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/855c82ea-ccc1-4981-b82d-b2d9aac387a6-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 02 09:44:45 crc kubenswrapper[4781]: I1202 09:44:45.785765 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/855c82ea-ccc1-4981-b82d-b2d9aac387a6-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:44:45 crc kubenswrapper[4781]: I1202 09:44:45.785775 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/855c82ea-ccc1-4981-b82d-b2d9aac387a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:44:45 crc kubenswrapper[4781]: I1202 09:44:45.785809 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69b6s\" (UniqueName: \"kubernetes.io/projected/855c82ea-ccc1-4981-b82d-b2d9aac387a6-kube-api-access-69b6s\") on node \"crc\" DevicePath \"\"" Dec 02 09:44:45 crc kubenswrapper[4781]: I1202 09:44:45.834167 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-t6bqc" Dec 02 09:44:45 crc kubenswrapper[4781]: I1202 09:44:45.989207 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zflkj\" (UniqueName: \"kubernetes.io/projected/ea3e7049-869e-4822-a11e-9eb2df6e3eeb-kube-api-access-zflkj\") pod \"ea3e7049-869e-4822-a11e-9eb2df6e3eeb\" (UID: \"ea3e7049-869e-4822-a11e-9eb2df6e3eeb\") " Dec 02 09:44:45 crc kubenswrapper[4781]: I1202 09:44:45.989535 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea3e7049-869e-4822-a11e-9eb2df6e3eeb-combined-ca-bundle\") pod \"ea3e7049-869e-4822-a11e-9eb2df6e3eeb\" (UID: \"ea3e7049-869e-4822-a11e-9eb2df6e3eeb\") " Dec 02 09:44:45 crc kubenswrapper[4781]: I1202 09:44:45.989634 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ea3e7049-869e-4822-a11e-9eb2df6e3eeb-db-sync-config-data\") pod \"ea3e7049-869e-4822-a11e-9eb2df6e3eeb\" (UID: \"ea3e7049-869e-4822-a11e-9eb2df6e3eeb\") " Dec 02 09:44:45 crc kubenswrapper[4781]: I1202 09:44:45.989766 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea3e7049-869e-4822-a11e-9eb2df6e3eeb-config-data\") pod \"ea3e7049-869e-4822-a11e-9eb2df6e3eeb\" (UID: \"ea3e7049-869e-4822-a11e-9eb2df6e3eeb\") " Dec 02 09:44:45 crc kubenswrapper[4781]: I1202 09:44:45.996556 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea3e7049-869e-4822-a11e-9eb2df6e3eeb-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ea3e7049-869e-4822-a11e-9eb2df6e3eeb" (UID: "ea3e7049-869e-4822-a11e-9eb2df6e3eeb"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:44:45 crc kubenswrapper[4781]: I1202 09:44:45.997118 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea3e7049-869e-4822-a11e-9eb2df6e3eeb-kube-api-access-zflkj" (OuterVolumeSpecName: "kube-api-access-zflkj") pod "ea3e7049-869e-4822-a11e-9eb2df6e3eeb" (UID: "ea3e7049-869e-4822-a11e-9eb2df6e3eeb"). InnerVolumeSpecName "kube-api-access-zflkj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.017684 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea3e7049-869e-4822-a11e-9eb2df6e3eeb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ea3e7049-869e-4822-a11e-9eb2df6e3eeb" (UID: "ea3e7049-869e-4822-a11e-9eb2df6e3eeb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.035805 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea3e7049-869e-4822-a11e-9eb2df6e3eeb-config-data" (OuterVolumeSpecName: "config-data") pod "ea3e7049-869e-4822-a11e-9eb2df6e3eeb" (UID: "ea3e7049-869e-4822-a11e-9eb2df6e3eeb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.091644 4781 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ea3e7049-869e-4822-a11e-9eb2df6e3eeb-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.091681 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea3e7049-869e-4822-a11e-9eb2df6e3eeb-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.091693 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zflkj\" (UniqueName: \"kubernetes.io/projected/ea3e7049-869e-4822-a11e-9eb2df6e3eeb-kube-api-access-zflkj\") on node \"crc\" DevicePath \"\"" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.091701 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea3e7049-869e-4822-a11e-9eb2df6e3eeb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.202249 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-t6bqc" event={"ID":"ea3e7049-869e-4822-a11e-9eb2df6e3eeb","Type":"ContainerDied","Data":"28468699047a557e85e4629b429c8ba7d23c6ed5f5533938aeea874b6c823b11"} Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.203276 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28468699047a557e85e4629b429c8ba7d23c6ed5f5533938aeea874b6c823b11" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.203448 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-t6bqc" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.209919 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c98b4bc6-c086-4663-b794-92a36b0da2ba","Type":"ContainerStarted","Data":"0835ce989dbc0e513f8ecff2e64616304cb06d329a9b73a5d2ae1312c75e2ac7"} Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.211947 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-sqn2b" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.211973 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sqn2b" event={"ID":"855c82ea-ccc1-4981-b82d-b2d9aac387a6","Type":"ContainerDied","Data":"4646deb2fd10ccd1e2e86ecc0b17eba4d0469f3a498b8e93a1945d4caf5baea3"} Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.212030 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4646deb2fd10ccd1e2e86ecc0b17eba4d0469f3a498b8e93a1945d4caf5baea3" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.215204 4781 generic.go:334] "Generic (PLEG): container finished" podID="c1554dce-519d-4004-a1a3-a1c9072dc609" containerID="e55dc565488cf739e9cfcd3e7447e6185be476f3bb9dea571d0363f6e56bfaa6" exitCode=0 Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.215281 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qmr7s" event={"ID":"c1554dce-519d-4004-a1a3-a1c9072dc609","Type":"ContainerDied","Data":"e55dc565488cf739e9cfcd3e7447e6185be476f3bb9dea571d0363f6e56bfaa6"} Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.215315 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qmr7s" event={"ID":"c1554dce-519d-4004-a1a3-a1c9072dc609","Type":"ContainerStarted","Data":"e5b8a995cd9245d1b0c8a13b50ccc04f956126416f4da3c8c7ea35ddb424593f"} Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.394479 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cbfc4ddfb-kljg5"] Dec 02 09:44:46 crc kubenswrapper[4781]: E1202 09:44:46.394889 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea3e7049-869e-4822-a11e-9eb2df6e3eeb" containerName="glance-db-sync" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.394957 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea3e7049-869e-4822-a11e-9eb2df6e3eeb" containerName="glance-db-sync" Dec 02 09:44:46 crc kubenswrapper[4781]: E1202 09:44:46.394992 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="855c82ea-ccc1-4981-b82d-b2d9aac387a6" containerName="keystone-bootstrap" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.395000 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="855c82ea-ccc1-4981-b82d-b2d9aac387a6" containerName="keystone-bootstrap" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.395177 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="855c82ea-ccc1-4981-b82d-b2d9aac387a6" containerName="keystone-bootstrap" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.395193 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea3e7049-869e-4822-a11e-9eb2df6e3eeb" containerName="glance-db-sync" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.395696 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cbfc4ddfb-kljg5" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.398214 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.404610 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.405053 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.405241 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.405479 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.405673 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-q5lzd" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.414244 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cbfc4ddfb-kljg5"] Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.502478 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2p6m\" (UniqueName: \"kubernetes.io/projected/e2f1c0db-2cf8-4e49-b1cf-8cb27f997927-kube-api-access-l2p6m\") pod \"keystone-cbfc4ddfb-kljg5\" (UID: \"e2f1c0db-2cf8-4e49-b1cf-8cb27f997927\") " pod="openstack/keystone-cbfc4ddfb-kljg5" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.502814 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e2f1c0db-2cf8-4e49-b1cf-8cb27f997927-credential-keys\") pod \"keystone-cbfc4ddfb-kljg5\" (UID: \"e2f1c0db-2cf8-4e49-b1cf-8cb27f997927\") " pod="openstack/keystone-cbfc4ddfb-kljg5" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.502859 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2f1c0db-2cf8-4e49-b1cf-8cb27f997927-scripts\") pod \"keystone-cbfc4ddfb-kljg5\" (UID: \"e2f1c0db-2cf8-4e49-b1cf-8cb27f997927\") " pod="openstack/keystone-cbfc4ddfb-kljg5" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.502878 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2f1c0db-2cf8-4e49-b1cf-8cb27f997927-config-data\") pod \"keystone-cbfc4ddfb-kljg5\" (UID: \"e2f1c0db-2cf8-4e49-b1cf-8cb27f997927\") " pod="openstack/keystone-cbfc4ddfb-kljg5" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.502942 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e2f1c0db-2cf8-4e49-b1cf-8cb27f997927-fernet-keys\") pod \"keystone-cbfc4ddfb-kljg5\" (UID: \"e2f1c0db-2cf8-4e49-b1cf-8cb27f997927\") " pod="openstack/keystone-cbfc4ddfb-kljg5" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.502968 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f1c0db-2cf8-4e49-b1cf-8cb27f997927-combined-ca-bundle\") pod \"keystone-cbfc4ddfb-kljg5\" (UID: 
\"e2f1c0db-2cf8-4e49-b1cf-8cb27f997927\") " pod="openstack/keystone-cbfc4ddfb-kljg5" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.503013 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2f1c0db-2cf8-4e49-b1cf-8cb27f997927-public-tls-certs\") pod \"keystone-cbfc4ddfb-kljg5\" (UID: \"e2f1c0db-2cf8-4e49-b1cf-8cb27f997927\") " pod="openstack/keystone-cbfc4ddfb-kljg5" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.503054 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2f1c0db-2cf8-4e49-b1cf-8cb27f997927-internal-tls-certs\") pod \"keystone-cbfc4ddfb-kljg5\" (UID: \"e2f1c0db-2cf8-4e49-b1cf-8cb27f997927\") " pod="openstack/keystone-cbfc4ddfb-kljg5" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.606773 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2p6m\" (UniqueName: \"kubernetes.io/projected/e2f1c0db-2cf8-4e49-b1cf-8cb27f997927-kube-api-access-l2p6m\") pod \"keystone-cbfc4ddfb-kljg5\" (UID: \"e2f1c0db-2cf8-4e49-b1cf-8cb27f997927\") " pod="openstack/keystone-cbfc4ddfb-kljg5" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.606823 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e2f1c0db-2cf8-4e49-b1cf-8cb27f997927-credential-keys\") pod \"keystone-cbfc4ddfb-kljg5\" (UID: \"e2f1c0db-2cf8-4e49-b1cf-8cb27f997927\") " pod="openstack/keystone-cbfc4ddfb-kljg5" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.606868 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2f1c0db-2cf8-4e49-b1cf-8cb27f997927-scripts\") pod \"keystone-cbfc4ddfb-kljg5\" (UID: \"e2f1c0db-2cf8-4e49-b1cf-8cb27f997927\") " pod="openstack/keystone-cbfc4ddfb-kljg5" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.606881 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2f1c0db-2cf8-4e49-b1cf-8cb27f997927-config-data\") pod \"keystone-cbfc4ddfb-kljg5\" (UID: \"e2f1c0db-2cf8-4e49-b1cf-8cb27f997927\") " pod="openstack/keystone-cbfc4ddfb-kljg5" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.606957 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e2f1c0db-2cf8-4e49-b1cf-8cb27f997927-fernet-keys\") pod \"keystone-cbfc4ddfb-kljg5\" (UID: \"e2f1c0db-2cf8-4e49-b1cf-8cb27f997927\") " pod="openstack/keystone-cbfc4ddfb-kljg5" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.606978 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f1c0db-2cf8-4e49-b1cf-8cb27f997927-combined-ca-bundle\") pod \"keystone-cbfc4ddfb-kljg5\" (UID: \"e2f1c0db-2cf8-4e49-b1cf-8cb27f997927\") " pod="openstack/keystone-cbfc4ddfb-kljg5" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.607009 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2f1c0db-2cf8-4e49-b1cf-8cb27f997927-public-tls-certs\") pod \"keystone-cbfc4ddfb-kljg5\" (UID: \"e2f1c0db-2cf8-4e49-b1cf-8cb27f997927\") " pod="openstack/keystone-cbfc4ddfb-kljg5" Dec 02 
09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.607035 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2f1c0db-2cf8-4e49-b1cf-8cb27f997927-internal-tls-certs\") pod \"keystone-cbfc4ddfb-kljg5\" (UID: \"e2f1c0db-2cf8-4e49-b1cf-8cb27f997927\") " pod="openstack/keystone-cbfc4ddfb-kljg5" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.613636 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2f1c0db-2cf8-4e49-b1cf-8cb27f997927-internal-tls-certs\") pod \"keystone-cbfc4ddfb-kljg5\" (UID: \"e2f1c0db-2cf8-4e49-b1cf-8cb27f997927\") " pod="openstack/keystone-cbfc4ddfb-kljg5" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.625702 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2f1c0db-2cf8-4e49-b1cf-8cb27f997927-config-data\") pod \"keystone-cbfc4ddfb-kljg5\" (UID: \"e2f1c0db-2cf8-4e49-b1cf-8cb27f997927\") " pod="openstack/keystone-cbfc4ddfb-kljg5" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.633449 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e2f1c0db-2cf8-4e49-b1cf-8cb27f997927-credential-keys\") pod \"keystone-cbfc4ddfb-kljg5\" (UID: \"e2f1c0db-2cf8-4e49-b1cf-8cb27f997927\") " pod="openstack/keystone-cbfc4ddfb-kljg5" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.634587 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2f1c0db-2cf8-4e49-b1cf-8cb27f997927-public-tls-certs\") pod \"keystone-cbfc4ddfb-kljg5\" (UID: \"e2f1c0db-2cf8-4e49-b1cf-8cb27f997927\") " pod="openstack/keystone-cbfc4ddfb-kljg5" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.635444 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2f1c0db-2cf8-4e49-b1cf-8cb27f997927-scripts\") pod \"keystone-cbfc4ddfb-kljg5\" (UID: \"e2f1c0db-2cf8-4e49-b1cf-8cb27f997927\") " pod="openstack/keystone-cbfc4ddfb-kljg5" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.640561 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e2f1c0db-2cf8-4e49-b1cf-8cb27f997927-fernet-keys\") pod \"keystone-cbfc4ddfb-kljg5\" (UID: \"e2f1c0db-2cf8-4e49-b1cf-8cb27f997927\") " pod="openstack/keystone-cbfc4ddfb-kljg5" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.643686 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f1c0db-2cf8-4e49-b1cf-8cb27f997927-combined-ca-bundle\") pod \"keystone-cbfc4ddfb-kljg5\" (UID: \"e2f1c0db-2cf8-4e49-b1cf-8cb27f997927\") " pod="openstack/keystone-cbfc4ddfb-kljg5" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.712276 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2p6m\" (UniqueName: \"kubernetes.io/projected/e2f1c0db-2cf8-4e49-b1cf-8cb27f997927-kube-api-access-l2p6m\") pod \"keystone-cbfc4ddfb-kljg5\" (UID: \"e2f1c0db-2cf8-4e49-b1cf-8cb27f997927\") " pod="openstack/keystone-cbfc4ddfb-kljg5" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.731404 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cbfc4ddfb-kljg5" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.829940 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-m7kpn" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.850669 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-2wf99"] Dec 02 09:44:46 crc kubenswrapper[4781]: E1202 09:44:46.851092 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af2e4c8d-f431-487f-8b70-a2b6e6ee6000" containerName="placement-db-sync" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.851105 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="af2e4c8d-f431-487f-8b70-a2b6e6ee6000" containerName="placement-db-sync" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.851277 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="af2e4c8d-f431-487f-8b70-a2b6e6ee6000" containerName="placement-db-sync" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.852173 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-2wf99" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.865936 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-2wf99"] Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.914417 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af2e4c8d-f431-487f-8b70-a2b6e6ee6000-config-data\") pod \"af2e4c8d-f431-487f-8b70-a2b6e6ee6000\" (UID: \"af2e4c8d-f431-487f-8b70-a2b6e6ee6000\") " Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.914490 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af2e4c8d-f431-487f-8b70-a2b6e6ee6000-combined-ca-bundle\") pod \"af2e4c8d-f431-487f-8b70-a2b6e6ee6000\" (UID: \"af2e4c8d-f431-487f-8b70-a2b6e6ee6000\") " Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.914511 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af2e4c8d-f431-487f-8b70-a2b6e6ee6000-scripts\") pod \"af2e4c8d-f431-487f-8b70-a2b6e6ee6000\" (UID: \"af2e4c8d-f431-487f-8b70-a2b6e6ee6000\") " Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.914680 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4hqp\" (UniqueName: \"kubernetes.io/projected/af2e4c8d-f431-487f-8b70-a2b6e6ee6000-kube-api-access-j4hqp\") pod \"af2e4c8d-f431-487f-8b70-a2b6e6ee6000\" (UID: \"af2e4c8d-f431-487f-8b70-a2b6e6ee6000\") " Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.914723 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af2e4c8d-f431-487f-8b70-a2b6e6ee6000-logs\") pod \"af2e4c8d-f431-487f-8b70-a2b6e6ee6000\" (UID: \"af2e4c8d-f431-487f-8b70-a2b6e6ee6000\") " Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.915824 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af2e4c8d-f431-487f-8b70-a2b6e6ee6000-logs" (OuterVolumeSpecName: "logs") pod "af2e4c8d-f431-487f-8b70-a2b6e6ee6000" (UID: "af2e4c8d-f431-487f-8b70-a2b6e6ee6000"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.927477 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af2e4c8d-f431-487f-8b70-a2b6e6ee6000-kube-api-access-j4hqp" (OuterVolumeSpecName: "kube-api-access-j4hqp") pod "af2e4c8d-f431-487f-8b70-a2b6e6ee6000" (UID: "af2e4c8d-f431-487f-8b70-a2b6e6ee6000"). InnerVolumeSpecName "kube-api-access-j4hqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.939112 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af2e4c8d-f431-487f-8b70-a2b6e6ee6000-scripts" (OuterVolumeSpecName: "scripts") pod "af2e4c8d-f431-487f-8b70-a2b6e6ee6000" (UID: "af2e4c8d-f431-487f-8b70-a2b6e6ee6000"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:44:46 crc kubenswrapper[4781]: I1202 09:44:46.996964 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af2e4c8d-f431-487f-8b70-a2b6e6ee6000-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af2e4c8d-f431-487f-8b70-a2b6e6ee6000" (UID: "af2e4c8d-f431-487f-8b70-a2b6e6ee6000"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.003990 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af2e4c8d-f431-487f-8b70-a2b6e6ee6000-config-data" (OuterVolumeSpecName: "config-data") pod "af2e4c8d-f431-487f-8b70-a2b6e6ee6000" (UID: "af2e4c8d-f431-487f-8b70-a2b6e6ee6000"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.019239 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74200d1e-84a7-4f23-b858-e24dcb955dbb-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-2wf99\" (UID: \"74200d1e-84a7-4f23-b858-e24dcb955dbb\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2wf99" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.019349 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74200d1e-84a7-4f23-b858-e24dcb955dbb-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-2wf99\" (UID: \"74200d1e-84a7-4f23-b858-e24dcb955dbb\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2wf99" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.019386 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74200d1e-84a7-4f23-b858-e24dcb955dbb-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-2wf99\" (UID: \"74200d1e-84a7-4f23-b858-e24dcb955dbb\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2wf99" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.019457 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlfcs\" (UniqueName: \"kubernetes.io/projected/74200d1e-84a7-4f23-b858-e24dcb955dbb-kube-api-access-wlfcs\") pod \"dnsmasq-dns-785d8bcb8c-2wf99\" (UID: \"74200d1e-84a7-4f23-b858-e24dcb955dbb\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2wf99" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.019486 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/74200d1e-84a7-4f23-b858-e24dcb955dbb-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-2wf99\" (UID: \"74200d1e-84a7-4f23-b858-e24dcb955dbb\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2wf99" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.019543 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74200d1e-84a7-4f23-b858-e24dcb955dbb-config\") pod \"dnsmasq-dns-785d8bcb8c-2wf99\" (UID: \"74200d1e-84a7-4f23-b858-e24dcb955dbb\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2wf99" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.019610 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af2e4c8d-f431-487f-8b70-a2b6e6ee6000-logs\") on node \"crc\" DevicePath \"\"" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.019626 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af2e4c8d-f431-487f-8b70-a2b6e6ee6000-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.019639 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af2e4c8d-f431-487f-8b70-a2b6e6ee6000-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.019650 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af2e4c8d-f431-487f-8b70-a2b6e6ee6000-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.019661 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4hqp\" (UniqueName: \"kubernetes.io/projected/af2e4c8d-f431-487f-8b70-a2b6e6ee6000-kube-api-access-j4hqp\") on node \"crc\" DevicePath \"\"" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.121172 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74200d1e-84a7-4f23-b858-e24dcb955dbb-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-2wf99\" (UID: \"74200d1e-84a7-4f23-b858-e24dcb955dbb\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2wf99" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.121243 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74200d1e-84a7-4f23-b858-e24dcb955dbb-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-2wf99\" (UID: \"74200d1e-84a7-4f23-b858-e24dcb955dbb\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2wf99" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.121330 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlfcs\" (UniqueName: \"kubernetes.io/projected/74200d1e-84a7-4f23-b858-e24dcb955dbb-kube-api-access-wlfcs\") pod \"dnsmasq-dns-785d8bcb8c-2wf99\" (UID: \"74200d1e-84a7-4f23-b858-e24dcb955dbb\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2wf99" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.121364 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/74200d1e-84a7-4f23-b858-e24dcb955dbb-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-2wf99\" (UID: \"74200d1e-84a7-4f23-b858-e24dcb955dbb\") " 
pod="openstack/dnsmasq-dns-785d8bcb8c-2wf99" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.121419 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74200d1e-84a7-4f23-b858-e24dcb955dbb-config\") pod \"dnsmasq-dns-785d8bcb8c-2wf99\" (UID: \"74200d1e-84a7-4f23-b858-e24dcb955dbb\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2wf99" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.121463 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74200d1e-84a7-4f23-b858-e24dcb955dbb-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-2wf99\" (UID: \"74200d1e-84a7-4f23-b858-e24dcb955dbb\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2wf99" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.122405 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74200d1e-84a7-4f23-b858-e24dcb955dbb-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-2wf99\" (UID: \"74200d1e-84a7-4f23-b858-e24dcb955dbb\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2wf99" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.122421 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74200d1e-84a7-4f23-b858-e24dcb955dbb-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-2wf99\" (UID: \"74200d1e-84a7-4f23-b858-e24dcb955dbb\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2wf99" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.122709 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74200d1e-84a7-4f23-b858-e24dcb955dbb-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-2wf99\" (UID: \"74200d1e-84a7-4f23-b858-e24dcb955dbb\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2wf99" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.123077 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/74200d1e-84a7-4f23-b858-e24dcb955dbb-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-2wf99\" (UID: \"74200d1e-84a7-4f23-b858-e24dcb955dbb\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2wf99" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.123452 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74200d1e-84a7-4f23-b858-e24dcb955dbb-config\") pod \"dnsmasq-dns-785d8bcb8c-2wf99\" (UID: \"74200d1e-84a7-4f23-b858-e24dcb955dbb\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2wf99" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.149546 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlfcs\" (UniqueName: \"kubernetes.io/projected/74200d1e-84a7-4f23-b858-e24dcb955dbb-kube-api-access-wlfcs\") pod \"dnsmasq-dns-785d8bcb8c-2wf99\" (UID: \"74200d1e-84a7-4f23-b858-e24dcb955dbb\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2wf99" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.177743 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-2wf99" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.279294 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-m7kpn" event={"ID":"af2e4c8d-f431-487f-8b70-a2b6e6ee6000","Type":"ContainerDied","Data":"ad2f700b34454b5aef5f81ba32cbc70d4ba40548ab70bb0da5e5c6f8388bfdbd"} Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.279348 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad2f700b34454b5aef5f81ba32cbc70d4ba40548ab70bb0da5e5c6f8388bfdbd" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.279456 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-m7kpn" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.350637 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-69459794d8-ph7dr"] Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.356259 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-69459794d8-ph7dr" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.361509 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-8p898" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.361808 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.362291 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.362476 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.364062 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.373407 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-69459794d8-ph7dr"] Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.427692 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abd38c16-6eab-4f3e-9c4d-294b240fa154-config-data\") pod \"placement-69459794d8-ph7dr\" (UID: \"abd38c16-6eab-4f3e-9c4d-294b240fa154\") " pod="openstack/placement-69459794d8-ph7dr" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.428061 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abd38c16-6eab-4f3e-9c4d-294b240fa154-logs\") pod \"placement-69459794d8-ph7dr\" (UID: \"abd38c16-6eab-4f3e-9c4d-294b240fa154\") " pod="openstack/placement-69459794d8-ph7dr" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.428113 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/abd38c16-6eab-4f3e-9c4d-294b240fa154-internal-tls-certs\") pod \"placement-69459794d8-ph7dr\" (UID: \"abd38c16-6eab-4f3e-9c4d-294b240fa154\") " pod="openstack/placement-69459794d8-ph7dr" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.428159 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/abd38c16-6eab-4f3e-9c4d-294b240fa154-public-tls-certs\") pod \"placement-69459794d8-ph7dr\" (UID: \"abd38c16-6eab-4f3e-9c4d-294b240fa154\") " pod="openstack/placement-69459794d8-ph7dr" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.430071 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abd38c16-6eab-4f3e-9c4d-294b240fa154-scripts\") pod \"placement-69459794d8-ph7dr\" (UID: \"abd38c16-6eab-4f3e-9c4d-294b240fa154\") " pod="openstack/placement-69459794d8-ph7dr" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.430129 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abd38c16-6eab-4f3e-9c4d-294b240fa154-combined-ca-bundle\") pod \"placement-69459794d8-ph7dr\" (UID: \"abd38c16-6eab-4f3e-9c4d-294b240fa154\") " pod="openstack/placement-69459794d8-ph7dr" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.430182 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6w2f\" (UniqueName: \"kubernetes.io/projected/abd38c16-6eab-4f3e-9c4d-294b240fa154-kube-api-access-d6w2f\") pod \"placement-69459794d8-ph7dr\" (UID: \"abd38c16-6eab-4f3e-9c4d-294b240fa154\") " pod="openstack/placement-69459794d8-ph7dr" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.497337 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cbfc4ddfb-kljg5"] Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.531203 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abd38c16-6eab-4f3e-9c4d-294b240fa154-scripts\") pod \"placement-69459794d8-ph7dr\" (UID: \"abd38c16-6eab-4f3e-9c4d-294b240fa154\") " pod="openstack/placement-69459794d8-ph7dr" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.531252 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abd38c16-6eab-4f3e-9c4d-294b240fa154-combined-ca-bundle\") pod \"placement-69459794d8-ph7dr\" (UID: \"abd38c16-6eab-4f3e-9c4d-294b240fa154\") " pod="openstack/placement-69459794d8-ph7dr" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.531283 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6w2f\" (UniqueName: \"kubernetes.io/projected/abd38c16-6eab-4f3e-9c4d-294b240fa154-kube-api-access-d6w2f\") pod \"placement-69459794d8-ph7dr\" (UID: \"abd38c16-6eab-4f3e-9c4d-294b240fa154\") " pod="openstack/placement-69459794d8-ph7dr" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.531303 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abd38c16-6eab-4f3e-9c4d-294b240fa154-config-data\") pod \"placement-69459794d8-ph7dr\" (UID: \"abd38c16-6eab-4f3e-9c4d-294b240fa154\") " pod="openstack/placement-69459794d8-ph7dr" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.531328 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abd38c16-6eab-4f3e-9c4d-294b240fa154-logs\") pod \"placement-69459794d8-ph7dr\" (UID: \"abd38c16-6eab-4f3e-9c4d-294b240fa154\") " pod="openstack/placement-69459794d8-ph7dr" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.531356 4781 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/abd38c16-6eab-4f3e-9c4d-294b240fa154-internal-tls-certs\") pod \"placement-69459794d8-ph7dr\" (UID: \"abd38c16-6eab-4f3e-9c4d-294b240fa154\") " pod="openstack/placement-69459794d8-ph7dr" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.531393 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/abd38c16-6eab-4f3e-9c4d-294b240fa154-public-tls-certs\") pod \"placement-69459794d8-ph7dr\" (UID: \"abd38c16-6eab-4f3e-9c4d-294b240fa154\") " pod="openstack/placement-69459794d8-ph7dr" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.543047 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/abd38c16-6eab-4f3e-9c4d-294b240fa154-public-tls-certs\") pod \"placement-69459794d8-ph7dr\" (UID: \"abd38c16-6eab-4f3e-9c4d-294b240fa154\") " pod="openstack/placement-69459794d8-ph7dr" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.545030 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abd38c16-6eab-4f3e-9c4d-294b240fa154-logs\") pod \"placement-69459794d8-ph7dr\" (UID: \"abd38c16-6eab-4f3e-9c4d-294b240fa154\") " pod="openstack/placement-69459794d8-ph7dr" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.559133 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abd38c16-6eab-4f3e-9c4d-294b240fa154-scripts\") pod \"placement-69459794d8-ph7dr\" (UID: \"abd38c16-6eab-4f3e-9c4d-294b240fa154\") " pod="openstack/placement-69459794d8-ph7dr" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.559351 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abd38c16-6eab-4f3e-9c4d-294b240fa154-config-data\") pod \"placement-69459794d8-ph7dr\" (UID: \"abd38c16-6eab-4f3e-9c4d-294b240fa154\") " pod="openstack/placement-69459794d8-ph7dr" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.559500 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abd38c16-6eab-4f3e-9c4d-294b240fa154-combined-ca-bundle\") pod \"placement-69459794d8-ph7dr\" (UID: \"abd38c16-6eab-4f3e-9c4d-294b240fa154\") " pod="openstack/placement-69459794d8-ph7dr" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.567586 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/abd38c16-6eab-4f3e-9c4d-294b240fa154-internal-tls-certs\") pod \"placement-69459794d8-ph7dr\" (UID: \"abd38c16-6eab-4f3e-9c4d-294b240fa154\") " pod="openstack/placement-69459794d8-ph7dr" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.574122 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6w2f\" (UniqueName: \"kubernetes.io/projected/abd38c16-6eab-4f3e-9c4d-294b240fa154-kube-api-access-d6w2f\") pod \"placement-69459794d8-ph7dr\" (UID: \"abd38c16-6eab-4f3e-9c4d-294b240fa154\") " pod="openstack/placement-69459794d8-ph7dr" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.591519 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.593877 
4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.598971 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-shwpc" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.599701 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.600019 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.615915 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.689518 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-69459794d8-ph7dr" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.735276 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b81fe2e7-695a-465c-b06f-aec5bd8f5a4c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b81fe2e7-695a-465c-b06f-aec5bd8f5a4c\") " pod="openstack/glance-default-external-api-0" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.735319 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b81fe2e7-695a-465c-b06f-aec5bd8f5a4c-logs\") pod \"glance-default-external-api-0\" (UID: \"b81fe2e7-695a-465c-b06f-aec5bd8f5a4c\") " pod="openstack/glance-default-external-api-0" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.735374 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b81fe2e7-695a-465c-b06f-aec5bd8f5a4c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b81fe2e7-695a-465c-b06f-aec5bd8f5a4c\") " pod="openstack/glance-default-external-api-0" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.735409 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t94m\" (UniqueName: \"kubernetes.io/projected/b81fe2e7-695a-465c-b06f-aec5bd8f5a4c-kube-api-access-4t94m\") pod \"glance-default-external-api-0\" (UID: \"b81fe2e7-695a-465c-b06f-aec5bd8f5a4c\") " pod="openstack/glance-default-external-api-0" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.735578 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b81fe2e7-695a-465c-b06f-aec5bd8f5a4c-config-data\") pod \"glance-default-external-api-0\" (UID: \"b81fe2e7-695a-465c-b06f-aec5bd8f5a4c\") " pod="openstack/glance-default-external-api-0" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.735799 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"b81fe2e7-695a-465c-b06f-aec5bd8f5a4c\") " pod="openstack/glance-default-external-api-0" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.735893 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b81fe2e7-695a-465c-b06f-aec5bd8f5a4c-scripts\") pod \"glance-default-external-api-0\" (UID: \"b81fe2e7-695a-465c-b06f-aec5bd8f5a4c\") " pod="openstack/glance-default-external-api-0" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.838912 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b81fe2e7-695a-465c-b06f-aec5bd8f5a4c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b81fe2e7-695a-465c-b06f-aec5bd8f5a4c\") " pod="openstack/glance-default-external-api-0" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.838970 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b81fe2e7-695a-465c-b06f-aec5bd8f5a4c-logs\") pod \"glance-default-external-api-0\" (UID: \"b81fe2e7-695a-465c-b06f-aec5bd8f5a4c\") " pod="openstack/glance-default-external-api-0" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.839016 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b81fe2e7-695a-465c-b06f-aec5bd8f5a4c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b81fe2e7-695a-465c-b06f-aec5bd8f5a4c\") " pod="openstack/glance-default-external-api-0" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.839043 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t94m\" (UniqueName: \"kubernetes.io/projected/b81fe2e7-695a-465c-b06f-aec5bd8f5a4c-kube-api-access-4t94m\") pod \"glance-default-external-api-0\" (UID: \"b81fe2e7-695a-465c-b06f-aec5bd8f5a4c\") " pod="openstack/glance-default-external-api-0" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.839083 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b81fe2e7-695a-465c-b06f-aec5bd8f5a4c-config-data\") pod \"glance-default-external-api-0\" (UID: \"b81fe2e7-695a-465c-b06f-aec5bd8f5a4c\") " pod="openstack/glance-default-external-api-0" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.839122 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"b81fe2e7-695a-465c-b06f-aec5bd8f5a4c\") " pod="openstack/glance-default-external-api-0" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.839154 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b81fe2e7-695a-465c-b06f-aec5bd8f5a4c-scripts\") pod \"glance-default-external-api-0\" (UID: \"b81fe2e7-695a-465c-b06f-aec5bd8f5a4c\") " pod="openstack/glance-default-external-api-0" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.840594 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b81fe2e7-695a-465c-b06f-aec5bd8f5a4c-logs\") pod \"glance-default-external-api-0\" (UID: \"b81fe2e7-695a-465c-b06f-aec5bd8f5a4c\") " pod="openstack/glance-default-external-api-0" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.840979 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: 
\"b81fe2e7-695a-465c-b06f-aec5bd8f5a4c\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.840990 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b81fe2e7-695a-465c-b06f-aec5bd8f5a4c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b81fe2e7-695a-465c-b06f-aec5bd8f5a4c\") " pod="openstack/glance-default-external-api-0" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.846666 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b81fe2e7-695a-465c-b06f-aec5bd8f5a4c-scripts\") pod \"glance-default-external-api-0\" (UID: \"b81fe2e7-695a-465c-b06f-aec5bd8f5a4c\") " pod="openstack/glance-default-external-api-0" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.847154 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b81fe2e7-695a-465c-b06f-aec5bd8f5a4c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b81fe2e7-695a-465c-b06f-aec5bd8f5a4c\") " pod="openstack/glance-default-external-api-0" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.848540 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b81fe2e7-695a-465c-b06f-aec5bd8f5a4c-config-data\") pod \"glance-default-external-api-0\" (UID: \"b81fe2e7-695a-465c-b06f-aec5bd8f5a4c\") " pod="openstack/glance-default-external-api-0" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.865204 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t94m\" (UniqueName: \"kubernetes.io/projected/b81fe2e7-695a-465c-b06f-aec5bd8f5a4c-kube-api-access-4t94m\") pod \"glance-default-external-api-0\" (UID: \"b81fe2e7-695a-465c-b06f-aec5bd8f5a4c\") " pod="openstack/glance-default-external-api-0" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.884710 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-2wf99"] Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.887096 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"b81fe2e7-695a-465c-b06f-aec5bd8f5a4c\") " pod="openstack/glance-default-external-api-0" Dec 02 09:44:47 crc kubenswrapper[4781]: I1202 09:44:47.992964 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 09:44:48 crc kubenswrapper[4781]: I1202 09:44:48.001905 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 09:44:48 crc kubenswrapper[4781]: I1202 09:44:48.005312 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 09:44:48 crc kubenswrapper[4781]: I1202 09:44:48.009491 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 02 09:44:48 crc kubenswrapper[4781]: I1202 09:44:48.043150 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2be1f200-d68d-47e5-889a-fea41af06a96-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2be1f200-d68d-47e5-889a-fea41af06a96\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:44:48 crc kubenswrapper[4781]: I1202 09:44:48.043297 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2be1f200-d68d-47e5-889a-fea41af06a96-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2be1f200-d68d-47e5-889a-fea41af06a96\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:44:48 crc kubenswrapper[4781]: I1202 09:44:48.043337 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2be1f200-d68d-47e5-889a-fea41af06a96-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2be1f200-d68d-47e5-889a-fea41af06a96\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:44:48 crc kubenswrapper[4781]: I1202 09:44:48.043352 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2be1f200-d68d-47e5-889a-fea41af06a96-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2be1f200-d68d-47e5-889a-fea41af06a96\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:44:48 crc kubenswrapper[4781]: I1202 09:44:48.043379 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2be1f200-d68d-47e5-889a-fea41af06a96-logs\") pod \"glance-default-internal-api-0\" (UID: \"2be1f200-d68d-47e5-889a-fea41af06a96\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:44:48 crc kubenswrapper[4781]: I1202 09:44:48.043410 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"2be1f200-d68d-47e5-889a-fea41af06a96\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:44:48 crc kubenswrapper[4781]: I1202 09:44:48.043446 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dk52\" (UniqueName: \"kubernetes.io/projected/2be1f200-d68d-47e5-889a-fea41af06a96-kube-api-access-9dk52\") pod \"glance-default-internal-api-0\" (UID: \"2be1f200-d68d-47e5-889a-fea41af06a96\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:44:48 crc kubenswrapper[4781]: I1202 09:44:48.089434 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 09:44:48 crc kubenswrapper[4781]: I1202 09:44:48.149112 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2be1f200-d68d-47e5-889a-fea41af06a96-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2be1f200-d68d-47e5-889a-fea41af06a96\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:44:48 crc kubenswrapper[4781]: I1202 09:44:48.152250 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2be1f200-d68d-47e5-889a-fea41af06a96-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2be1f200-d68d-47e5-889a-fea41af06a96\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:44:48 crc kubenswrapper[4781]: I1202 09:44:48.152284 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2be1f200-d68d-47e5-889a-fea41af06a96-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2be1f200-d68d-47e5-889a-fea41af06a96\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:44:48 crc kubenswrapper[4781]: I1202 09:44:48.152340 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2be1f200-d68d-47e5-889a-fea41af06a96-logs\") pod \"glance-default-internal-api-0\" (UID: \"2be1f200-d68d-47e5-889a-fea41af06a96\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:44:48 crc kubenswrapper[4781]: I1202 09:44:48.152402 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"2be1f200-d68d-47e5-889a-fea41af06a96\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:44:48 crc kubenswrapper[4781]: I1202 09:44:48.152469 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dk52\" (UniqueName: \"kubernetes.io/projected/2be1f200-d68d-47e5-889a-fea41af06a96-kube-api-access-9dk52\") pod \"glance-default-internal-api-0\" (UID: \"2be1f200-d68d-47e5-889a-fea41af06a96\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:44:48 crc kubenswrapper[4781]: I1202 09:44:48.152645 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2be1f200-d68d-47e5-889a-fea41af06a96-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2be1f200-d68d-47e5-889a-fea41af06a96\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:44:48 crc kubenswrapper[4781]: I1202 09:44:48.153159 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2be1f200-d68d-47e5-889a-fea41af06a96-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2be1f200-d68d-47e5-889a-fea41af06a96\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:44:48 crc kubenswrapper[4781]: I1202 09:44:48.153236 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"2be1f200-d68d-47e5-889a-fea41af06a96\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Dec 02 09:44:48 crc 
kubenswrapper[4781]: I1202 09:44:48.153388 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2be1f200-d68d-47e5-889a-fea41af06a96-logs\") pod \"glance-default-internal-api-0\" (UID: \"2be1f200-d68d-47e5-889a-fea41af06a96\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:44:48 crc kubenswrapper[4781]: I1202 09:44:48.155662 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2be1f200-d68d-47e5-889a-fea41af06a96-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2be1f200-d68d-47e5-889a-fea41af06a96\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:44:48 crc kubenswrapper[4781]: I1202 09:44:48.162370 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2be1f200-d68d-47e5-889a-fea41af06a96-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2be1f200-d68d-47e5-889a-fea41af06a96\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:44:48 crc kubenswrapper[4781]: I1202 09:44:48.172704 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2be1f200-d68d-47e5-889a-fea41af06a96-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2be1f200-d68d-47e5-889a-fea41af06a96\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:44:48 crc kubenswrapper[4781]: I1202 09:44:48.177451 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dk52\" (UniqueName: \"kubernetes.io/projected/2be1f200-d68d-47e5-889a-fea41af06a96-kube-api-access-9dk52\") pod \"glance-default-internal-api-0\" (UID: \"2be1f200-d68d-47e5-889a-fea41af06a96\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:44:48 crc kubenswrapper[4781]: I1202 09:44:48.183580 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"2be1f200-d68d-47e5-889a-fea41af06a96\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:44:48 crc kubenswrapper[4781]: I1202 09:44:48.310573 4781 generic.go:334] "Generic (PLEG): container finished" podID="74200d1e-84a7-4f23-b858-e24dcb955dbb" containerID="a6ebccae71e14ab79a993a9b2755042092cbcac23dfc21c29f989ad2a993c7fd" exitCode=0 Dec 02 09:44:48 crc kubenswrapper[4781]: I1202 09:44:48.310675 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-2wf99" event={"ID":"74200d1e-84a7-4f23-b858-e24dcb955dbb","Type":"ContainerDied","Data":"a6ebccae71e14ab79a993a9b2755042092cbcac23dfc21c29f989ad2a993c7fd"} Dec 02 09:44:48 crc kubenswrapper[4781]: I1202 09:44:48.311251 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-2wf99" event={"ID":"74200d1e-84a7-4f23-b858-e24dcb955dbb","Type":"ContainerStarted","Data":"628e0066d922a147919476fa6a3bdc58a006c2c170b8f46068319d6a19cceba0"} Dec 02 09:44:48 crc kubenswrapper[4781]: I1202 09:44:48.317085 4781 generic.go:334] "Generic (PLEG): container finished" podID="c1554dce-519d-4004-a1a3-a1c9072dc609" containerID="bd12bd1057cb09ad2c1b1410b87d8c76e6bbad11640a2264957a1c7ab7c45be0" exitCode=0 Dec 02 09:44:48 crc kubenswrapper[4781]: I1202 09:44:48.317261 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qmr7s" 
event={"ID":"c1554dce-519d-4004-a1a3-a1c9072dc609","Type":"ContainerDied","Data":"bd12bd1057cb09ad2c1b1410b87d8c76e6bbad11640a2264957a1c7ab7c45be0"} Dec 02 09:44:48 crc kubenswrapper[4781]: I1202 09:44:48.320319 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cbfc4ddfb-kljg5" event={"ID":"e2f1c0db-2cf8-4e49-b1cf-8cb27f997927","Type":"ContainerStarted","Data":"87b30a952d10f9bab2222a6e6a17096956bd2f9761c37ae3040838cd117bbd3d"} Dec 02 09:44:48 crc kubenswrapper[4781]: I1202 09:44:48.320358 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cbfc4ddfb-kljg5" event={"ID":"e2f1c0db-2cf8-4e49-b1cf-8cb27f997927","Type":"ContainerStarted","Data":"30b5229352462e296b05ed244d0cffa8c3aae58be5fe7717c277b561d2dbf8aa"} Dec 02 09:44:48 crc kubenswrapper[4781]: I1202 09:44:48.320878 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-cbfc4ddfb-kljg5" Dec 02 09:44:48 crc kubenswrapper[4781]: I1202 09:44:48.324271 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-69459794d8-ph7dr"] Dec 02 09:44:48 crc kubenswrapper[4781]: I1202 09:44:48.353671 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 09:44:48 crc kubenswrapper[4781]: I1202 09:44:48.393986 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cbfc4ddfb-kljg5" podStartSLOduration=2.393956855 podStartE2EDuration="2.393956855s" podCreationTimestamp="2025-12-02 09:44:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:44:48.37986394 +0000 UTC m=+1451.203737819" watchObservedRunningTime="2025-12-02 09:44:48.393956855 +0000 UTC m=+1451.217830734" Dec 02 09:44:48 crc kubenswrapper[4781]: W1202 09:44:48.428270 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podabd38c16_6eab_4f3e_9c4d_294b240fa154.slice/crio-17d29df1f43d2703435a16dc19e9ccfa5ac4e4e13fc51e327286459a4060086b WatchSource:0}: Error finding container 17d29df1f43d2703435a16dc19e9ccfa5ac4e4e13fc51e327286459a4060086b: Status 404 returned error can't find the container with id 17d29df1f43d2703435a16dc19e9ccfa5ac4e4e13fc51e327286459a4060086b Dec 02 09:44:49 crc kubenswrapper[4781]: I1202 09:44:49.051849 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 09:44:49 crc kubenswrapper[4781]: I1202 09:44:49.264029 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 09:44:49 crc kubenswrapper[4781]: W1202 09:44:49.265124 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2be1f200_d68d_47e5_889a_fea41af06a96.slice/crio-4cef4be967f2b0737bf78b7cddbefb2388e3d99e077f7b50ec1899d640de5e78 WatchSource:0}: Error finding container 4cef4be967f2b0737bf78b7cddbefb2388e3d99e077f7b50ec1899d640de5e78: Status 404 returned error can't find the container with id 4cef4be967f2b0737bf78b7cddbefb2388e3d99e077f7b50ec1899d640de5e78 Dec 02 09:44:49 crc kubenswrapper[4781]: I1202 09:44:49.339362 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"b81fe2e7-695a-465c-b06f-aec5bd8f5a4c","Type":"ContainerStarted","Data":"5f1428fb834c577783ee679b320b070a178a623cfab3c936c86de371570a64f0"} Dec 02 09:44:49 crc kubenswrapper[4781]: I1202 09:44:49.341589 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-2wf99" event={"ID":"74200d1e-84a7-4f23-b858-e24dcb955dbb","Type":"ContainerStarted","Data":"8d83e5fd74e7de7d8cc031d27e869372a76d4e0f82ffc074e799d9426b66906b"} Dec 02 09:44:49 crc kubenswrapper[4781]: I1202 09:44:49.342331 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-2wf99" Dec 02 09:44:49 crc kubenswrapper[4781]: I1202 09:44:49.344748 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2be1f200-d68d-47e5-889a-fea41af06a96","Type":"ContainerStarted","Data":"4cef4be967f2b0737bf78b7cddbefb2388e3d99e077f7b50ec1899d640de5e78"} Dec 02 09:44:49 crc kubenswrapper[4781]: I1202 09:44:49.351412 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-69459794d8-ph7dr" event={"ID":"abd38c16-6eab-4f3e-9c4d-294b240fa154","Type":"ContainerStarted","Data":"e2659eb9f29fcdb1277efd02081acc7fa1cd92d48347ff58efd6e9e4e8ea5cf3"} Dec 02 09:44:49 crc kubenswrapper[4781]: I1202 09:44:49.351441 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-69459794d8-ph7dr" event={"ID":"abd38c16-6eab-4f3e-9c4d-294b240fa154","Type":"ContainerStarted","Data":"17d29df1f43d2703435a16dc19e9ccfa5ac4e4e13fc51e327286459a4060086b"} Dec 02 09:44:49 crc kubenswrapper[4781]: I1202 09:44:49.368806 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-2wf99" podStartSLOduration=3.368786664 podStartE2EDuration="3.368786664s" podCreationTimestamp="2025-12-02 09:44:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:44:49.366047701 +0000 UTC m=+1452.189921590" watchObservedRunningTime="2025-12-02 09:44:49.368786664 +0000 UTC m=+1452.192660543" Dec 02 09:44:49 crc kubenswrapper[4781]: I1202 09:44:49.989981 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 09:44:50 crc kubenswrapper[4781]: I1202 09:44:50.056672 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 09:44:52 crc kubenswrapper[4781]: I1202 09:44:52.403035 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b81fe2e7-695a-465c-b06f-aec5bd8f5a4c","Type":"ContainerStarted","Data":"9a77cad9482f507ac66337d55bc6129c12301e8c5a561bba433cf1893855cfb9"} Dec 02 09:44:52 crc kubenswrapper[4781]: I1202 09:44:52.405575 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2be1f200-d68d-47e5-889a-fea41af06a96","Type":"ContainerStarted","Data":"5c14a0948ae197de8effd63c9e112116d56984890a8ece0da5345e5c043a9d0b"} Dec 02 09:44:52 crc kubenswrapper[4781]: I1202 09:44:52.412092 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-69459794d8-ph7dr" event={"ID":"abd38c16-6eab-4f3e-9c4d-294b240fa154","Type":"ContainerStarted","Data":"b7edc1eb380249255c21526a46106f0484b157dc3d4a9ee8eb5f5942ea76f881"} Dec 02 09:44:52 crc kubenswrapper[4781]: I1202 09:44:52.412251 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/placement-69459794d8-ph7dr" Dec 02 09:44:52 crc kubenswrapper[4781]: I1202 09:44:52.438352 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-69459794d8-ph7dr" podStartSLOduration=5.438331389 podStartE2EDuration="5.438331389s" podCreationTimestamp="2025-12-02 09:44:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:44:52.432806682 +0000 UTC m=+1455.256680561" watchObservedRunningTime="2025-12-02 09:44:52.438331389 +0000 UTC m=+1455.262205258" Dec 02 09:44:53 crc kubenswrapper[4781]: I1202 09:44:53.426811 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b81fe2e7-695a-465c-b06f-aec5bd8f5a4c","Type":"ContainerStarted","Data":"f0edb10719d95ee9c62291583c48aaf88a403126d29381146b6d778cb697bb15"} Dec 02 09:44:53 crc kubenswrapper[4781]: I1202 09:44:53.427155 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b81fe2e7-695a-465c-b06f-aec5bd8f5a4c" containerName="glance-log" containerID="cri-o://9a77cad9482f507ac66337d55bc6129c12301e8c5a561bba433cf1893855cfb9" gracePeriod=30 Dec 02 09:44:53 crc kubenswrapper[4781]: I1202 09:44:53.427433 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b81fe2e7-695a-465c-b06f-aec5bd8f5a4c" containerName="glance-httpd" containerID="cri-o://f0edb10719d95ee9c62291583c48aaf88a403126d29381146b6d778cb697bb15" gracePeriod=30 Dec 02 09:44:53 crc kubenswrapper[4781]: I1202 09:44:53.432246 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2be1f200-d68d-47e5-889a-fea41af06a96","Type":"ContainerStarted","Data":"32e0effa1f7e2a896eed743c86a8d58a568201bf1d4a082b7b231c98380bdf94"} Dec 02 09:44:53 crc kubenswrapper[4781]: I1202 09:44:53.432426 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="2be1f200-d68d-47e5-889a-fea41af06a96" containerName="glance-log" containerID="cri-o://5c14a0948ae197de8effd63c9e112116d56984890a8ece0da5345e5c043a9d0b" gracePeriod=30 Dec 02 09:44:53 crc kubenswrapper[4781]: I1202 09:44:53.432546 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="2be1f200-d68d-47e5-889a-fea41af06a96" containerName="glance-httpd" containerID="cri-o://32e0effa1f7e2a896eed743c86a8d58a568201bf1d4a082b7b231c98380bdf94" gracePeriod=30 Dec 02 09:44:53 crc kubenswrapper[4781]: I1202 09:44:53.442638 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-bq87j" event={"ID":"9418aedd-84eb-4ff9-89f8-831695e5471e","Type":"ContainerStarted","Data":"9dd76875a01226364271269d23abccb788961de68d9af76565e733a9a1496d2d"} Dec 02 09:44:53 crc kubenswrapper[4781]: I1202 09:44:53.443526 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-69459794d8-ph7dr" Dec 02 09:44:53 crc kubenswrapper[4781]: I1202 09:44:53.480468 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.480447508 podStartE2EDuration="7.480447508s" podCreationTimestamp="2025-12-02 09:44:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:44:53.476539754 +0000 UTC m=+1456.300413633" watchObservedRunningTime="2025-12-02 09:44:53.480447508 +0000 UTC m=+1456.304321387" Dec 02 09:44:53 crc kubenswrapper[4781]: I1202 09:44:53.490077 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.490054793 podStartE2EDuration="7.490054793s" podCreationTimestamp="2025-12-02 09:44:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:44:53.452959777 +0000 UTC m=+1456.276833676" watchObservedRunningTime="2025-12-02 09:44:53.490054793 +0000 UTC m=+1456.313928672" Dec 02 09:44:53 crc kubenswrapper[4781]: I1202 09:44:53.510135 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-bq87j" podStartSLOduration=3.9754779559999998 podStartE2EDuration="1m9.510112057s" podCreationTimestamp="2025-12-02 09:43:44 +0000 UTC" firstStartedPulling="2025-12-02 09:43:46.333642095 +0000 UTC m=+1389.157515974" lastFinishedPulling="2025-12-02 09:44:51.868276196 +0000 UTC m=+1454.692150075" observedRunningTime="2025-12-02 09:44:53.500539432 +0000 UTC m=+1456.324413331" watchObservedRunningTime="2025-12-02 09:44:53.510112057 +0000 UTC m=+1456.333985946" Dec 02 09:44:53 crc kubenswrapper[4781]: I1202 09:44:53.992183 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-67cff9c6-hg2l6" podUID="7dcfdf79-19b4-4ea4-9b16-67d79aa7165e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.142:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.142:8443: connect: connection refused" Dec 02 09:44:54 crc kubenswrapper[4781]: I1202 09:44:54.067004 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6856678494-4cprv" podUID="226317c7-a6f4-43c5-a3df-c9cb18b3afa5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.143:8443: connect: connection refused" Dec 02 09:44:54 crc kubenswrapper[4781]: I1202 09:44:54.465946 4781 generic.go:334] "Generic (PLEG): container finished" podID="b81fe2e7-695a-465c-b06f-aec5bd8f5a4c" containerID="f0edb10719d95ee9c62291583c48aaf88a403126d29381146b6d778cb697bb15" exitCode=0 Dec 02 09:44:54 crc kubenswrapper[4781]: I1202 09:44:54.466238 4781 generic.go:334] "Generic (PLEG): container finished" podID="b81fe2e7-695a-465c-b06f-aec5bd8f5a4c" containerID="9a77cad9482f507ac66337d55bc6129c12301e8c5a561bba433cf1893855cfb9" exitCode=143 Dec 02 09:44:54 crc kubenswrapper[4781]: I1202 09:44:54.466304 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b81fe2e7-695a-465c-b06f-aec5bd8f5a4c","Type":"ContainerDied","Data":"f0edb10719d95ee9c62291583c48aaf88a403126d29381146b6d778cb697bb15"} Dec 02 09:44:54 crc kubenswrapper[4781]: I1202 09:44:54.466333 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b81fe2e7-695a-465c-b06f-aec5bd8f5a4c","Type":"ContainerDied","Data":"9a77cad9482f507ac66337d55bc6129c12301e8c5a561bba433cf1893855cfb9"} Dec 02 09:44:54 crc kubenswrapper[4781]: I1202 09:44:54.468889 4781 generic.go:334] "Generic (PLEG): container finished" podID="2be1f200-d68d-47e5-889a-fea41af06a96" 
containerID="32e0effa1f7e2a896eed743c86a8d58a568201bf1d4a082b7b231c98380bdf94" exitCode=0 Dec 02 09:44:54 crc kubenswrapper[4781]: I1202 09:44:54.468910 4781 generic.go:334] "Generic (PLEG): container finished" podID="2be1f200-d68d-47e5-889a-fea41af06a96" containerID="5c14a0948ae197de8effd63c9e112116d56984890a8ece0da5345e5c043a9d0b" exitCode=143 Dec 02 09:44:54 crc kubenswrapper[4781]: I1202 09:44:54.469019 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2be1f200-d68d-47e5-889a-fea41af06a96","Type":"ContainerDied","Data":"32e0effa1f7e2a896eed743c86a8d58a568201bf1d4a082b7b231c98380bdf94"} Dec 02 09:44:54 crc kubenswrapper[4781]: I1202 09:44:54.469046 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2be1f200-d68d-47e5-889a-fea41af06a96","Type":"ContainerDied","Data":"5c14a0948ae197de8effd63c9e112116d56984890a8ece0da5345e5c043a9d0b"} Dec 02 09:44:57 crc kubenswrapper[4781]: I1202 09:44:57.180015 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-2wf99" Dec 02 09:44:57 crc kubenswrapper[4781]: I1202 09:44:57.248069 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-n7z2z"] Dec 02 09:44:57 crc kubenswrapper[4781]: I1202 09:44:57.248355 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58dd9ff6bc-n7z2z" podUID="a3fc9b2b-26bf-4f7d-b052-c976f8584c43" containerName="dnsmasq-dns" containerID="cri-o://f8112c905f7061e6694008c35ca2b50d6689e96ec0f9b105c3091f7f17b92eac" gracePeriod=10 Dec 02 09:45:00 crc kubenswrapper[4781]: I1202 09:45:00.138047 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411145-vkbqb"] Dec 02 09:45:00 crc kubenswrapper[4781]: I1202 09:45:00.139852 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411145-vkbqb" Dec 02 09:45:00 crc kubenswrapper[4781]: I1202 09:45:00.142374 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 09:45:00 crc kubenswrapper[4781]: I1202 09:45:00.143323 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 09:45:00 crc kubenswrapper[4781]: I1202 09:45:00.148352 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411145-vkbqb"] Dec 02 09:45:00 crc kubenswrapper[4781]: I1202 09:45:00.301180 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bfee3ebe-80f4-4ce5-9168-12d5fdaa65b3-secret-volume\") pod \"collect-profiles-29411145-vkbqb\" (UID: \"bfee3ebe-80f4-4ce5-9168-12d5fdaa65b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411145-vkbqb" Dec 02 09:45:00 crc kubenswrapper[4781]: I1202 09:45:00.301555 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nx89t\" (UniqueName: \"kubernetes.io/projected/bfee3ebe-80f4-4ce5-9168-12d5fdaa65b3-kube-api-access-nx89t\") pod \"collect-profiles-29411145-vkbqb\" (UID: \"bfee3ebe-80f4-4ce5-9168-12d5fdaa65b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411145-vkbqb" Dec 02 09:45:00 crc kubenswrapper[4781]: I1202 09:45:00.301654 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bfee3ebe-80f4-4ce5-9168-12d5fdaa65b3-config-volume\") pod \"collect-profiles-29411145-vkbqb\" (UID: \"bfee3ebe-80f4-4ce5-9168-12d5fdaa65b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411145-vkbqb" Dec 02 09:45:00 crc kubenswrapper[4781]: I1202 09:45:00.309754 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hqlbj"] Dec 02 09:45:00 crc kubenswrapper[4781]: I1202 09:45:00.320454 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hqlbj" Dec 02 09:45:00 crc kubenswrapper[4781]: I1202 09:45:00.332536 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hqlbj"] Dec 02 09:45:00 crc kubenswrapper[4781]: I1202 09:45:00.403719 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bfee3ebe-80f4-4ce5-9168-12d5fdaa65b3-config-volume\") pod \"collect-profiles-29411145-vkbqb\" (UID: \"bfee3ebe-80f4-4ce5-9168-12d5fdaa65b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411145-vkbqb" Dec 02 09:45:00 crc kubenswrapper[4781]: I1202 09:45:00.403890 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27107564-9a0b-4a63-894d-4e0960abd435-catalog-content\") pod \"redhat-marketplace-hqlbj\" (UID: \"27107564-9a0b-4a63-894d-4e0960abd435\") " pod="openshift-marketplace/redhat-marketplace-hqlbj" Dec 02 09:45:00 crc kubenswrapper[4781]: I1202 09:45:00.403935 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bfee3ebe-80f4-4ce5-9168-12d5fdaa65b3-secret-volume\") pod \"collect-profiles-29411145-vkbqb\" (UID: \"bfee3ebe-80f4-4ce5-9168-12d5fdaa65b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411145-vkbqb" Dec 02 09:45:00 crc kubenswrapper[4781]: I1202 09:45:00.403976 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27107564-9a0b-4a63-894d-4e0960abd435-utilities\") pod \"redhat-marketplace-hqlbj\" (UID: \"27107564-9a0b-4a63-894d-4e0960abd435\") " pod="openshift-marketplace/redhat-marketplace-hqlbj" Dec 02 09:45:00 crc kubenswrapper[4781]: I1202 09:45:00.404017 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nddw\" (UniqueName: \"kubernetes.io/projected/27107564-9a0b-4a63-894d-4e0960abd435-kube-api-access-2nddw\") pod \"redhat-marketplace-hqlbj\" (UID: \"27107564-9a0b-4a63-894d-4e0960abd435\") " pod="openshift-marketplace/redhat-marketplace-hqlbj" Dec 02 09:45:00 crc kubenswrapper[4781]: I1202 09:45:00.404115 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nx89t\" (UniqueName: \"kubernetes.io/projected/bfee3ebe-80f4-4ce5-9168-12d5fdaa65b3-kube-api-access-nx89t\") pod \"collect-profiles-29411145-vkbqb\" (UID: \"bfee3ebe-80f4-4ce5-9168-12d5fdaa65b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411145-vkbqb" Dec 02 09:45:00 crc kubenswrapper[4781]: I1202 09:45:00.405275 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bfee3ebe-80f4-4ce5-9168-12d5fdaa65b3-config-volume\") pod \"collect-profiles-29411145-vkbqb\" (UID: \"bfee3ebe-80f4-4ce5-9168-12d5fdaa65b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411145-vkbqb" Dec 02 09:45:00 crc kubenswrapper[4781]: I1202 09:45:00.411278 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bfee3ebe-80f4-4ce5-9168-12d5fdaa65b3-secret-volume\") pod \"collect-profiles-29411145-vkbqb\" (UID: \"bfee3ebe-80f4-4ce5-9168-12d5fdaa65b3\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29411145-vkbqb" Dec 02 09:45:00 crc kubenswrapper[4781]: I1202 09:45:00.425511 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nx89t\" (UniqueName: \"kubernetes.io/projected/bfee3ebe-80f4-4ce5-9168-12d5fdaa65b3-kube-api-access-nx89t\") pod \"collect-profiles-29411145-vkbqb\" (UID: \"bfee3ebe-80f4-4ce5-9168-12d5fdaa65b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411145-vkbqb" Dec 02 09:45:00 crc kubenswrapper[4781]: I1202 09:45:00.472385 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411145-vkbqb" Dec 02 09:45:00 crc kubenswrapper[4781]: I1202 09:45:00.505252 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nddw\" (UniqueName: \"kubernetes.io/projected/27107564-9a0b-4a63-894d-4e0960abd435-kube-api-access-2nddw\") pod \"redhat-marketplace-hqlbj\" (UID: \"27107564-9a0b-4a63-894d-4e0960abd435\") " pod="openshift-marketplace/redhat-marketplace-hqlbj" Dec 02 09:45:00 crc kubenswrapper[4781]: I1202 09:45:00.505401 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27107564-9a0b-4a63-894d-4e0960abd435-catalog-content\") pod \"redhat-marketplace-hqlbj\" (UID: \"27107564-9a0b-4a63-894d-4e0960abd435\") " pod="openshift-marketplace/redhat-marketplace-hqlbj" Dec 02 09:45:00 crc kubenswrapper[4781]: I1202 09:45:00.505434 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27107564-9a0b-4a63-894d-4e0960abd435-utilities\") pod \"redhat-marketplace-hqlbj\" (UID: \"27107564-9a0b-4a63-894d-4e0960abd435\") " pod="openshift-marketplace/redhat-marketplace-hqlbj" Dec 02 09:45:00 crc kubenswrapper[4781]: I1202 09:45:00.505826 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27107564-9a0b-4a63-894d-4e0960abd435-utilities\") pod \"redhat-marketplace-hqlbj\" (UID: \"27107564-9a0b-4a63-894d-4e0960abd435\") " pod="openshift-marketplace/redhat-marketplace-hqlbj" Dec 02 09:45:00 crc kubenswrapper[4781]: I1202 09:45:00.505911 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27107564-9a0b-4a63-894d-4e0960abd435-catalog-content\") pod \"redhat-marketplace-hqlbj\" (UID: \"27107564-9a0b-4a63-894d-4e0960abd435\") " pod="openshift-marketplace/redhat-marketplace-hqlbj" Dec 02 09:45:00 crc kubenswrapper[4781]: I1202 09:45:00.520126 4781 generic.go:334] "Generic (PLEG): container finished" podID="a3fc9b2b-26bf-4f7d-b052-c976f8584c43" containerID="f8112c905f7061e6694008c35ca2b50d6689e96ec0f9b105c3091f7f17b92eac" exitCode=0 Dec 02 09:45:00 crc kubenswrapper[4781]: I1202 09:45:00.520164 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-n7z2z" event={"ID":"a3fc9b2b-26bf-4f7d-b052-c976f8584c43","Type":"ContainerDied","Data":"f8112c905f7061e6694008c35ca2b50d6689e96ec0f9b105c3091f7f17b92eac"} Dec 02 09:45:00 crc kubenswrapper[4781]: I1202 09:45:00.524454 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nddw\" (UniqueName: \"kubernetes.io/projected/27107564-9a0b-4a63-894d-4e0960abd435-kube-api-access-2nddw\") pod \"redhat-marketplace-hqlbj\" (UID: 
\"27107564-9a0b-4a63-894d-4e0960abd435\") " pod="openshift-marketplace/redhat-marketplace-hqlbj" Dec 02 09:45:00 crc kubenswrapper[4781]: I1202 09:45:00.644346 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hqlbj" Dec 02 09:45:01 crc kubenswrapper[4781]: I1202 09:45:01.824228 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-58dd9ff6bc-n7z2z" podUID="a3fc9b2b-26bf-4f7d-b052-c976f8584c43" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.140:5353: connect: connection refused" Dec 02 09:45:03 crc kubenswrapper[4781]: I1202 09:45:03.991102 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-67cff9c6-hg2l6" podUID="7dcfdf79-19b4-4ea4-9b16-67d79aa7165e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.142:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.142:8443: connect: connection refused" Dec 02 09:45:04 crc kubenswrapper[4781]: I1202 09:45:04.065336 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6856678494-4cprv" podUID="226317c7-a6f4-43c5-a3df-c9cb18b3afa5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.143:8443: connect: connection refused" Dec 02 09:45:06 crc kubenswrapper[4781]: I1202 09:45:06.823600 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-58dd9ff6bc-n7z2z" podUID="a3fc9b2b-26bf-4f7d-b052-c976f8584c43" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.140:5353: connect: connection refused" Dec 02 09:45:09 crc kubenswrapper[4781]: I1202 09:45:09.866802 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-69459794d8-ph7dr" Dec 02 09:45:10 crc kubenswrapper[4781]: I1202 09:45:10.617956 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qmr7s" event={"ID":"c1554dce-519d-4004-a1a3-a1c9072dc609","Type":"ContainerStarted","Data":"7979fff52111791c9357dc197a63cfd74ff2a38ffb1bd960456887d7f53a158a"} Dec 02 09:45:10 crc kubenswrapper[4781]: I1202 09:45:10.637734 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qmr7s" podStartSLOduration=24.85293522 podStartE2EDuration="30.637715938s" podCreationTimestamp="2025-12-02 09:44:40 +0000 UTC" firstStartedPulling="2025-12-02 09:44:46.218559841 +0000 UTC m=+1449.042433720" lastFinishedPulling="2025-12-02 09:44:52.003340559 +0000 UTC m=+1454.827214438" observedRunningTime="2025-12-02 09:45:10.633200027 +0000 UTC m=+1473.457073926" watchObservedRunningTime="2025-12-02 09:45:10.637715938 +0000 UTC m=+1473.461589817" Dec 02 09:45:10 crc kubenswrapper[4781]: E1202 09:45:10.782192 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/ubi9/httpd-24:latest" Dec 02 09:45:10 crc kubenswrapper[4781]: E1202 09:45:10.800158 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:proxy-httpd,Image:registry.redhat.io/ubi9/httpd-24:latest,Command:[/usr/sbin/httpd],Args:[-DFOREGROUND],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:proxy-httpd,HostPort:0,ContainerPort:3000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf/httpd.conf,SubPath:httpd.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf.d/ssl.conf,SubPath:ssl.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run-httpd,ReadOnly:false,MountPath:/run/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log-httpd,ReadOnly:false,MountPath:/var/log/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f8tv4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(c98b4bc6-c086-4663-b794-92a36b0da2ba): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 09:45:10 crc kubenswrapper[4781]: E1202 09:45:10.801781 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"proxy-httpd\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"]" pod="openstack/ceilometer-0" podUID="c98b4bc6-c086-4663-b794-92a36b0da2ba" Dec 02 09:45:10 crc kubenswrapper[4781]: I1202 09:45:10.812268 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qmr7s" Dec 02 09:45:10 crc kubenswrapper[4781]: I1202 09:45:10.812350 4781 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qmr7s" Dec 02 09:45:10 crc kubenswrapper[4781]: I1202 09:45:10.949372 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-69459794d8-ph7dr" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.011603 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.014858 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-n7z2z" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.025635 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.195277 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dk52\" (UniqueName: \"kubernetes.io/projected/2be1f200-d68d-47e5-889a-fea41af06a96-kube-api-access-9dk52\") pod \"2be1f200-d68d-47e5-889a-fea41af06a96\" (UID: \"2be1f200-d68d-47e5-889a-fea41af06a96\") " Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.196281 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t94m\" (UniqueName: \"kubernetes.io/projected/b81fe2e7-695a-465c-b06f-aec5bd8f5a4c-kube-api-access-4t94m\") pod \"b81fe2e7-695a-465c-b06f-aec5bd8f5a4c\" (UID: \"b81fe2e7-695a-465c-b06f-aec5bd8f5a4c\") " Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.196355 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2be1f200-d68d-47e5-889a-fea41af06a96-config-data\") pod \"2be1f200-d68d-47e5-889a-fea41af06a96\" (UID: \"2be1f200-d68d-47e5-889a-fea41af06a96\") " Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.196386 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3fc9b2b-26bf-4f7d-b052-c976f8584c43-dns-svc\") pod \"a3fc9b2b-26bf-4f7d-b052-c976f8584c43\" (UID: \"a3fc9b2b-26bf-4f7d-b052-c976f8584c43\") " Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.197608 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b81fe2e7-695a-465c-b06f-aec5bd8f5a4c-httpd-run\") pod \"b81fe2e7-695a-465c-b06f-aec5bd8f5a4c\" (UID: \"b81fe2e7-695a-465c-b06f-aec5bd8f5a4c\") " Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.197678 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3fc9b2b-26bf-4f7d-b052-c976f8584c43-config\") pod \"a3fc9b2b-26bf-4f7d-b052-c976f8584c43\" (UID: \"a3fc9b2b-26bf-4f7d-b052-c976f8584c43\") " Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.197769 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2be1f200-d68d-47e5-889a-fea41af06a96-logs\") pod \"2be1f200-d68d-47e5-889a-fea41af06a96\" (UID: \"2be1f200-d68d-47e5-889a-fea41af06a96\") " Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.197845 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b81fe2e7-695a-465c-b06f-aec5bd8f5a4c-combined-ca-bundle\") pod \"b81fe2e7-695a-465c-b06f-aec5bd8f5a4c\" (UID: \"b81fe2e7-695a-465c-b06f-aec5bd8f5a4c\") " Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.197963 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b81fe2e7-695a-465c-b06f-aec5bd8f5a4c-scripts\") pod \"b81fe2e7-695a-465c-b06f-aec5bd8f5a4c\" (UID: \"b81fe2e7-695a-465c-b06f-aec5bd8f5a4c\") " Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.197987 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b81fe2e7-695a-465c-b06f-aec5bd8f5a4c-logs\") pod \"b81fe2e7-695a-465c-b06f-aec5bd8f5a4c\" (UID: \"b81fe2e7-695a-465c-b06f-aec5bd8f5a4c\") " Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.198035 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b81fe2e7-695a-465c-b06f-aec5bd8f5a4c-config-data\") pod \"b81fe2e7-695a-465c-b06f-aec5bd8f5a4c\" (UID: \"b81fe2e7-695a-465c-b06f-aec5bd8f5a4c\") " Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.198062 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpt2f\" (UniqueName: \"kubernetes.io/projected/a3fc9b2b-26bf-4f7d-b052-c976f8584c43-kube-api-access-dpt2f\") pod \"a3fc9b2b-26bf-4f7d-b052-c976f8584c43\" (UID: \"a3fc9b2b-26bf-4f7d-b052-c976f8584c43\") " Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.198089 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"2be1f200-d68d-47e5-889a-fea41af06a96\" (UID: \"2be1f200-d68d-47e5-889a-fea41af06a96\") " Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.198175 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3fc9b2b-26bf-4f7d-b052-c976f8584c43-ovsdbserver-nb\") pod \"a3fc9b2b-26bf-4f7d-b052-c976f8584c43\" (UID: \"a3fc9b2b-26bf-4f7d-b052-c976f8584c43\") " Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.198202 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2be1f200-d68d-47e5-889a-fea41af06a96-combined-ca-bundle\") pod \"2be1f200-d68d-47e5-889a-fea41af06a96\" (UID: \"2be1f200-d68d-47e5-889a-fea41af06a96\") " Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.198240 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a3fc9b2b-26bf-4f7d-b052-c976f8584c43-dns-swift-storage-0\") pod \"a3fc9b2b-26bf-4f7d-b052-c976f8584c43\" (UID: \"a3fc9b2b-26bf-4f7d-b052-c976f8584c43\") " Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.198310 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2be1f200-d68d-47e5-889a-fea41af06a96-httpd-run\") pod \"2be1f200-d68d-47e5-889a-fea41af06a96\" (UID: \"2be1f200-d68d-47e5-889a-fea41af06a96\") " Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.198340 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2be1f200-d68d-47e5-889a-fea41af06a96-scripts\") pod 
\"2be1f200-d68d-47e5-889a-fea41af06a96\" (UID: \"2be1f200-d68d-47e5-889a-fea41af06a96\") " Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.198372 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3fc9b2b-26bf-4f7d-b052-c976f8584c43-ovsdbserver-sb\") pod \"a3fc9b2b-26bf-4f7d-b052-c976f8584c43\" (UID: \"a3fc9b2b-26bf-4f7d-b052-c976f8584c43\") " Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.198392 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"b81fe2e7-695a-465c-b06f-aec5bd8f5a4c\" (UID: \"b81fe2e7-695a-465c-b06f-aec5bd8f5a4c\") " Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.205410 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b81fe2e7-695a-465c-b06f-aec5bd8f5a4c-logs" (OuterVolumeSpecName: "logs") pod "b81fe2e7-695a-465c-b06f-aec5bd8f5a4c" (UID: "b81fe2e7-695a-465c-b06f-aec5bd8f5a4c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.205810 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b81fe2e7-695a-465c-b06f-aec5bd8f5a4c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b81fe2e7-695a-465c-b06f-aec5bd8f5a4c" (UID: "b81fe2e7-695a-465c-b06f-aec5bd8f5a4c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.206176 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2be1f200-d68d-47e5-889a-fea41af06a96-logs" (OuterVolumeSpecName: "logs") pod "2be1f200-d68d-47e5-889a-fea41af06a96" (UID: "2be1f200-d68d-47e5-889a-fea41af06a96"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.208462 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "b81fe2e7-695a-465c-b06f-aec5bd8f5a4c" (UID: "b81fe2e7-695a-465c-b06f-aec5bd8f5a4c"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.208794 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b81fe2e7-695a-465c-b06f-aec5bd8f5a4c-kube-api-access-4t94m" (OuterVolumeSpecName: "kube-api-access-4t94m") pod "b81fe2e7-695a-465c-b06f-aec5bd8f5a4c" (UID: "b81fe2e7-695a-465c-b06f-aec5bd8f5a4c"). InnerVolumeSpecName "kube-api-access-4t94m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.208885 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2be1f200-d68d-47e5-889a-fea41af06a96-kube-api-access-9dk52" (OuterVolumeSpecName: "kube-api-access-9dk52") pod "2be1f200-d68d-47e5-889a-fea41af06a96" (UID: "2be1f200-d68d-47e5-889a-fea41af06a96"). InnerVolumeSpecName "kube-api-access-9dk52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.210218 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2be1f200-d68d-47e5-889a-fea41af06a96-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2be1f200-d68d-47e5-889a-fea41af06a96" (UID: "2be1f200-d68d-47e5-889a-fea41af06a96"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.214225 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2be1f200-d68d-47e5-889a-fea41af06a96-scripts" (OuterVolumeSpecName: "scripts") pod "2be1f200-d68d-47e5-889a-fea41af06a96" (UID: "2be1f200-d68d-47e5-889a-fea41af06a96"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.221125 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3fc9b2b-26bf-4f7d-b052-c976f8584c43-kube-api-access-dpt2f" (OuterVolumeSpecName: "kube-api-access-dpt2f") pod "a3fc9b2b-26bf-4f7d-b052-c976f8584c43" (UID: "a3fc9b2b-26bf-4f7d-b052-c976f8584c43"). InnerVolumeSpecName "kube-api-access-dpt2f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.223285 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "2be1f200-d68d-47e5-889a-fea41af06a96" (UID: "2be1f200-d68d-47e5-889a-fea41af06a96"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.227126 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b81fe2e7-695a-465c-b06f-aec5bd8f5a4c-scripts" (OuterVolumeSpecName: "scripts") pod "b81fe2e7-695a-465c-b06f-aec5bd8f5a4c" (UID: "b81fe2e7-695a-465c-b06f-aec5bd8f5a4c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.257881 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2be1f200-d68d-47e5-889a-fea41af06a96-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2be1f200-d68d-47e5-889a-fea41af06a96" (UID: "2be1f200-d68d-47e5-889a-fea41af06a96"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.280081 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3fc9b2b-26bf-4f7d-b052-c976f8584c43-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a3fc9b2b-26bf-4f7d-b052-c976f8584c43" (UID: "a3fc9b2b-26bf-4f7d-b052-c976f8584c43"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.282969 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3fc9b2b-26bf-4f7d-b052-c976f8584c43-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a3fc9b2b-26bf-4f7d-b052-c976f8584c43" (UID: "a3fc9b2b-26bf-4f7d-b052-c976f8584c43"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.285155 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b81fe2e7-695a-465c-b06f-aec5bd8f5a4c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b81fe2e7-695a-465c-b06f-aec5bd8f5a4c" (UID: "b81fe2e7-695a-465c-b06f-aec5bd8f5a4c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.300968 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3fc9b2b-26bf-4f7d-b052-c976f8584c43-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.300994 4781 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b81fe2e7-695a-465c-b06f-aec5bd8f5a4c-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.301003 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2be1f200-d68d-47e5-889a-fea41af06a96-logs\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.301011 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b81fe2e7-695a-465c-b06f-aec5bd8f5a4c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.301021 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b81fe2e7-695a-465c-b06f-aec5bd8f5a4c-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.301028 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpt2f\" (UniqueName: \"kubernetes.io/projected/a3fc9b2b-26bf-4f7d-b052-c976f8584c43-kube-api-access-dpt2f\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.301064 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b81fe2e7-695a-465c-b06f-aec5bd8f5a4c-logs\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.301085 4781 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.301095 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2be1f200-d68d-47e5-889a-fea41af06a96-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.301103 4781 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a3fc9b2b-26bf-4f7d-b052-c976f8584c43-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.301113 4781 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2be1f200-d68d-47e5-889a-fea41af06a96-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.301121 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/2be1f200-d68d-47e5-889a-fea41af06a96-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.301134 4781 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.301145 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dk52\" (UniqueName: \"kubernetes.io/projected/2be1f200-d68d-47e5-889a-fea41af06a96-kube-api-access-9dk52\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.301154 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t94m\" (UniqueName: \"kubernetes.io/projected/b81fe2e7-695a-465c-b06f-aec5bd8f5a4c-kube-api-access-4t94m\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.327788 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3fc9b2b-26bf-4f7d-b052-c976f8584c43-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a3fc9b2b-26bf-4f7d-b052-c976f8584c43" (UID: "a3fc9b2b-26bf-4f7d-b052-c976f8584c43"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.328444 4781 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.331687 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411145-vkbqb"] Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.335188 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3fc9b2b-26bf-4f7d-b052-c976f8584c43-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a3fc9b2b-26bf-4f7d-b052-c976f8584c43" (UID: "a3fc9b2b-26bf-4f7d-b052-c976f8584c43"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.344867 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3fc9b2b-26bf-4f7d-b052-c976f8584c43-config" (OuterVolumeSpecName: "config") pod "a3fc9b2b-26bf-4f7d-b052-c976f8584c43" (UID: "a3fc9b2b-26bf-4f7d-b052-c976f8584c43"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.345019 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b81fe2e7-695a-465c-b06f-aec5bd8f5a4c-config-data" (OuterVolumeSpecName: "config-data") pod "b81fe2e7-695a-465c-b06f-aec5bd8f5a4c" (UID: "b81fe2e7-695a-465c-b06f-aec5bd8f5a4c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.345986 4781 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.356363 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2be1f200-d68d-47e5-889a-fea41af06a96-config-data" (OuterVolumeSpecName: "config-data") pod "2be1f200-d68d-47e5-889a-fea41af06a96" (UID: "2be1f200-d68d-47e5-889a-fea41af06a96"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.403707 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b81fe2e7-695a-465c-b06f-aec5bd8f5a4c-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.403745 4781 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.403756 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3fc9b2b-26bf-4f7d-b052-c976f8584c43-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.403767 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3fc9b2b-26bf-4f7d-b052-c976f8584c43-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.403777 4781 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.403788 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2be1f200-d68d-47e5-889a-fea41af06a96-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.403797 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3fc9b2b-26bf-4f7d-b052-c976f8584c43-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.512197 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hqlbj"] Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.628117 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-n7z2z" event={"ID":"a3fc9b2b-26bf-4f7d-b052-c976f8584c43","Type":"ContainerDied","Data":"ebd4a9eae8deabbc1b5a96e0bc8a7d930c0ecdc362d3990bddbbb8bc009626ff"} Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.628154 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-n7z2z" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.628176 4781 scope.go:117] "RemoveContainer" containerID="f8112c905f7061e6694008c35ca2b50d6689e96ec0f9b105c3091f7f17b92eac" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.630387 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b81fe2e7-695a-465c-b06f-aec5bd8f5a4c","Type":"ContainerDied","Data":"5f1428fb834c577783ee679b320b070a178a623cfab3c936c86de371570a64f0"} Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.630440 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.633229 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c98b4bc6-c086-4663-b794-92a36b0da2ba" containerName="ceilometer-notification-agent" containerID="cri-o://d23653b0777bfb8823014de1063f3473f00585ce26d88d96ba403826a2ce3ec5" gracePeriod=30 Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.633336 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.633673 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2be1f200-d68d-47e5-889a-fea41af06a96","Type":"ContainerDied","Data":"4cef4be967f2b0737bf78b7cddbefb2388e3d99e077f7b50ec1899d640de5e78"} Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.633829 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c98b4bc6-c086-4663-b794-92a36b0da2ba" containerName="sg-core" containerID="cri-o://0835ce989dbc0e513f8ecff2e64616304cb06d329a9b73a5d2ae1312c75e2ac7" gracePeriod=30 Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.684815 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.692740 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.700367 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.707787 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.718263 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-n7z2z"] Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.726066 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 09:45:11 crc kubenswrapper[4781]: E1202 09:45:11.726501 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3fc9b2b-26bf-4f7d-b052-c976f8584c43" containerName="init" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.726519 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3fc9b2b-26bf-4f7d-b052-c976f8584c43" containerName="init" Dec 02 09:45:11 crc kubenswrapper[4781]: E1202 09:45:11.726538 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2be1f200-d68d-47e5-889a-fea41af06a96" containerName="glance-log" Dec 02 09:45:11 crc 
kubenswrapper[4781]: I1202 09:45:11.726545 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2be1f200-d68d-47e5-889a-fea41af06a96" containerName="glance-log" Dec 02 09:45:11 crc kubenswrapper[4781]: E1202 09:45:11.726559 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b81fe2e7-695a-465c-b06f-aec5bd8f5a4c" containerName="glance-log" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.726565 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="b81fe2e7-695a-465c-b06f-aec5bd8f5a4c" containerName="glance-log" Dec 02 09:45:11 crc kubenswrapper[4781]: E1202 09:45:11.726572 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2be1f200-d68d-47e5-889a-fea41af06a96" containerName="glance-httpd" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.726578 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2be1f200-d68d-47e5-889a-fea41af06a96" containerName="glance-httpd" Dec 02 09:45:11 crc kubenswrapper[4781]: E1202 09:45:11.726600 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3fc9b2b-26bf-4f7d-b052-c976f8584c43" containerName="dnsmasq-dns" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.726606 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3fc9b2b-26bf-4f7d-b052-c976f8584c43" containerName="dnsmasq-dns" Dec 02 09:45:11 crc kubenswrapper[4781]: E1202 09:45:11.726616 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b81fe2e7-695a-465c-b06f-aec5bd8f5a4c" containerName="glance-httpd" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.726623 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="b81fe2e7-695a-465c-b06f-aec5bd8f5a4c" containerName="glance-httpd" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.726790 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="b81fe2e7-695a-465c-b06f-aec5bd8f5a4c" containerName="glance-log" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.726807 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="2be1f200-d68d-47e5-889a-fea41af06a96" containerName="glance-httpd" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.726816 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="b81fe2e7-695a-465c-b06f-aec5bd8f5a4c" containerName="glance-httpd" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.726837 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="2be1f200-d68d-47e5-889a-fea41af06a96" containerName="glance-log" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.726846 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3fc9b2b-26bf-4f7d-b052-c976f8584c43" containerName="dnsmasq-dns" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.727830 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.730338 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.730722 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.730725 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.730729 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-shwpc" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.735640 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-n7z2z"] Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.750822 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.752398 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.756509 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.758976 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.771980 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.778894 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.780765 4781 scope.go:117] "RemoveContainer" containerID="009eb4629af2ac941e9f4f81e6768ce1a094bb826a60e3fd927574406f7a67be" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.812879 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/269490ed-f426-4577-b60d-695881e29aac-logs\") pod \"glance-default-external-api-0\" (UID: \"269490ed-f426-4577-b60d-695881e29aac\") " pod="openstack/glance-default-external-api-0" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.813313 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqjjd\" (UniqueName: \"kubernetes.io/projected/269490ed-f426-4577-b60d-695881e29aac-kube-api-access-zqjjd\") pod \"glance-default-external-api-0\" (UID: \"269490ed-f426-4577-b60d-695881e29aac\") " pod="openstack/glance-default-external-api-0" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.813358 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/269490ed-f426-4577-b60d-695881e29aac-config-data\") pod \"glance-default-external-api-0\" (UID: \"269490ed-f426-4577-b60d-695881e29aac\") " pod="openstack/glance-default-external-api-0" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.813428 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/269490ed-f426-4577-b60d-695881e29aac-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"269490ed-f426-4577-b60d-695881e29aac\") " pod="openstack/glance-default-external-api-0" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.813492 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/269490ed-f426-4577-b60d-695881e29aac-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"269490ed-f426-4577-b60d-695881e29aac\") " pod="openstack/glance-default-external-api-0" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.813527 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/269490ed-f426-4577-b60d-695881e29aac-scripts\") pod \"glance-default-external-api-0\" (UID: \"269490ed-f426-4577-b60d-695881e29aac\") " pod="openstack/glance-default-external-api-0" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.813553 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"269490ed-f426-4577-b60d-695881e29aac\") " pod="openstack/glance-default-external-api-0" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.813600 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/269490ed-f426-4577-b60d-695881e29aac-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"269490ed-f426-4577-b60d-695881e29aac\") " pod="openstack/glance-default-external-api-0" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.876606 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-qmr7s" podUID="c1554dce-519d-4004-a1a3-a1c9072dc609" containerName="registry-server" probeResult="failure" output=< Dec 02 09:45:11 crc kubenswrapper[4781]: timeout: failed to connect service ":50051" within 1s Dec 02 09:45:11 crc kubenswrapper[4781]: > Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.915504 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a44acc8-6e3c-4643-9b69-34f54b53c188-logs\") pod \"glance-default-internal-api-0\" (UID: \"8a44acc8-6e3c-4643-9b69-34f54b53c188\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.915576 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/269490ed-f426-4577-b60d-695881e29aac-config-data\") pod \"glance-default-external-api-0\" (UID: \"269490ed-f426-4577-b60d-695881e29aac\") " pod="openstack/glance-default-external-api-0" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.916302 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a44acc8-6e3c-4643-9b69-34f54b53c188-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8a44acc8-6e3c-4643-9b69-34f54b53c188\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.916397 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"8a44acc8-6e3c-4643-9b69-34f54b53c188\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.916426 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a44acc8-6e3c-4643-9b69-34f54b53c188-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8a44acc8-6e3c-4643-9b69-34f54b53c188\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.916725 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/269490ed-f426-4577-b60d-695881e29aac-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"269490ed-f426-4577-b60d-695881e29aac\") " pod="openstack/glance-default-external-api-0" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.916761 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a44acc8-6e3c-4643-9b69-34f54b53c188-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8a44acc8-6e3c-4643-9b69-34f54b53c188\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.916810 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a44acc8-6e3c-4643-9b69-34f54b53c188-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8a44acc8-6e3c-4643-9b69-34f54b53c188\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.916829 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a44acc8-6e3c-4643-9b69-34f54b53c188-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8a44acc8-6e3c-4643-9b69-34f54b53c188\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.916860 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/269490ed-f426-4577-b60d-695881e29aac-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"269490ed-f426-4577-b60d-695881e29aac\") " pod="openstack/glance-default-external-api-0" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.916904 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/269490ed-f426-4577-b60d-695881e29aac-scripts\") pod \"glance-default-external-api-0\" (UID: \"269490ed-f426-4577-b60d-695881e29aac\") " pod="openstack/glance-default-external-api-0" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.916953 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"269490ed-f426-4577-b60d-695881e29aac\") " pod="openstack/glance-default-external-api-0" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.917031 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/269490ed-f426-4577-b60d-695881e29aac-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"269490ed-f426-4577-b60d-695881e29aac\") " pod="openstack/glance-default-external-api-0" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.917374 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7lv5\" (UniqueName: \"kubernetes.io/projected/8a44acc8-6e3c-4643-9b69-34f54b53c188-kube-api-access-c7lv5\") pod \"glance-default-internal-api-0\" (UID: \"8a44acc8-6e3c-4643-9b69-34f54b53c188\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.917404 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/269490ed-f426-4577-b60d-695881e29aac-logs\") pod \"glance-default-external-api-0\" (UID: \"269490ed-f426-4577-b60d-695881e29aac\") " pod="openstack/glance-default-external-api-0" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.917419 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqjjd\" (UniqueName: \"kubernetes.io/projected/269490ed-f426-4577-b60d-695881e29aac-kube-api-access-zqjjd\") pod \"glance-default-external-api-0\" (UID: \"269490ed-f426-4577-b60d-695881e29aac\") " pod="openstack/glance-default-external-api-0" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.917496 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/269490ed-f426-4577-b60d-695881e29aac-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"269490ed-f426-4577-b60d-695881e29aac\") " pod="openstack/glance-default-external-api-0" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.917763 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"269490ed-f426-4577-b60d-695881e29aac\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.918226 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/269490ed-f426-4577-b60d-695881e29aac-logs\") pod \"glance-default-external-api-0\" (UID: \"269490ed-f426-4577-b60d-695881e29aac\") " pod="openstack/glance-default-external-api-0" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.927229 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/269490ed-f426-4577-b60d-695881e29aac-config-data\") pod \"glance-default-external-api-0\" (UID: \"269490ed-f426-4577-b60d-695881e29aac\") " pod="openstack/glance-default-external-api-0" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.927693 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/269490ed-f426-4577-b60d-695881e29aac-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"269490ed-f426-4577-b60d-695881e29aac\") " pod="openstack/glance-default-external-api-0" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.931377 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/269490ed-f426-4577-b60d-695881e29aac-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"269490ed-f426-4577-b60d-695881e29aac\") " pod="openstack/glance-default-external-api-0" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.934541 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/269490ed-f426-4577-b60d-695881e29aac-scripts\") pod \"glance-default-external-api-0\" (UID: \"269490ed-f426-4577-b60d-695881e29aac\") " pod="openstack/glance-default-external-api-0" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.941317 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqjjd\" (UniqueName: \"kubernetes.io/projected/269490ed-f426-4577-b60d-695881e29aac-kube-api-access-zqjjd\") pod \"glance-default-external-api-0\" (UID: \"269490ed-f426-4577-b60d-695881e29aac\") " pod="openstack/glance-default-external-api-0" Dec 02 09:45:11 crc kubenswrapper[4781]: I1202 09:45:11.955487 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"269490ed-f426-4577-b60d-695881e29aac\") " pod="openstack/glance-default-external-api-0" Dec 02 09:45:12 crc kubenswrapper[4781]: I1202 09:45:12.002367 4781 scope.go:117] "RemoveContainer" containerID="f0edb10719d95ee9c62291583c48aaf88a403126d29381146b6d778cb697bb15" Dec 02 09:45:12 crc kubenswrapper[4781]: I1202 09:45:12.018577 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7lv5\" (UniqueName: \"kubernetes.io/projected/8a44acc8-6e3c-4643-9b69-34f54b53c188-kube-api-access-c7lv5\") pod \"glance-default-internal-api-0\" (UID: \"8a44acc8-6e3c-4643-9b69-34f54b53c188\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:45:12 crc kubenswrapper[4781]: I1202 09:45:12.018638 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a44acc8-6e3c-4643-9b69-34f54b53c188-logs\") pod \"glance-default-internal-api-0\" (UID: \"8a44acc8-6e3c-4643-9b69-34f54b53c188\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:45:12 crc kubenswrapper[4781]: I1202 09:45:12.018691 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a44acc8-6e3c-4643-9b69-34f54b53c188-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8a44acc8-6e3c-4643-9b69-34f54b53c188\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:45:12 crc kubenswrapper[4781]: I1202 09:45:12.018723 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"8a44acc8-6e3c-4643-9b69-34f54b53c188\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:45:12 crc kubenswrapper[4781]: I1202 09:45:12.018744 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a44acc8-6e3c-4643-9b69-34f54b53c188-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8a44acc8-6e3c-4643-9b69-34f54b53c188\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:45:12 crc kubenswrapper[4781]: I1202 09:45:12.018786 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a44acc8-6e3c-4643-9b69-34f54b53c188-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8a44acc8-6e3c-4643-9b69-34f54b53c188\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:45:12 crc kubenswrapper[4781]: I1202 09:45:12.018821 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a44acc8-6e3c-4643-9b69-34f54b53c188-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8a44acc8-6e3c-4643-9b69-34f54b53c188\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:45:12 crc kubenswrapper[4781]: I1202 09:45:12.018841 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a44acc8-6e3c-4643-9b69-34f54b53c188-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8a44acc8-6e3c-4643-9b69-34f54b53c188\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:45:12 crc kubenswrapper[4781]: I1202 09:45:12.019463 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"8a44acc8-6e3c-4643-9b69-34f54b53c188\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Dec 02 09:45:12 crc kubenswrapper[4781]: I1202 09:45:12.019796 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a44acc8-6e3c-4643-9b69-34f54b53c188-logs\") pod \"glance-default-internal-api-0\" (UID: \"8a44acc8-6e3c-4643-9b69-34f54b53c188\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:45:12 crc kubenswrapper[4781]: I1202 09:45:12.020621 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a44acc8-6e3c-4643-9b69-34f54b53c188-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8a44acc8-6e3c-4643-9b69-34f54b53c188\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:45:12 crc kubenswrapper[4781]: I1202 09:45:12.023631 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a44acc8-6e3c-4643-9b69-34f54b53c188-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8a44acc8-6e3c-4643-9b69-34f54b53c188\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:45:12 crc kubenswrapper[4781]: I1202 09:45:12.024198 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a44acc8-6e3c-4643-9b69-34f54b53c188-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8a44acc8-6e3c-4643-9b69-34f54b53c188\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:45:12 crc kubenswrapper[4781]: I1202 09:45:12.024872 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a44acc8-6e3c-4643-9b69-34f54b53c188-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8a44acc8-6e3c-4643-9b69-34f54b53c188\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:45:12 crc kubenswrapper[4781]: I1202 09:45:12.026242 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/8a44acc8-6e3c-4643-9b69-34f54b53c188-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8a44acc8-6e3c-4643-9b69-34f54b53c188\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:45:12 crc kubenswrapper[4781]: I1202 09:45:12.038355 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7lv5\" (UniqueName: \"kubernetes.io/projected/8a44acc8-6e3c-4643-9b69-34f54b53c188-kube-api-access-c7lv5\") pod \"glance-default-internal-api-0\" (UID: \"8a44acc8-6e3c-4643-9b69-34f54b53c188\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:45:12 crc kubenswrapper[4781]: I1202 09:45:12.047152 4781 scope.go:117] "RemoveContainer" containerID="9a77cad9482f507ac66337d55bc6129c12301e8c5a561bba433cf1893855cfb9" Dec 02 09:45:12 crc kubenswrapper[4781]: I1202 09:45:12.051708 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 09:45:12 crc kubenswrapper[4781]: I1202 09:45:12.068958 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"8a44acc8-6e3c-4643-9b69-34f54b53c188\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:45:12 crc kubenswrapper[4781]: I1202 09:45:12.108882 4781 scope.go:117] "RemoveContainer" containerID="32e0effa1f7e2a896eed743c86a8d58a568201bf1d4a082b7b231c98380bdf94" Dec 02 09:45:12 crc kubenswrapper[4781]: I1202 09:45:12.137874 4781 scope.go:117] "RemoveContainer" containerID="5c14a0948ae197de8effd63c9e112116d56984890a8ece0da5345e5c043a9d0b" Dec 02 09:45:12 crc kubenswrapper[4781]: I1202 09:45:12.349950 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 09:45:12 crc kubenswrapper[4781]: I1202 09:45:12.648979 4781 generic.go:334] "Generic (PLEG): container finished" podID="c98b4bc6-c086-4663-b794-92a36b0da2ba" containerID="0835ce989dbc0e513f8ecff2e64616304cb06d329a9b73a5d2ae1312c75e2ac7" exitCode=2 Dec 02 09:45:12 crc kubenswrapper[4781]: I1202 09:45:12.649289 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c98b4bc6-c086-4663-b794-92a36b0da2ba","Type":"ContainerDied","Data":"0835ce989dbc0e513f8ecff2e64616304cb06d329a9b73a5d2ae1312c75e2ac7"} Dec 02 09:45:12 crc kubenswrapper[4781]: I1202 09:45:12.652105 4781 generic.go:334] "Generic (PLEG): container finished" podID="bfee3ebe-80f4-4ce5-9168-12d5fdaa65b3" containerID="d3c616324f0e8c79eea62ce3082674854684052bf6b68557a0480855e240a9ed" exitCode=0 Dec 02 09:45:12 crc kubenswrapper[4781]: I1202 09:45:12.652191 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411145-vkbqb" event={"ID":"bfee3ebe-80f4-4ce5-9168-12d5fdaa65b3","Type":"ContainerDied","Data":"d3c616324f0e8c79eea62ce3082674854684052bf6b68557a0480855e240a9ed"} Dec 02 09:45:12 crc kubenswrapper[4781]: I1202 09:45:12.652271 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411145-vkbqb" event={"ID":"bfee3ebe-80f4-4ce5-9168-12d5fdaa65b3","Type":"ContainerStarted","Data":"b145734078e75e2ec314f75972d5b6aab9e46a89511fc7b8f97f2beea65a81b7"} Dec 02 09:45:12 crc kubenswrapper[4781]: I1202 09:45:12.674973 4781 generic.go:334] "Generic (PLEG): container finished" podID="27107564-9a0b-4a63-894d-4e0960abd435" containerID="944d592c4f3b5b0e74cd4f7ba1b76e8fb09d21d3fca2599dfb9aa627c26da805" exitCode=0 Dec 02 09:45:12 crc kubenswrapper[4781]: I1202 09:45:12.675520 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hqlbj" event={"ID":"27107564-9a0b-4a63-894d-4e0960abd435","Type":"ContainerDied","Data":"944d592c4f3b5b0e74cd4f7ba1b76e8fb09d21d3fca2599dfb9aa627c26da805"} Dec 02 09:45:12 crc kubenswrapper[4781]: I1202 09:45:12.675578 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hqlbj" event={"ID":"27107564-9a0b-4a63-894d-4e0960abd435","Type":"ContainerStarted","Data":"d0045191e9cda04f8a5348368d03b33c9219e972a56d6c7bb22939830eaee14f"} Dec 02 09:45:12 crc kubenswrapper[4781]: I1202 09:45:12.683839 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-br5j5" event={"ID":"51f721e2-e8fd-4ae1-89d9-fe7272e8246e","Type":"ContainerStarted","Data":"a07f2009cf58fc528ef5769303b895ed34faf022ea7551f54cb14cce5a190499"} Dec 02 09:45:12 crc kubenswrapper[4781]: I1202 09:45:12.688507 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 09:45:12 crc kubenswrapper[4781]: I1202 09:45:12.728441 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-br5j5" podStartSLOduration=3.454866619 podStartE2EDuration="1m28.728420497s" podCreationTimestamp="2025-12-02 09:43:44 +0000 UTC" firstStartedPulling="2025-12-02 09:43:46.548392578 +0000 UTC m=+1389.372266457" lastFinishedPulling="2025-12-02 09:45:11.821946436 +0000 UTC m=+1474.645820335" observedRunningTime="2025-12-02 09:45:12.723082505 +0000 UTC m=+1475.546956394" watchObservedRunningTime="2025-12-02 09:45:12.728420497 +0000 UTC 
m=+1475.552294376" Dec 02 09:45:12 crc kubenswrapper[4781]: I1202 09:45:12.905626 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 09:45:13 crc kubenswrapper[4781]: I1202 09:45:13.519513 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2be1f200-d68d-47e5-889a-fea41af06a96" path="/var/lib/kubelet/pods/2be1f200-d68d-47e5-889a-fea41af06a96/volumes" Dec 02 09:45:13 crc kubenswrapper[4781]: I1202 09:45:13.525309 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3fc9b2b-26bf-4f7d-b052-c976f8584c43" path="/var/lib/kubelet/pods/a3fc9b2b-26bf-4f7d-b052-c976f8584c43/volumes" Dec 02 09:45:13 crc kubenswrapper[4781]: I1202 09:45:13.526387 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b81fe2e7-695a-465c-b06f-aec5bd8f5a4c" path="/var/lib/kubelet/pods/b81fe2e7-695a-465c-b06f-aec5bd8f5a4c/volumes" Dec 02 09:45:13 crc kubenswrapper[4781]: I1202 09:45:13.727702 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hqlbj" event={"ID":"27107564-9a0b-4a63-894d-4e0960abd435","Type":"ContainerStarted","Data":"cc118d6e0785fb3227a6994d0a9f17b361442a9718d55ec98de1715b16fb25bc"} Dec 02 09:45:13 crc kubenswrapper[4781]: I1202 09:45:13.731141 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"269490ed-f426-4577-b60d-695881e29aac","Type":"ContainerStarted","Data":"e0960176bdda8e2ba82d90641ba566c3b6c5c7a6ec61704be70fbd7a1daf1d81"} Dec 02 09:45:13 crc kubenswrapper[4781]: I1202 09:45:13.731234 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"269490ed-f426-4577-b60d-695881e29aac","Type":"ContainerStarted","Data":"90a5675073e712ecefae3610b578c1e9effbdc04122f19049db08c76066ec5bb"} Dec 02 09:45:13 crc kubenswrapper[4781]: I1202 09:45:13.732819 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8a44acc8-6e3c-4643-9b69-34f54b53c188","Type":"ContainerStarted","Data":"2626694bb2556bccbe5290d2d9b12d69989c1d612b985be1232da4748cb26470"} Dec 02 09:45:13 crc kubenswrapper[4781]: I1202 09:45:13.732872 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8a44acc8-6e3c-4643-9b69-34f54b53c188","Type":"ContainerStarted","Data":"3baf14ffbdb20ebdb7786666412a2b63c3a0fb9260c5bf2f8181549180613ef8"} Dec 02 09:45:14 crc kubenswrapper[4781]: I1202 09:45:14.191688 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411145-vkbqb" Dec 02 09:45:14 crc kubenswrapper[4781]: I1202 09:45:14.266962 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nx89t\" (UniqueName: \"kubernetes.io/projected/bfee3ebe-80f4-4ce5-9168-12d5fdaa65b3-kube-api-access-nx89t\") pod \"bfee3ebe-80f4-4ce5-9168-12d5fdaa65b3\" (UID: \"bfee3ebe-80f4-4ce5-9168-12d5fdaa65b3\") " Dec 02 09:45:14 crc kubenswrapper[4781]: I1202 09:45:14.267128 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bfee3ebe-80f4-4ce5-9168-12d5fdaa65b3-secret-volume\") pod \"bfee3ebe-80f4-4ce5-9168-12d5fdaa65b3\" (UID: \"bfee3ebe-80f4-4ce5-9168-12d5fdaa65b3\") " Dec 02 09:45:14 crc kubenswrapper[4781]: I1202 09:45:14.267194 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bfee3ebe-80f4-4ce5-9168-12d5fdaa65b3-config-volume\") pod \"bfee3ebe-80f4-4ce5-9168-12d5fdaa65b3\" (UID: \"bfee3ebe-80f4-4ce5-9168-12d5fdaa65b3\") " Dec 02 09:45:14 crc kubenswrapper[4781]: I1202 09:45:14.268243 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfee3ebe-80f4-4ce5-9168-12d5fdaa65b3-config-volume" (OuterVolumeSpecName: "config-volume") pod "bfee3ebe-80f4-4ce5-9168-12d5fdaa65b3" (UID: "bfee3ebe-80f4-4ce5-9168-12d5fdaa65b3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:45:14 crc kubenswrapper[4781]: I1202 09:45:14.272124 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfee3ebe-80f4-4ce5-9168-12d5fdaa65b3-kube-api-access-nx89t" (OuterVolumeSpecName: "kube-api-access-nx89t") pod "bfee3ebe-80f4-4ce5-9168-12d5fdaa65b3" (UID: "bfee3ebe-80f4-4ce5-9168-12d5fdaa65b3"). InnerVolumeSpecName "kube-api-access-nx89t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:45:14 crc kubenswrapper[4781]: I1202 09:45:14.276487 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfee3ebe-80f4-4ce5-9168-12d5fdaa65b3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "bfee3ebe-80f4-4ce5-9168-12d5fdaa65b3" (UID: "bfee3ebe-80f4-4ce5-9168-12d5fdaa65b3"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:45:14 crc kubenswrapper[4781]: I1202 09:45:14.370994 4781 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bfee3ebe-80f4-4ce5-9168-12d5fdaa65b3-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:14 crc kubenswrapper[4781]: I1202 09:45:14.371189 4781 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bfee3ebe-80f4-4ce5-9168-12d5fdaa65b3-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:14 crc kubenswrapper[4781]: I1202 09:45:14.371253 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nx89t\" (UniqueName: \"kubernetes.io/projected/bfee3ebe-80f4-4ce5-9168-12d5fdaa65b3-kube-api-access-nx89t\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:14 crc kubenswrapper[4781]: I1202 09:45:14.740964 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411145-vkbqb" Dec 02 09:45:14 crc kubenswrapper[4781]: I1202 09:45:14.740965 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411145-vkbqb" event={"ID":"bfee3ebe-80f4-4ce5-9168-12d5fdaa65b3","Type":"ContainerDied","Data":"b145734078e75e2ec314f75972d5b6aab9e46a89511fc7b8f97f2beea65a81b7"} Dec 02 09:45:14 crc kubenswrapper[4781]: I1202 09:45:14.741510 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b145734078e75e2ec314f75972d5b6aab9e46a89511fc7b8f97f2beea65a81b7" Dec 02 09:45:14 crc kubenswrapper[4781]: I1202 09:45:14.742968 4781 generic.go:334] "Generic (PLEG): container finished" podID="27107564-9a0b-4a63-894d-4e0960abd435" containerID="cc118d6e0785fb3227a6994d0a9f17b361442a9718d55ec98de1715b16fb25bc" exitCode=0 Dec 02 09:45:14 crc kubenswrapper[4781]: I1202 09:45:14.743087 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hqlbj" event={"ID":"27107564-9a0b-4a63-894d-4e0960abd435","Type":"ContainerDied","Data":"cc118d6e0785fb3227a6994d0a9f17b361442a9718d55ec98de1715b16fb25bc"} Dec 02 09:45:14 crc kubenswrapper[4781]: I1202 09:45:14.745490 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"269490ed-f426-4577-b60d-695881e29aac","Type":"ContainerStarted","Data":"a974bce8f6f1af663c342580c805ddbaa6097b40912021df63435e438216c135"} Dec 02 09:45:14 crc kubenswrapper[4781]: I1202 09:45:14.803065 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.803046809 podStartE2EDuration="3.803046809s" podCreationTimestamp="2025-12-02 09:45:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:45:14.793505716 +0000 UTC m=+1477.617379585" watchObservedRunningTime="2025-12-02 09:45:14.803046809 +0000 UTC m=+1477.626920688" Dec 02 09:45:15 crc kubenswrapper[4781]: I1202 09:45:15.755425 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8a44acc8-6e3c-4643-9b69-34f54b53c188","Type":"ContainerStarted","Data":"3ccee59f51f4b55106b87ba1b23d1a23f762a2b56edb990a2065c1a6fc1bcfdb"} Dec 02 09:45:16 crc kubenswrapper[4781]: I1202 09:45:16.103120 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-67cff9c6-hg2l6" Dec 02 09:45:16 crc kubenswrapper[4781]: I1202 09:45:16.103370 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6856678494-4cprv" Dec 02 09:45:16 crc kubenswrapper[4781]: I1202 09:45:16.786707 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.786686911 podStartE2EDuration="5.786686911s" podCreationTimestamp="2025-12-02 09:45:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:45:16.778524884 +0000 UTC m=+1479.602398783" watchObservedRunningTime="2025-12-02 09:45:16.786686911 +0000 UTC m=+1479.610560790" Dec 02 09:45:17 crc kubenswrapper[4781]: I1202 09:45:17.772454 4781 generic.go:334] "Generic (PLEG): container finished" podID="c98b4bc6-c086-4663-b794-92a36b0da2ba" 
containerID="d23653b0777bfb8823014de1063f3473f00585ce26d88d96ba403826a2ce3ec5" exitCode=0 Dec 02 09:45:17 crc kubenswrapper[4781]: I1202 09:45:17.772502 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c98b4bc6-c086-4663-b794-92a36b0da2ba","Type":"ContainerDied","Data":"d23653b0777bfb8823014de1063f3473f00585ce26d88d96ba403826a2ce3ec5"} Dec 02 09:45:17 crc kubenswrapper[4781]: I1202 09:45:17.867943 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6856678494-4cprv" Dec 02 09:45:17 crc kubenswrapper[4781]: I1202 09:45:17.932846 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-67cff9c6-hg2l6"] Dec 02 09:45:17 crc kubenswrapper[4781]: I1202 09:45:17.934756 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-67cff9c6-hg2l6" podUID="7dcfdf79-19b4-4ea4-9b16-67d79aa7165e" containerName="horizon-log" containerID="cri-o://48b12156148830c711ba4ea9c2a408455c26ae2b9baec0f6f03c6f3d2a1699e3" gracePeriod=30 Dec 02 09:45:17 crc kubenswrapper[4781]: I1202 09:45:17.935199 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-67cff9c6-hg2l6" podUID="7dcfdf79-19b4-4ea4-9b16-67d79aa7165e" containerName="horizon" containerID="cri-o://e4f8d80de8d7fdd7aed53bd0df8818adc65f193e5a52808c1a72181a16906549" gracePeriod=30 Dec 02 09:45:17 crc kubenswrapper[4781]: I1202 09:45:17.941771 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-67cff9c6-hg2l6" podUID="7dcfdf79-19b4-4ea4-9b16-67d79aa7165e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.142:8443/dashboard/auth/login/?next=/dashboard/\": EOF" Dec 02 09:45:18 crc kubenswrapper[4781]: I1202 09:45:18.455679 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-cbfc4ddfb-kljg5" Dec 02 09:45:18 crc kubenswrapper[4781]: I1202 09:45:18.556142 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 09:45:18 crc kubenswrapper[4781]: I1202 09:45:18.651975 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c98b4bc6-c086-4663-b794-92a36b0da2ba-combined-ca-bundle\") pod \"c98b4bc6-c086-4663-b794-92a36b0da2ba\" (UID: \"c98b4bc6-c086-4663-b794-92a36b0da2ba\") " Dec 02 09:45:18 crc kubenswrapper[4781]: I1202 09:45:18.652051 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c98b4bc6-c086-4663-b794-92a36b0da2ba-scripts\") pod \"c98b4bc6-c086-4663-b794-92a36b0da2ba\" (UID: \"c98b4bc6-c086-4663-b794-92a36b0da2ba\") " Dec 02 09:45:18 crc kubenswrapper[4781]: I1202 09:45:18.652093 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c98b4bc6-c086-4663-b794-92a36b0da2ba-sg-core-conf-yaml\") pod \"c98b4bc6-c086-4663-b794-92a36b0da2ba\" (UID: \"c98b4bc6-c086-4663-b794-92a36b0da2ba\") " Dec 02 09:45:18 crc kubenswrapper[4781]: I1202 09:45:18.652122 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c98b4bc6-c086-4663-b794-92a36b0da2ba-run-httpd\") pod \"c98b4bc6-c086-4663-b794-92a36b0da2ba\" (UID: \"c98b4bc6-c086-4663-b794-92a36b0da2ba\") " Dec 02 09:45:18 crc kubenswrapper[4781]: I1202 09:45:18.652164 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c98b4bc6-c086-4663-b794-92a36b0da2ba-config-data\") pod \"c98b4bc6-c086-4663-b794-92a36b0da2ba\" (UID: \"c98b4bc6-c086-4663-b794-92a36b0da2ba\") " Dec 02 09:45:18 crc kubenswrapper[4781]: I1202 09:45:18.652187 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8tv4\" (UniqueName: \"kubernetes.io/projected/c98b4bc6-c086-4663-b794-92a36b0da2ba-kube-api-access-f8tv4\") pod \"c98b4bc6-c086-4663-b794-92a36b0da2ba\" (UID: \"c98b4bc6-c086-4663-b794-92a36b0da2ba\") " Dec 02 09:45:18 crc kubenswrapper[4781]: I1202 09:45:18.652220 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c98b4bc6-c086-4663-b794-92a36b0da2ba-log-httpd\") pod \"c98b4bc6-c086-4663-b794-92a36b0da2ba\" (UID: \"c98b4bc6-c086-4663-b794-92a36b0da2ba\") " Dec 02 09:45:18 crc kubenswrapper[4781]: I1202 09:45:18.652674 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c98b4bc6-c086-4663-b794-92a36b0da2ba-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c98b4bc6-c086-4663-b794-92a36b0da2ba" (UID: "c98b4bc6-c086-4663-b794-92a36b0da2ba"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:45:18 crc kubenswrapper[4781]: I1202 09:45:18.653282 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c98b4bc6-c086-4663-b794-92a36b0da2ba-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c98b4bc6-c086-4663-b794-92a36b0da2ba" (UID: "c98b4bc6-c086-4663-b794-92a36b0da2ba"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:45:18 crc kubenswrapper[4781]: I1202 09:45:18.676785 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c98b4bc6-c086-4663-b794-92a36b0da2ba-kube-api-access-f8tv4" (OuterVolumeSpecName: "kube-api-access-f8tv4") pod "c98b4bc6-c086-4663-b794-92a36b0da2ba" (UID: "c98b4bc6-c086-4663-b794-92a36b0da2ba"). InnerVolumeSpecName "kube-api-access-f8tv4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:45:18 crc kubenswrapper[4781]: I1202 09:45:18.679630 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c98b4bc6-c086-4663-b794-92a36b0da2ba-scripts" (OuterVolumeSpecName: "scripts") pod "c98b4bc6-c086-4663-b794-92a36b0da2ba" (UID: "c98b4bc6-c086-4663-b794-92a36b0da2ba"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:45:18 crc kubenswrapper[4781]: I1202 09:45:18.684107 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c98b4bc6-c086-4663-b794-92a36b0da2ba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c98b4bc6-c086-4663-b794-92a36b0da2ba" (UID: "c98b4bc6-c086-4663-b794-92a36b0da2ba"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:45:18 crc kubenswrapper[4781]: I1202 09:45:18.684560 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c98b4bc6-c086-4663-b794-92a36b0da2ba-config-data" (OuterVolumeSpecName: "config-data") pod "c98b4bc6-c086-4663-b794-92a36b0da2ba" (UID: "c98b4bc6-c086-4663-b794-92a36b0da2ba"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:45:18 crc kubenswrapper[4781]: I1202 09:45:18.696274 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c98b4bc6-c086-4663-b794-92a36b0da2ba-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c98b4bc6-c086-4663-b794-92a36b0da2ba" (UID: "c98b4bc6-c086-4663-b794-92a36b0da2ba"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:45:18 crc kubenswrapper[4781]: I1202 09:45:18.755117 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c98b4bc6-c086-4663-b794-92a36b0da2ba-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:18 crc kubenswrapper[4781]: I1202 09:45:18.755162 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8tv4\" (UniqueName: \"kubernetes.io/projected/c98b4bc6-c086-4663-b794-92a36b0da2ba-kube-api-access-f8tv4\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:18 crc kubenswrapper[4781]: I1202 09:45:18.755175 4781 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c98b4bc6-c086-4663-b794-92a36b0da2ba-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:18 crc kubenswrapper[4781]: I1202 09:45:18.755183 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c98b4bc6-c086-4663-b794-92a36b0da2ba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:18 crc kubenswrapper[4781]: I1202 09:45:18.755191 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c98b4bc6-c086-4663-b794-92a36b0da2ba-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:18 crc kubenswrapper[4781]: I1202 09:45:18.755200 4781 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c98b4bc6-c086-4663-b794-92a36b0da2ba-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:18 crc kubenswrapper[4781]: I1202 09:45:18.755211 4781 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c98b4bc6-c086-4663-b794-92a36b0da2ba-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:18 crc kubenswrapper[4781]: I1202 09:45:18.786782 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c98b4bc6-c086-4663-b794-92a36b0da2ba","Type":"ContainerDied","Data":"44e12cdd3a8c0000cc4a26e70745048b4b823c0c4c8169ad69d09a356dc909a9"} Dec 02 09:45:18 crc kubenswrapper[4781]: I1202 09:45:18.786846 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 09:45:18 crc kubenswrapper[4781]: I1202 09:45:18.786847 4781 scope.go:117] "RemoveContainer" containerID="0835ce989dbc0e513f8ecff2e64616304cb06d329a9b73a5d2ae1312c75e2ac7" Dec 02 09:45:18 crc kubenswrapper[4781]: I1202 09:45:18.881106 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 09:45:18 crc kubenswrapper[4781]: I1202 09:45:18.891237 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 09:45:18 crc kubenswrapper[4781]: I1202 09:45:18.902780 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 09:45:18 crc kubenswrapper[4781]: E1202 09:45:18.903429 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c98b4bc6-c086-4663-b794-92a36b0da2ba" containerName="ceilometer-notification-agent" Dec 02 09:45:18 crc kubenswrapper[4781]: I1202 09:45:18.903455 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c98b4bc6-c086-4663-b794-92a36b0da2ba" containerName="ceilometer-notification-agent" Dec 02 09:45:18 crc kubenswrapper[4781]: E1202 09:45:18.903488 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c98b4bc6-c086-4663-b794-92a36b0da2ba" containerName="sg-core" Dec 02 09:45:18 crc kubenswrapper[4781]: I1202 09:45:18.903498 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c98b4bc6-c086-4663-b794-92a36b0da2ba" containerName="sg-core" Dec 02 09:45:18 crc kubenswrapper[4781]: E1202 09:45:18.903517 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfee3ebe-80f4-4ce5-9168-12d5fdaa65b3" containerName="collect-profiles" Dec 02 09:45:18 crc kubenswrapper[4781]: I1202 09:45:18.903526 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfee3ebe-80f4-4ce5-9168-12d5fdaa65b3" containerName="collect-profiles" Dec 02 09:45:18 crc kubenswrapper[4781]: I1202 09:45:18.903737 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="c98b4bc6-c086-4663-b794-92a36b0da2ba" containerName="ceilometer-notification-agent" Dec 02 09:45:18 crc kubenswrapper[4781]: I1202 09:45:18.903757 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfee3ebe-80f4-4ce5-9168-12d5fdaa65b3" containerName="collect-profiles" Dec 02 09:45:18 crc kubenswrapper[4781]: I1202 09:45:18.903775 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="c98b4bc6-c086-4663-b794-92a36b0da2ba" containerName="sg-core" Dec 02 09:45:18 crc kubenswrapper[4781]: I1202 09:45:18.905837 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 09:45:18 crc kubenswrapper[4781]: I1202 09:45:18.907691 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 09:45:18 crc kubenswrapper[4781]: I1202 09:45:18.909815 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 09:45:18 crc kubenswrapper[4781]: I1202 09:45:18.913587 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 09:45:18 crc kubenswrapper[4781]: I1202 09:45:18.939724 4781 scope.go:117] "RemoveContainer" containerID="d23653b0777bfb8823014de1063f3473f00585ce26d88d96ba403826a2ce3ec5" Dec 02 09:45:18 crc kubenswrapper[4781]: I1202 09:45:18.961391 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1708d43-f8b6-491b-a365-a8542137dd44-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e1708d43-f8b6-491b-a365-a8542137dd44\") " pod="openstack/ceilometer-0" Dec 02 09:45:18 crc kubenswrapper[4781]: I1202 09:45:18.961451 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfwzk\" (UniqueName: \"kubernetes.io/projected/e1708d43-f8b6-491b-a365-a8542137dd44-kube-api-access-bfwzk\") pod \"ceilometer-0\" (UID: \"e1708d43-f8b6-491b-a365-a8542137dd44\") " pod="openstack/ceilometer-0" Dec 02 09:45:18 crc kubenswrapper[4781]: I1202 09:45:18.961534 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e1708d43-f8b6-491b-a365-a8542137dd44-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e1708d43-f8b6-491b-a365-a8542137dd44\") " pod="openstack/ceilometer-0" Dec 02 09:45:18 crc kubenswrapper[4781]: I1202 09:45:18.961641 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1708d43-f8b6-491b-a365-a8542137dd44-config-data\") pod \"ceilometer-0\" (UID: \"e1708d43-f8b6-491b-a365-a8542137dd44\") " pod="openstack/ceilometer-0" Dec 02 09:45:18 crc kubenswrapper[4781]: I1202 09:45:18.961734 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1708d43-f8b6-491b-a365-a8542137dd44-log-httpd\") pod \"ceilometer-0\" (UID: \"e1708d43-f8b6-491b-a365-a8542137dd44\") " pod="openstack/ceilometer-0" Dec 02 09:45:18 crc kubenswrapper[4781]: I1202 09:45:18.961795 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1708d43-f8b6-491b-a365-a8542137dd44-run-httpd\") pod \"ceilometer-0\" (UID: \"e1708d43-f8b6-491b-a365-a8542137dd44\") " pod="openstack/ceilometer-0" Dec 02 09:45:18 crc kubenswrapper[4781]: I1202 09:45:18.961812 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1708d43-f8b6-491b-a365-a8542137dd44-scripts\") pod \"ceilometer-0\" (UID: \"e1708d43-f8b6-491b-a365-a8542137dd44\") " pod="openstack/ceilometer-0" Dec 02 09:45:19 crc kubenswrapper[4781]: I1202 09:45:19.063042 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/e1708d43-f8b6-491b-a365-a8542137dd44-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e1708d43-f8b6-491b-a365-a8542137dd44\") " pod="openstack/ceilometer-0" Dec 02 09:45:19 crc kubenswrapper[4781]: I1202 09:45:19.063088 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1708d43-f8b6-491b-a365-a8542137dd44-config-data\") pod \"ceilometer-0\" (UID: \"e1708d43-f8b6-491b-a365-a8542137dd44\") " pod="openstack/ceilometer-0" Dec 02 09:45:19 crc kubenswrapper[4781]: I1202 09:45:19.063119 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1708d43-f8b6-491b-a365-a8542137dd44-log-httpd\") pod \"ceilometer-0\" (UID: \"e1708d43-f8b6-491b-a365-a8542137dd44\") " pod="openstack/ceilometer-0" Dec 02 09:45:19 crc kubenswrapper[4781]: I1202 09:45:19.063144 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1708d43-f8b6-491b-a365-a8542137dd44-run-httpd\") pod \"ceilometer-0\" (UID: \"e1708d43-f8b6-491b-a365-a8542137dd44\") " pod="openstack/ceilometer-0" Dec 02 09:45:19 crc kubenswrapper[4781]: I1202 09:45:19.063162 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1708d43-f8b6-491b-a365-a8542137dd44-scripts\") pod \"ceilometer-0\" (UID: \"e1708d43-f8b6-491b-a365-a8542137dd44\") " pod="openstack/ceilometer-0" Dec 02 09:45:19 crc kubenswrapper[4781]: I1202 09:45:19.063203 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1708d43-f8b6-491b-a365-a8542137dd44-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e1708d43-f8b6-491b-a365-a8542137dd44\") " pod="openstack/ceilometer-0" Dec 02 09:45:19 crc kubenswrapper[4781]: I1202 09:45:19.063230 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfwzk\" (UniqueName: \"kubernetes.io/projected/e1708d43-f8b6-491b-a365-a8542137dd44-kube-api-access-bfwzk\") pod \"ceilometer-0\" (UID: \"e1708d43-f8b6-491b-a365-a8542137dd44\") " pod="openstack/ceilometer-0" Dec 02 09:45:19 crc kubenswrapper[4781]: I1202 09:45:19.065322 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1708d43-f8b6-491b-a365-a8542137dd44-run-httpd\") pod \"ceilometer-0\" (UID: \"e1708d43-f8b6-491b-a365-a8542137dd44\") " pod="openstack/ceilometer-0" Dec 02 09:45:19 crc kubenswrapper[4781]: I1202 09:45:19.066290 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1708d43-f8b6-491b-a365-a8542137dd44-log-httpd\") pod \"ceilometer-0\" (UID: \"e1708d43-f8b6-491b-a365-a8542137dd44\") " pod="openstack/ceilometer-0" Dec 02 09:45:19 crc kubenswrapper[4781]: I1202 09:45:19.069485 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1708d43-f8b6-491b-a365-a8542137dd44-scripts\") pod \"ceilometer-0\" (UID: \"e1708d43-f8b6-491b-a365-a8542137dd44\") " pod="openstack/ceilometer-0" Dec 02 09:45:19 crc kubenswrapper[4781]: I1202 09:45:19.071731 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e1708d43-f8b6-491b-a365-a8542137dd44-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e1708d43-f8b6-491b-a365-a8542137dd44\") " pod="openstack/ceilometer-0" Dec 02 09:45:19 crc kubenswrapper[4781]: I1202 09:45:19.072672 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1708d43-f8b6-491b-a365-a8542137dd44-config-data\") pod \"ceilometer-0\" (UID: \"e1708d43-f8b6-491b-a365-a8542137dd44\") " pod="openstack/ceilometer-0" Dec 02 09:45:19 crc kubenswrapper[4781]: I1202 09:45:19.078252 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e1708d43-f8b6-491b-a365-a8542137dd44-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e1708d43-f8b6-491b-a365-a8542137dd44\") " pod="openstack/ceilometer-0" Dec 02 09:45:19 crc kubenswrapper[4781]: I1202 09:45:19.082560 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfwzk\" (UniqueName: \"kubernetes.io/projected/e1708d43-f8b6-491b-a365-a8542137dd44-kube-api-access-bfwzk\") pod \"ceilometer-0\" (UID: \"e1708d43-f8b6-491b-a365-a8542137dd44\") " pod="openstack/ceilometer-0" Dec 02 09:45:19 crc kubenswrapper[4781]: I1202 09:45:19.226218 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 09:45:19 crc kubenswrapper[4781]: I1202 09:45:19.513976 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c98b4bc6-c086-4663-b794-92a36b0da2ba" path="/var/lib/kubelet/pods/c98b4bc6-c086-4663-b794-92a36b0da2ba/volumes" Dec 02 09:45:19 crc kubenswrapper[4781]: I1202 09:45:19.789519 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 09:45:19 crc kubenswrapper[4781]: I1202 09:45:19.810164 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hqlbj" event={"ID":"27107564-9a0b-4a63-894d-4e0960abd435","Type":"ContainerStarted","Data":"218a370cd990992fc6c3374bc43b115828b635373940f9fb482c34d673ba42f9"} Dec 02 09:45:20 crc kubenswrapper[4781]: I1202 09:45:20.063075 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 02 09:45:20 crc kubenswrapper[4781]: I1202 09:45:20.064462 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 02 09:45:20 crc kubenswrapper[4781]: I1202 09:45:20.066429 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 02 09:45:20 crc kubenswrapper[4781]: I1202 09:45:20.066543 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-pmttz" Dec 02 09:45:20 crc kubenswrapper[4781]: I1202 09:45:20.066626 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 02 09:45:20 crc kubenswrapper[4781]: I1202 09:45:20.072977 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 02 09:45:20 crc kubenswrapper[4781]: I1202 09:45:20.180609 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4f2c\" (UniqueName: \"kubernetes.io/projected/7a424309-12e4-42f9-ba35-d61f1f6c7b44-kube-api-access-t4f2c\") pod \"openstackclient\" (UID: \"7a424309-12e4-42f9-ba35-d61f1f6c7b44\") " pod="openstack/openstackclient" Dec 02 09:45:20 crc kubenswrapper[4781]: I1202 09:45:20.180700 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7a424309-12e4-42f9-ba35-d61f1f6c7b44-openstack-config-secret\") pod \"openstackclient\" (UID: \"7a424309-12e4-42f9-ba35-d61f1f6c7b44\") " pod="openstack/openstackclient" Dec 02 09:45:20 crc kubenswrapper[4781]: I1202 09:45:20.180761 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a424309-12e4-42f9-ba35-d61f1f6c7b44-combined-ca-bundle\") pod \"openstackclient\" (UID: \"7a424309-12e4-42f9-ba35-d61f1f6c7b44\") " pod="openstack/openstackclient" Dec 02 09:45:20 crc kubenswrapper[4781]: I1202 09:45:20.180904 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7a424309-12e4-42f9-ba35-d61f1f6c7b44-openstack-config\") pod \"openstackclient\" (UID: \"7a424309-12e4-42f9-ba35-d61f1f6c7b44\") " pod="openstack/openstackclient" Dec 02 09:45:20 crc kubenswrapper[4781]: I1202 09:45:20.282267 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7a424309-12e4-42f9-ba35-d61f1f6c7b44-openstack-config\") pod \"openstackclient\" (UID: \"7a424309-12e4-42f9-ba35-d61f1f6c7b44\") " pod="openstack/openstackclient" Dec 02 09:45:20 crc kubenswrapper[4781]: I1202 09:45:20.282412 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4f2c\" (UniqueName: \"kubernetes.io/projected/7a424309-12e4-42f9-ba35-d61f1f6c7b44-kube-api-access-t4f2c\") pod \"openstackclient\" (UID: \"7a424309-12e4-42f9-ba35-d61f1f6c7b44\") " pod="openstack/openstackclient" Dec 02 09:45:20 crc kubenswrapper[4781]: I1202 09:45:20.282479 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7a424309-12e4-42f9-ba35-d61f1f6c7b44-openstack-config-secret\") pod \"openstackclient\" (UID: \"7a424309-12e4-42f9-ba35-d61f1f6c7b44\") " pod="openstack/openstackclient" Dec 02 09:45:20 crc kubenswrapper[4781]: I1202 09:45:20.282503 4781 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a424309-12e4-42f9-ba35-d61f1f6c7b44-combined-ca-bundle\") pod \"openstackclient\" (UID: \"7a424309-12e4-42f9-ba35-d61f1f6c7b44\") " pod="openstack/openstackclient" Dec 02 09:45:20 crc kubenswrapper[4781]: I1202 09:45:20.283529 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7a424309-12e4-42f9-ba35-d61f1f6c7b44-openstack-config\") pod \"openstackclient\" (UID: \"7a424309-12e4-42f9-ba35-d61f1f6c7b44\") " pod="openstack/openstackclient" Dec 02 09:45:20 crc kubenswrapper[4781]: I1202 09:45:20.289454 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a424309-12e4-42f9-ba35-d61f1f6c7b44-combined-ca-bundle\") pod \"openstackclient\" (UID: \"7a424309-12e4-42f9-ba35-d61f1f6c7b44\") " pod="openstack/openstackclient" Dec 02 09:45:20 crc kubenswrapper[4781]: I1202 09:45:20.309382 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7a424309-12e4-42f9-ba35-d61f1f6c7b44-openstack-config-secret\") pod \"openstackclient\" (UID: \"7a424309-12e4-42f9-ba35-d61f1f6c7b44\") " pod="openstack/openstackclient" Dec 02 09:45:20 crc kubenswrapper[4781]: I1202 09:45:20.312539 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4f2c\" (UniqueName: \"kubernetes.io/projected/7a424309-12e4-42f9-ba35-d61f1f6c7b44-kube-api-access-t4f2c\") pod \"openstackclient\" (UID: \"7a424309-12e4-42f9-ba35-d61f1f6c7b44\") " pod="openstack/openstackclient" Dec 02 09:45:20 crc kubenswrapper[4781]: I1202 09:45:20.389048 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 02 09:45:20 crc kubenswrapper[4781]: I1202 09:45:20.820243 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1708d43-f8b6-491b-a365-a8542137dd44","Type":"ContainerStarted","Data":"64e45df7ff96e97983a336361c74ddb4fb41878cecf8133caf8370036da8e589"} Dec 02 09:45:20 crc kubenswrapper[4781]: I1202 09:45:20.840417 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hqlbj" podStartSLOduration=14.240832225 podStartE2EDuration="20.840396703s" podCreationTimestamp="2025-12-02 09:45:00 +0000 UTC" firstStartedPulling="2025-12-02 09:45:12.678028557 +0000 UTC m=+1475.501902436" lastFinishedPulling="2025-12-02 09:45:19.277593035 +0000 UTC m=+1482.101466914" observedRunningTime="2025-12-02 09:45:20.838005341 +0000 UTC m=+1483.661879240" watchObservedRunningTime="2025-12-02 09:45:20.840396703 +0000 UTC m=+1483.664270582" Dec 02 09:45:20 crc kubenswrapper[4781]: I1202 09:45:20.884414 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qmr7s" Dec 02 09:45:20 crc kubenswrapper[4781]: W1202 09:45:20.935701 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a424309_12e4_42f9_ba35_d61f1f6c7b44.slice/crio-cce51d4e859d3f2005db05b32b0e899173319076261d831117393fc7818ea4a0 WatchSource:0}: Error finding container cce51d4e859d3f2005db05b32b0e899173319076261d831117393fc7818ea4a0: Status 404 returned error can't find the container with id cce51d4e859d3f2005db05b32b0e899173319076261d831117393fc7818ea4a0 Dec 02 09:45:20 crc kubenswrapper[4781]: I1202 09:45:20.943496 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 02 09:45:20 crc kubenswrapper[4781]: I1202 09:45:20.949157 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qmr7s" Dec 02 09:45:21 crc kubenswrapper[4781]: I1202 09:45:21.066178 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-67cff9c6-hg2l6" podUID="7dcfdf79-19b4-4ea4-9b16-67d79aa7165e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.142:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:43642->10.217.0.142:8443: read: connection reset by peer" Dec 02 09:45:21 crc kubenswrapper[4781]: I1202 09:45:21.829318 4781 generic.go:334] "Generic (PLEG): container finished" podID="7dcfdf79-19b4-4ea4-9b16-67d79aa7165e" containerID="e4f8d80de8d7fdd7aed53bd0df8818adc65f193e5a52808c1a72181a16906549" exitCode=0 Dec 02 09:45:23 crc kubenswrapper[4781]: I1202 09:45:21.829358 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67cff9c6-hg2l6" event={"ID":"7dcfdf79-19b4-4ea4-9b16-67d79aa7165e","Type":"ContainerDied","Data":"e4f8d80de8d7fdd7aed53bd0df8818adc65f193e5a52808c1a72181a16906549"} Dec 02 09:45:23 crc kubenswrapper[4781]: I1202 09:45:21.830854 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"7a424309-12e4-42f9-ba35-d61f1f6c7b44","Type":"ContainerStarted","Data":"cce51d4e859d3f2005db05b32b0e899173319076261d831117393fc7818ea4a0"} Dec 02 09:45:23 crc kubenswrapper[4781]: I1202 09:45:22.053040 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 02 09:45:23 crc 
kubenswrapper[4781]: I1202 09:45:22.053096 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 02 09:45:23 crc kubenswrapper[4781]: I1202 09:45:22.056443 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qmr7s"] Dec 02 09:45:23 crc kubenswrapper[4781]: I1202 09:45:22.088063 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 02 09:45:23 crc kubenswrapper[4781]: I1202 09:45:22.099655 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 02 09:45:23 crc kubenswrapper[4781]: I1202 09:45:22.350501 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 02 09:45:23 crc kubenswrapper[4781]: I1202 09:45:22.350570 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 02 09:45:23 crc kubenswrapper[4781]: I1202 09:45:22.383395 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 02 09:45:23 crc kubenswrapper[4781]: I1202 09:45:22.393553 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 02 09:45:23 crc kubenswrapper[4781]: I1202 09:45:22.837571 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qmr7s" podUID="c1554dce-519d-4004-a1a3-a1c9072dc609" containerName="registry-server" containerID="cri-o://7979fff52111791c9357dc197a63cfd74ff2a38ffb1bd960456887d7f53a158a" gracePeriod=2 Dec 02 09:45:23 crc kubenswrapper[4781]: I1202 09:45:22.838272 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 02 09:45:23 crc kubenswrapper[4781]: I1202 09:45:22.838403 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 02 09:45:23 crc kubenswrapper[4781]: I1202 09:45:22.838433 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 02 09:45:23 crc kubenswrapper[4781]: I1202 09:45:22.838445 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 02 09:45:23 crc kubenswrapper[4781]: I1202 09:45:23.848600 4781 generic.go:334] "Generic (PLEG): container finished" podID="c1554dce-519d-4004-a1a3-a1c9072dc609" containerID="7979fff52111791c9357dc197a63cfd74ff2a38ffb1bd960456887d7f53a158a" exitCode=0 Dec 02 09:45:23 crc kubenswrapper[4781]: I1202 09:45:23.848669 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qmr7s" event={"ID":"c1554dce-519d-4004-a1a3-a1c9072dc609","Type":"ContainerDied","Data":"7979fff52111791c9357dc197a63cfd74ff2a38ffb1bd960456887d7f53a158a"} Dec 02 09:45:23 crc kubenswrapper[4781]: I1202 09:45:23.990253 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-67cff9c6-hg2l6" podUID="7dcfdf79-19b4-4ea4-9b16-67d79aa7165e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.142:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.142:8443: connect: connection refused" Dec 02 09:45:25 crc kubenswrapper[4781]: I1202 09:45:25.457452 
4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 02 09:45:25 crc kubenswrapper[4781]: I1202 09:45:25.457548 4781 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 09:45:25 crc kubenswrapper[4781]: I1202 09:45:25.459390 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 02 09:45:25 crc kubenswrapper[4781]: I1202 09:45:25.502200 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qmr7s" Dec 02 09:45:25 crc kubenswrapper[4781]: I1202 09:45:25.576145 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 02 09:45:25 crc kubenswrapper[4781]: I1202 09:45:25.576198 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 02 09:45:25 crc kubenswrapper[4781]: I1202 09:45:25.688290 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1554dce-519d-4004-a1a3-a1c9072dc609-catalog-content\") pod \"c1554dce-519d-4004-a1a3-a1c9072dc609\" (UID: \"c1554dce-519d-4004-a1a3-a1c9072dc609\") " Dec 02 09:45:25 crc kubenswrapper[4781]: I1202 09:45:25.688334 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1554dce-519d-4004-a1a3-a1c9072dc609-utilities\") pod \"c1554dce-519d-4004-a1a3-a1c9072dc609\" (UID: \"c1554dce-519d-4004-a1a3-a1c9072dc609\") " Dec 02 09:45:25 crc kubenswrapper[4781]: I1202 09:45:25.688374 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vwxm\" (UniqueName: \"kubernetes.io/projected/c1554dce-519d-4004-a1a3-a1c9072dc609-kube-api-access-7vwxm\") pod \"c1554dce-519d-4004-a1a3-a1c9072dc609\" (UID: \"c1554dce-519d-4004-a1a3-a1c9072dc609\") " Dec 02 09:45:25 crc kubenswrapper[4781]: I1202 09:45:25.691557 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1554dce-519d-4004-a1a3-a1c9072dc609-utilities" (OuterVolumeSpecName: "utilities") pod "c1554dce-519d-4004-a1a3-a1c9072dc609" (UID: "c1554dce-519d-4004-a1a3-a1c9072dc609"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:45:25 crc kubenswrapper[4781]: I1202 09:45:25.714237 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1554dce-519d-4004-a1a3-a1c9072dc609-kube-api-access-7vwxm" (OuterVolumeSpecName: "kube-api-access-7vwxm") pod "c1554dce-519d-4004-a1a3-a1c9072dc609" (UID: "c1554dce-519d-4004-a1a3-a1c9072dc609"). InnerVolumeSpecName "kube-api-access-7vwxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:45:25 crc kubenswrapper[4781]: I1202 09:45:25.771131 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1554dce-519d-4004-a1a3-a1c9072dc609-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c1554dce-519d-4004-a1a3-a1c9072dc609" (UID: "c1554dce-519d-4004-a1a3-a1c9072dc609"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:45:25 crc kubenswrapper[4781]: I1202 09:45:25.790368 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1554dce-519d-4004-a1a3-a1c9072dc609-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:25 crc kubenswrapper[4781]: I1202 09:45:25.790406 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1554dce-519d-4004-a1a3-a1c9072dc609-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:25 crc kubenswrapper[4781]: I1202 09:45:25.790415 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vwxm\" (UniqueName: \"kubernetes.io/projected/c1554dce-519d-4004-a1a3-a1c9072dc609-kube-api-access-7vwxm\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:25 crc kubenswrapper[4781]: I1202 09:45:25.867416 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qmr7s" event={"ID":"c1554dce-519d-4004-a1a3-a1c9072dc609","Type":"ContainerDied","Data":"e5b8a995cd9245d1b0c8a13b50ccc04f956126416f4da3c8c7ea35ddb424593f"} Dec 02 09:45:25 crc kubenswrapper[4781]: I1202 09:45:25.867490 4781 scope.go:117] "RemoveContainer" containerID="7979fff52111791c9357dc197a63cfd74ff2a38ffb1bd960456887d7f53a158a" Dec 02 09:45:25 crc kubenswrapper[4781]: I1202 09:45:25.867673 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qmr7s" Dec 02 09:45:25 crc kubenswrapper[4781]: I1202 09:45:25.896220 4781 scope.go:117] "RemoveContainer" containerID="bd12bd1057cb09ad2c1b1410b87d8c76e6bbad11640a2264957a1c7ab7c45be0" Dec 02 09:45:25 crc kubenswrapper[4781]: I1202 09:45:25.908992 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qmr7s"] Dec 02 09:45:25 crc kubenswrapper[4781]: I1202 09:45:25.924975 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qmr7s"] Dec 02 09:45:25 crc kubenswrapper[4781]: I1202 09:45:25.939778 4781 scope.go:117] "RemoveContainer" containerID="e55dc565488cf739e9cfcd3e7447e6185be476f3bb9dea571d0363f6e56bfaa6" Dec 02 09:45:26 crc kubenswrapper[4781]: I1202 09:45:26.876812 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1708d43-f8b6-491b-a365-a8542137dd44","Type":"ContainerStarted","Data":"5a8ee8bbbb0a8e89c8e39ea90569ff4a1fe26323f322f49fea79635b6c5afb47"} Dec 02 09:45:27 crc kubenswrapper[4781]: I1202 09:45:27.520484 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1554dce-519d-4004-a1a3-a1c9072dc609" path="/var/lib/kubelet/pods/c1554dce-519d-4004-a1a3-a1c9072dc609/volumes" Dec 02 09:45:27 crc kubenswrapper[4781]: I1202 09:45:27.889800 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1708d43-f8b6-491b-a365-a8542137dd44","Type":"ContainerStarted","Data":"e83e542d40461a80472e7eaed39b3591f79be6d61ce515a902e3ca725927583f"} Dec 02 09:45:28 crc kubenswrapper[4781]: I1202 09:45:28.900497 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1708d43-f8b6-491b-a365-a8542137dd44","Type":"ContainerStarted","Data":"6983d16fa5c64f540318073c4f04f369f98ac3334fdf57fae6821d28b5547935"} Dec 02 09:45:30 crc kubenswrapper[4781]: I1202 09:45:30.644611 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-hqlbj" Dec 02 09:45:30 crc kubenswrapper[4781]: I1202 09:45:30.645755 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hqlbj" Dec 02 09:45:30 crc kubenswrapper[4781]: I1202 09:45:30.699750 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hqlbj" Dec 02 09:45:30 crc kubenswrapper[4781]: I1202 09:45:30.971720 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hqlbj" Dec 02 09:45:31 crc kubenswrapper[4781]: I1202 09:45:31.515681 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hqlbj"] Dec 02 09:45:32 crc kubenswrapper[4781]: I1202 09:45:32.937242 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hqlbj" podUID="27107564-9a0b-4a63-894d-4e0960abd435" containerName="registry-server" containerID="cri-o://218a370cd990992fc6c3374bc43b115828b635373940f9fb482c34d673ba42f9" gracePeriod=2 Dec 02 09:45:33 crc kubenswrapper[4781]: I1202 09:45:33.991220 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-67cff9c6-hg2l6" podUID="7dcfdf79-19b4-4ea4-9b16-67d79aa7165e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.142:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.142:8443: connect: connection refused" Dec 02 09:45:34 crc kubenswrapper[4781]: I1202 09:45:34.641140 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 09:45:35 crc kubenswrapper[4781]: I1202 09:45:35.044511 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-7c5b6578c5-9cvpk"] Dec 02 09:45:35 crc kubenswrapper[4781]: E1202 09:45:35.044897 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1554dce-519d-4004-a1a3-a1c9072dc609" containerName="registry-server" Dec 02 09:45:35 crc kubenswrapper[4781]: I1202 09:45:35.044911 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1554dce-519d-4004-a1a3-a1c9072dc609" containerName="registry-server" Dec 02 09:45:35 crc kubenswrapper[4781]: E1202 09:45:35.044945 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1554dce-519d-4004-a1a3-a1c9072dc609" containerName="extract-utilities" Dec 02 09:45:35 crc kubenswrapper[4781]: I1202 09:45:35.044953 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1554dce-519d-4004-a1a3-a1c9072dc609" containerName="extract-utilities" Dec 02 09:45:35 crc kubenswrapper[4781]: E1202 09:45:35.044981 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1554dce-519d-4004-a1a3-a1c9072dc609" containerName="extract-content" Dec 02 09:45:35 crc kubenswrapper[4781]: I1202 09:45:35.044988 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1554dce-519d-4004-a1a3-a1c9072dc609" containerName="extract-content" Dec 02 09:45:35 crc kubenswrapper[4781]: I1202 09:45:35.045180 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1554dce-519d-4004-a1a3-a1c9072dc609" containerName="registry-server" Dec 02 09:45:35 crc kubenswrapper[4781]: I1202 09:45:35.046052 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-7c5b6578c5-9cvpk" Dec 02 09:45:35 crc kubenswrapper[4781]: I1202 09:45:35.059453 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7c5b6578c5-9cvpk"] Dec 02 09:45:35 crc kubenswrapper[4781]: I1202 09:45:35.059707 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 02 09:45:35 crc kubenswrapper[4781]: I1202 09:45:35.059870 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 02 09:45:35 crc kubenswrapper[4781]: I1202 09:45:35.060398 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 02 09:45:35 crc kubenswrapper[4781]: I1202 09:45:35.191541 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8666ba67-095e-4634-8975-e54bd7a0f0cb-internal-tls-certs\") pod \"swift-proxy-7c5b6578c5-9cvpk\" (UID: \"8666ba67-095e-4634-8975-e54bd7a0f0cb\") " pod="openstack/swift-proxy-7c5b6578c5-9cvpk" Dec 02 09:45:35 crc kubenswrapper[4781]: I1202 09:45:35.191599 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8666ba67-095e-4634-8975-e54bd7a0f0cb-run-httpd\") pod \"swift-proxy-7c5b6578c5-9cvpk\" (UID: \"8666ba67-095e-4634-8975-e54bd7a0f0cb\") " pod="openstack/swift-proxy-7c5b6578c5-9cvpk" Dec 02 09:45:35 crc kubenswrapper[4781]: I1202 09:45:35.191626 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8666ba67-095e-4634-8975-e54bd7a0f0cb-config-data\") pod \"swift-proxy-7c5b6578c5-9cvpk\" (UID: \"8666ba67-095e-4634-8975-e54bd7a0f0cb\") " pod="openstack/swift-proxy-7c5b6578c5-9cvpk" Dec 02 09:45:35 crc kubenswrapper[4781]: I1202 09:45:35.191643 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8666ba67-095e-4634-8975-e54bd7a0f0cb-combined-ca-bundle\") pod \"swift-proxy-7c5b6578c5-9cvpk\" (UID: \"8666ba67-095e-4634-8975-e54bd7a0f0cb\") " pod="openstack/swift-proxy-7c5b6578c5-9cvpk" Dec 02 09:45:35 crc kubenswrapper[4781]: I1202 09:45:35.191682 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xhs8\" (UniqueName: \"kubernetes.io/projected/8666ba67-095e-4634-8975-e54bd7a0f0cb-kube-api-access-8xhs8\") pod \"swift-proxy-7c5b6578c5-9cvpk\" (UID: \"8666ba67-095e-4634-8975-e54bd7a0f0cb\") " pod="openstack/swift-proxy-7c5b6578c5-9cvpk" Dec 02 09:45:35 crc kubenswrapper[4781]: I1202 09:45:35.191698 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8666ba67-095e-4634-8975-e54bd7a0f0cb-etc-swift\") pod \"swift-proxy-7c5b6578c5-9cvpk\" (UID: \"8666ba67-095e-4634-8975-e54bd7a0f0cb\") " pod="openstack/swift-proxy-7c5b6578c5-9cvpk" Dec 02 09:45:35 crc kubenswrapper[4781]: I1202 09:45:35.191727 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8666ba67-095e-4634-8975-e54bd7a0f0cb-log-httpd\") pod \"swift-proxy-7c5b6578c5-9cvpk\" (UID: \"8666ba67-095e-4634-8975-e54bd7a0f0cb\") " 
pod="openstack/swift-proxy-7c5b6578c5-9cvpk" Dec 02 09:45:35 crc kubenswrapper[4781]: I1202 09:45:35.191749 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8666ba67-095e-4634-8975-e54bd7a0f0cb-public-tls-certs\") pod \"swift-proxy-7c5b6578c5-9cvpk\" (UID: \"8666ba67-095e-4634-8975-e54bd7a0f0cb\") " pod="openstack/swift-proxy-7c5b6578c5-9cvpk" Dec 02 09:45:35 crc kubenswrapper[4781]: I1202 09:45:35.293575 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8666ba67-095e-4634-8975-e54bd7a0f0cb-internal-tls-certs\") pod \"swift-proxy-7c5b6578c5-9cvpk\" (UID: \"8666ba67-095e-4634-8975-e54bd7a0f0cb\") " pod="openstack/swift-proxy-7c5b6578c5-9cvpk" Dec 02 09:45:35 crc kubenswrapper[4781]: I1202 09:45:35.293629 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8666ba67-095e-4634-8975-e54bd7a0f0cb-run-httpd\") pod \"swift-proxy-7c5b6578c5-9cvpk\" (UID: \"8666ba67-095e-4634-8975-e54bd7a0f0cb\") " pod="openstack/swift-proxy-7c5b6578c5-9cvpk" Dec 02 09:45:35 crc kubenswrapper[4781]: I1202 09:45:35.293669 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8666ba67-095e-4634-8975-e54bd7a0f0cb-config-data\") pod \"swift-proxy-7c5b6578c5-9cvpk\" (UID: \"8666ba67-095e-4634-8975-e54bd7a0f0cb\") " pod="openstack/swift-proxy-7c5b6578c5-9cvpk" Dec 02 09:45:35 crc kubenswrapper[4781]: I1202 09:45:35.293689 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8666ba67-095e-4634-8975-e54bd7a0f0cb-combined-ca-bundle\") pod \"swift-proxy-7c5b6578c5-9cvpk\" (UID: \"8666ba67-095e-4634-8975-e54bd7a0f0cb\") " pod="openstack/swift-proxy-7c5b6578c5-9cvpk" Dec 02 09:45:35 crc kubenswrapper[4781]: I1202 09:45:35.293737 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xhs8\" (UniqueName: \"kubernetes.io/projected/8666ba67-095e-4634-8975-e54bd7a0f0cb-kube-api-access-8xhs8\") pod \"swift-proxy-7c5b6578c5-9cvpk\" (UID: \"8666ba67-095e-4634-8975-e54bd7a0f0cb\") " pod="openstack/swift-proxy-7c5b6578c5-9cvpk" Dec 02 09:45:35 crc kubenswrapper[4781]: I1202 09:45:35.293761 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8666ba67-095e-4634-8975-e54bd7a0f0cb-etc-swift\") pod \"swift-proxy-7c5b6578c5-9cvpk\" (UID: \"8666ba67-095e-4634-8975-e54bd7a0f0cb\") " pod="openstack/swift-proxy-7c5b6578c5-9cvpk" Dec 02 09:45:35 crc kubenswrapper[4781]: I1202 09:45:35.293805 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8666ba67-095e-4634-8975-e54bd7a0f0cb-log-httpd\") pod \"swift-proxy-7c5b6578c5-9cvpk\" (UID: \"8666ba67-095e-4634-8975-e54bd7a0f0cb\") " pod="openstack/swift-proxy-7c5b6578c5-9cvpk" Dec 02 09:45:35 crc kubenswrapper[4781]: I1202 09:45:35.293839 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8666ba67-095e-4634-8975-e54bd7a0f0cb-public-tls-certs\") pod \"swift-proxy-7c5b6578c5-9cvpk\" (UID: \"8666ba67-095e-4634-8975-e54bd7a0f0cb\") " pod="openstack/swift-proxy-7c5b6578c5-9cvpk" 
Dec 02 09:45:35 crc kubenswrapper[4781]: I1202 09:45:35.295823 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8666ba67-095e-4634-8975-e54bd7a0f0cb-log-httpd\") pod \"swift-proxy-7c5b6578c5-9cvpk\" (UID: \"8666ba67-095e-4634-8975-e54bd7a0f0cb\") " pod="openstack/swift-proxy-7c5b6578c5-9cvpk" Dec 02 09:45:35 crc kubenswrapper[4781]: I1202 09:45:35.296106 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8666ba67-095e-4634-8975-e54bd7a0f0cb-run-httpd\") pod \"swift-proxy-7c5b6578c5-9cvpk\" (UID: \"8666ba67-095e-4634-8975-e54bd7a0f0cb\") " pod="openstack/swift-proxy-7c5b6578c5-9cvpk" Dec 02 09:45:35 crc kubenswrapper[4781]: I1202 09:45:35.301130 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8666ba67-095e-4634-8975-e54bd7a0f0cb-public-tls-certs\") pod \"swift-proxy-7c5b6578c5-9cvpk\" (UID: \"8666ba67-095e-4634-8975-e54bd7a0f0cb\") " pod="openstack/swift-proxy-7c5b6578c5-9cvpk" Dec 02 09:45:35 crc kubenswrapper[4781]: I1202 09:45:35.301979 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8666ba67-095e-4634-8975-e54bd7a0f0cb-etc-swift\") pod \"swift-proxy-7c5b6578c5-9cvpk\" (UID: \"8666ba67-095e-4634-8975-e54bd7a0f0cb\") " pod="openstack/swift-proxy-7c5b6578c5-9cvpk" Dec 02 09:45:35 crc kubenswrapper[4781]: I1202 09:45:35.304603 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8666ba67-095e-4634-8975-e54bd7a0f0cb-config-data\") pod \"swift-proxy-7c5b6578c5-9cvpk\" (UID: \"8666ba67-095e-4634-8975-e54bd7a0f0cb\") " pod="openstack/swift-proxy-7c5b6578c5-9cvpk" Dec 02 09:45:35 crc kubenswrapper[4781]: I1202 09:45:35.311623 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8666ba67-095e-4634-8975-e54bd7a0f0cb-combined-ca-bundle\") pod \"swift-proxy-7c5b6578c5-9cvpk\" (UID: \"8666ba67-095e-4634-8975-e54bd7a0f0cb\") " pod="openstack/swift-proxy-7c5b6578c5-9cvpk" Dec 02 09:45:35 crc kubenswrapper[4781]: I1202 09:45:35.316553 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xhs8\" (UniqueName: \"kubernetes.io/projected/8666ba67-095e-4634-8975-e54bd7a0f0cb-kube-api-access-8xhs8\") pod \"swift-proxy-7c5b6578c5-9cvpk\" (UID: \"8666ba67-095e-4634-8975-e54bd7a0f0cb\") " pod="openstack/swift-proxy-7c5b6578c5-9cvpk" Dec 02 09:45:35 crc kubenswrapper[4781]: I1202 09:45:35.317530 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8666ba67-095e-4634-8975-e54bd7a0f0cb-internal-tls-certs\") pod \"swift-proxy-7c5b6578c5-9cvpk\" (UID: \"8666ba67-095e-4634-8975-e54bd7a0f0cb\") " pod="openstack/swift-proxy-7c5b6578c5-9cvpk" Dec 02 09:45:35 crc kubenswrapper[4781]: I1202 09:45:35.396219 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-7c5b6578c5-9cvpk" Dec 02 09:45:37 crc kubenswrapper[4781]: I1202 09:45:37.975648 4781 generic.go:334] "Generic (PLEG): container finished" podID="27107564-9a0b-4a63-894d-4e0960abd435" containerID="218a370cd990992fc6c3374bc43b115828b635373940f9fb482c34d673ba42f9" exitCode=0 Dec 02 09:45:37 crc kubenswrapper[4781]: I1202 09:45:37.975716 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hqlbj" event={"ID":"27107564-9a0b-4a63-894d-4e0960abd435","Type":"ContainerDied","Data":"218a370cd990992fc6c3374bc43b115828b635373940f9fb482c34d673ba42f9"} Dec 02 09:45:40 crc kubenswrapper[4781]: E1202 09:45:40.646133 4781 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 218a370cd990992fc6c3374bc43b115828b635373940f9fb482c34d673ba42f9 is running failed: container process not found" containerID="218a370cd990992fc6c3374bc43b115828b635373940f9fb482c34d673ba42f9" cmd=["grpc_health_probe","-addr=:50051"] Dec 02 09:45:40 crc kubenswrapper[4781]: E1202 09:45:40.646964 4781 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 218a370cd990992fc6c3374bc43b115828b635373940f9fb482c34d673ba42f9 is running failed: container process not found" containerID="218a370cd990992fc6c3374bc43b115828b635373940f9fb482c34d673ba42f9" cmd=["grpc_health_probe","-addr=:50051"] Dec 02 09:45:40 crc kubenswrapper[4781]: E1202 09:45:40.647360 4781 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 218a370cd990992fc6c3374bc43b115828b635373940f9fb482c34d673ba42f9 is running failed: container process not found" containerID="218a370cd990992fc6c3374bc43b115828b635373940f9fb482c34d673ba42f9" cmd=["grpc_health_probe","-addr=:50051"] Dec 02 09:45:40 crc kubenswrapper[4781]: E1202 09:45:40.647404 4781 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 218a370cd990992fc6c3374bc43b115828b635373940f9fb482c34d673ba42f9 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-hqlbj" podUID="27107564-9a0b-4a63-894d-4e0960abd435" containerName="registry-server" Dec 02 09:45:41 crc kubenswrapper[4781]: E1202 09:45:41.583346 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified" Dec 02 09:45:41 crc kubenswrapper[4781]: E1202 09:45:41.584208 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:openstackclient,Image:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,Command:[/bin/sleep],Args:[infinity],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5b8h65bh644h668h5f7h8fh58ch5f6h5dchd4h5c9h577h659h569hbdh9dh58dh694h569h688h5b6h5ddh5f9h569h584h5c4h77h66hdch688h549h64fq,ValueFrom:nil,},EnvVar{Name:OS_CLOUD,Value:default,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_HOST,Value:metric-storage-prometheus.openstack.svc,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_PORT,Value:9090,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:openstack-config,ReadOnly:false,MountPath:/home/cloud-admin/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/home/cloud-admin/.config/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/home/cloud-admin/cloudrc,SubPath:cloudrc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t4f2c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42401,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42401,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstackclient_openstack(7a424309-12e4-42f9-ba35-d61f1f6c7b44): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 09:45:41 crc kubenswrapper[4781]: E1202 09:45:41.585451 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstackclient\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstackclient" podUID="7a424309-12e4-42f9-ba35-d61f1f6c7b44" Dec 02 09:45:41 crc kubenswrapper[4781]: I1202 09:45:41.931045 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hqlbj" Dec 02 09:45:42 crc kubenswrapper[4781]: I1202 09:45:42.007774 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1708d43-f8b6-491b-a365-a8542137dd44","Type":"ContainerStarted","Data":"431de54c5735d6c4333d70232eb87c7cbfc2630347bd6580c881a49ad249c0ba"} Dec 02 09:45:42 crc kubenswrapper[4781]: I1202 09:45:42.007864 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e1708d43-f8b6-491b-a365-a8542137dd44" containerName="ceilometer-central-agent" containerID="cri-o://5a8ee8bbbb0a8e89c8e39ea90569ff4a1fe26323f322f49fea79635b6c5afb47" gracePeriod=30 Dec 02 09:45:42 crc kubenswrapper[4781]: I1202 09:45:42.007914 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 09:45:42 crc kubenswrapper[4781]: I1202 09:45:42.007941 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e1708d43-f8b6-491b-a365-a8542137dd44" containerName="sg-core" containerID="cri-o://6983d16fa5c64f540318073c4f04f369f98ac3334fdf57fae6821d28b5547935" gracePeriod=30 Dec 02 09:45:42 crc kubenswrapper[4781]: I1202 09:45:42.007956 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e1708d43-f8b6-491b-a365-a8542137dd44" containerName="proxy-httpd" containerID="cri-o://431de54c5735d6c4333d70232eb87c7cbfc2630347bd6580c881a49ad249c0ba" gracePeriod=30 Dec 02 09:45:42 crc kubenswrapper[4781]: I1202 09:45:42.007950 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e1708d43-f8b6-491b-a365-a8542137dd44" containerName="ceilometer-notification-agent" containerID="cri-o://e83e542d40461a80472e7eaed39b3591f79be6d61ce515a902e3ca725927583f" gracePeriod=30 Dec 02 09:45:42 crc kubenswrapper[4781]: I1202 09:45:42.015090 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hqlbj" Dec 02 09:45:42 crc kubenswrapper[4781]: I1202 09:45:42.015098 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hqlbj" event={"ID":"27107564-9a0b-4a63-894d-4e0960abd435","Type":"ContainerDied","Data":"d0045191e9cda04f8a5348368d03b33c9219e972a56d6c7bb22939830eaee14f"} Dec 02 09:45:42 crc kubenswrapper[4781]: I1202 09:45:42.015169 4781 scope.go:117] "RemoveContainer" containerID="218a370cd990992fc6c3374bc43b115828b635373940f9fb482c34d673ba42f9" Dec 02 09:45:42 crc kubenswrapper[4781]: E1202 09:45:42.016740 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstackclient\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified\\\"\"" pod="openstack/openstackclient" podUID="7a424309-12e4-42f9-ba35-d61f1f6c7b44" Dec 02 09:45:42 crc kubenswrapper[4781]: I1202 09:45:42.033828 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.237897758 podStartE2EDuration="24.033805449s" podCreationTimestamp="2025-12-02 09:45:18 +0000 UTC" firstStartedPulling="2025-12-02 09:45:19.804045648 +0000 UTC m=+1482.627919527" lastFinishedPulling="2025-12-02 09:45:41.599953339 +0000 UTC m=+1504.423827218" observedRunningTime="2025-12-02 09:45:42.033420969 +0000 UTC m=+1504.857294848" watchObservedRunningTime="2025-12-02 09:45:42.033805449 +0000 UTC m=+1504.857679338" Dec 02 09:45:42 crc kubenswrapper[4781]: I1202 09:45:42.041486 4781 scope.go:117] "RemoveContainer" containerID="cc118d6e0785fb3227a6994d0a9f17b361442a9718d55ec98de1715b16fb25bc" Dec 02 09:45:42 crc kubenswrapper[4781]: I1202 09:45:42.071572 4781 scope.go:117] "RemoveContainer" containerID="944d592c4f3b5b0e74cd4f7ba1b76e8fb09d21d3fca2599dfb9aa627c26da805" Dec 02 09:45:42 crc kubenswrapper[4781]: I1202 09:45:42.118204 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27107564-9a0b-4a63-894d-4e0960abd435-utilities\") pod \"27107564-9a0b-4a63-894d-4e0960abd435\" (UID: \"27107564-9a0b-4a63-894d-4e0960abd435\") " Dec 02 09:45:42 crc kubenswrapper[4781]: I1202 09:45:42.118371 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nddw\" (UniqueName: \"kubernetes.io/projected/27107564-9a0b-4a63-894d-4e0960abd435-kube-api-access-2nddw\") pod \"27107564-9a0b-4a63-894d-4e0960abd435\" (UID: \"27107564-9a0b-4a63-894d-4e0960abd435\") " Dec 02 09:45:42 crc kubenswrapper[4781]: I1202 09:45:42.118496 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27107564-9a0b-4a63-894d-4e0960abd435-catalog-content\") pod \"27107564-9a0b-4a63-894d-4e0960abd435\" (UID: \"27107564-9a0b-4a63-894d-4e0960abd435\") " Dec 02 09:45:42 crc kubenswrapper[4781]: I1202 09:45:42.119419 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27107564-9a0b-4a63-894d-4e0960abd435-utilities" (OuterVolumeSpecName: "utilities") pod "27107564-9a0b-4a63-894d-4e0960abd435" (UID: "27107564-9a0b-4a63-894d-4e0960abd435"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:45:42 crc kubenswrapper[4781]: I1202 09:45:42.124158 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27107564-9a0b-4a63-894d-4e0960abd435-kube-api-access-2nddw" (OuterVolumeSpecName: "kube-api-access-2nddw") pod "27107564-9a0b-4a63-894d-4e0960abd435" (UID: "27107564-9a0b-4a63-894d-4e0960abd435"). InnerVolumeSpecName "kube-api-access-2nddw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:45:42 crc kubenswrapper[4781]: I1202 09:45:42.136077 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27107564-9a0b-4a63-894d-4e0960abd435-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "27107564-9a0b-4a63-894d-4e0960abd435" (UID: "27107564-9a0b-4a63-894d-4e0960abd435"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:45:42 crc kubenswrapper[4781]: I1202 09:45:42.171875 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7c5b6578c5-9cvpk"] Dec 02 09:45:42 crc kubenswrapper[4781]: W1202 09:45:42.174740 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8666ba67_095e_4634_8975_e54bd7a0f0cb.slice/crio-5b18456be5e8a1bd41fa25b3cb2a37ca13eb8fa5d808e67b73b23c56fa9e2470 WatchSource:0}: Error finding container 5b18456be5e8a1bd41fa25b3cb2a37ca13eb8fa5d808e67b73b23c56fa9e2470: Status 404 returned error can't find the container with id 5b18456be5e8a1bd41fa25b3cb2a37ca13eb8fa5d808e67b73b23c56fa9e2470 Dec 02 09:45:42 crc kubenswrapper[4781]: I1202 09:45:42.223964 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27107564-9a0b-4a63-894d-4e0960abd435-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:42 crc kubenswrapper[4781]: I1202 09:45:42.223994 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27107564-9a0b-4a63-894d-4e0960abd435-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:42 crc kubenswrapper[4781]: I1202 09:45:42.224008 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2nddw\" (UniqueName: \"kubernetes.io/projected/27107564-9a0b-4a63-894d-4e0960abd435-kube-api-access-2nddw\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:42 crc kubenswrapper[4781]: I1202 09:45:42.548854 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hqlbj"] Dec 02 09:45:42 crc kubenswrapper[4781]: I1202 09:45:42.558820 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hqlbj"] Dec 02 09:45:43 crc kubenswrapper[4781]: I1202 09:45:43.028223 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7c5b6578c5-9cvpk" event={"ID":"8666ba67-095e-4634-8975-e54bd7a0f0cb","Type":"ContainerStarted","Data":"47ad629e28a462482c25f20c4df5ccf7c1cc566e2ab2f0bf953e659998612f04"} Dec 02 09:45:43 crc kubenswrapper[4781]: I1202 09:45:43.028289 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7c5b6578c5-9cvpk" event={"ID":"8666ba67-095e-4634-8975-e54bd7a0f0cb","Type":"ContainerStarted","Data":"bc7046cb302d27dcd649c03ed0f42b2a754c62808fc1418ff94265d6813be329"} Dec 02 09:45:43 crc kubenswrapper[4781]: I1202 09:45:43.028339 4781 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/swift-proxy-7c5b6578c5-9cvpk" event={"ID":"8666ba67-095e-4634-8975-e54bd7a0f0cb","Type":"ContainerStarted","Data":"5b18456be5e8a1bd41fa25b3cb2a37ca13eb8fa5d808e67b73b23c56fa9e2470"} Dec 02 09:45:43 crc kubenswrapper[4781]: I1202 09:45:43.029545 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7c5b6578c5-9cvpk" Dec 02 09:45:43 crc kubenswrapper[4781]: I1202 09:45:43.029580 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7c5b6578c5-9cvpk" Dec 02 09:45:43 crc kubenswrapper[4781]: I1202 09:45:43.031495 4781 generic.go:334] "Generic (PLEG): container finished" podID="e1708d43-f8b6-491b-a365-a8542137dd44" containerID="6983d16fa5c64f540318073c4f04f369f98ac3334fdf57fae6821d28b5547935" exitCode=2 Dec 02 09:45:43 crc kubenswrapper[4781]: I1202 09:45:43.031516 4781 generic.go:334] "Generic (PLEG): container finished" podID="e1708d43-f8b6-491b-a365-a8542137dd44" containerID="e83e542d40461a80472e7eaed39b3591f79be6d61ce515a902e3ca725927583f" exitCode=0 Dec 02 09:45:43 crc kubenswrapper[4781]: I1202 09:45:43.031522 4781 generic.go:334] "Generic (PLEG): container finished" podID="e1708d43-f8b6-491b-a365-a8542137dd44" containerID="5a8ee8bbbb0a8e89c8e39ea90569ff4a1fe26323f322f49fea79635b6c5afb47" exitCode=0 Dec 02 09:45:43 crc kubenswrapper[4781]: I1202 09:45:43.031549 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1708d43-f8b6-491b-a365-a8542137dd44","Type":"ContainerDied","Data":"6983d16fa5c64f540318073c4f04f369f98ac3334fdf57fae6821d28b5547935"} Dec 02 09:45:43 crc kubenswrapper[4781]: I1202 09:45:43.031565 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1708d43-f8b6-491b-a365-a8542137dd44","Type":"ContainerDied","Data":"e83e542d40461a80472e7eaed39b3591f79be6d61ce515a902e3ca725927583f"} Dec 02 09:45:43 crc kubenswrapper[4781]: I1202 09:45:43.031574 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1708d43-f8b6-491b-a365-a8542137dd44","Type":"ContainerDied","Data":"5a8ee8bbbb0a8e89c8e39ea90569ff4a1fe26323f322f49fea79635b6c5afb47"} Dec 02 09:45:43 crc kubenswrapper[4781]: I1202 09:45:43.056612 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-7c5b6578c5-9cvpk" podStartSLOduration=8.056589443 podStartE2EDuration="8.056589443s" podCreationTimestamp="2025-12-02 09:45:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:45:43.049976728 +0000 UTC m=+1505.873850627" watchObservedRunningTime="2025-12-02 09:45:43.056589443 +0000 UTC m=+1505.880463322" Dec 02 09:45:43 crc kubenswrapper[4781]: I1202 09:45:43.509334 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27107564-9a0b-4a63-894d-4e0960abd435" path="/var/lib/kubelet/pods/27107564-9a0b-4a63-894d-4e0960abd435/volumes" Dec 02 09:45:43 crc kubenswrapper[4781]: I1202 09:45:43.992257 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-67cff9c6-hg2l6" podUID="7dcfdf79-19b4-4ea4-9b16-67d79aa7165e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.142:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.142:8443: connect: connection refused" Dec 02 09:45:48 crc kubenswrapper[4781]: I1202 09:45:48.088125 4781 generic.go:334] "Generic (PLEG): container 
finished" podID="51f721e2-e8fd-4ae1-89d9-fe7272e8246e" containerID="a07f2009cf58fc528ef5769303b895ed34faf022ea7551f54cb14cce5a190499" exitCode=0 Dec 02 09:45:48 crc kubenswrapper[4781]: I1202 09:45:48.088347 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-br5j5" event={"ID":"51f721e2-e8fd-4ae1-89d9-fe7272e8246e","Type":"ContainerDied","Data":"a07f2009cf58fc528ef5769303b895ed34faf022ea7551f54cb14cce5a190499"} Dec 02 09:45:48 crc kubenswrapper[4781]: I1202 09:45:48.098745 4781 generic.go:334] "Generic (PLEG): container finished" podID="7dcfdf79-19b4-4ea4-9b16-67d79aa7165e" containerID="48b12156148830c711ba4ea9c2a408455c26ae2b9baec0f6f03c6f3d2a1699e3" exitCode=137 Dec 02 09:45:48 crc kubenswrapper[4781]: I1202 09:45:48.098956 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67cff9c6-hg2l6" event={"ID":"7dcfdf79-19b4-4ea4-9b16-67d79aa7165e","Type":"ContainerDied","Data":"48b12156148830c711ba4ea9c2a408455c26ae2b9baec0f6f03c6f3d2a1699e3"} Dec 02 09:45:48 crc kubenswrapper[4781]: I1202 09:45:48.377113 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-67cff9c6-hg2l6" Dec 02 09:45:48 crc kubenswrapper[4781]: I1202 09:45:48.538506 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7dcfdf79-19b4-4ea4-9b16-67d79aa7165e-logs\") pod \"7dcfdf79-19b4-4ea4-9b16-67d79aa7165e\" (UID: \"7dcfdf79-19b4-4ea4-9b16-67d79aa7165e\") " Dec 02 09:45:48 crc kubenswrapper[4781]: I1202 09:45:48.538839 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7dcfdf79-19b4-4ea4-9b16-67d79aa7165e-horizon-secret-key\") pod \"7dcfdf79-19b4-4ea4-9b16-67d79aa7165e\" (UID: \"7dcfdf79-19b4-4ea4-9b16-67d79aa7165e\") " Dec 02 09:45:48 crc kubenswrapper[4781]: I1202 09:45:48.538859 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7dcfdf79-19b4-4ea4-9b16-67d79aa7165e-config-data\") pod \"7dcfdf79-19b4-4ea4-9b16-67d79aa7165e\" (UID: \"7dcfdf79-19b4-4ea4-9b16-67d79aa7165e\") " Dec 02 09:45:48 crc kubenswrapper[4781]: I1202 09:45:48.538894 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dcfdf79-19b4-4ea4-9b16-67d79aa7165e-combined-ca-bundle\") pod \"7dcfdf79-19b4-4ea4-9b16-67d79aa7165e\" (UID: \"7dcfdf79-19b4-4ea4-9b16-67d79aa7165e\") " Dec 02 09:45:48 crc kubenswrapper[4781]: I1202 09:45:48.539088 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7dcfdf79-19b4-4ea4-9b16-67d79aa7165e-logs" (OuterVolumeSpecName: "logs") pod "7dcfdf79-19b4-4ea4-9b16-67d79aa7165e" (UID: "7dcfdf79-19b4-4ea4-9b16-67d79aa7165e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:45:48 crc kubenswrapper[4781]: I1202 09:45:48.539724 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7dcfdf79-19b4-4ea4-9b16-67d79aa7165e-scripts\") pod \"7dcfdf79-19b4-4ea4-9b16-67d79aa7165e\" (UID: \"7dcfdf79-19b4-4ea4-9b16-67d79aa7165e\") " Dec 02 09:45:48 crc kubenswrapper[4781]: I1202 09:45:48.539763 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dcfdf79-19b4-4ea4-9b16-67d79aa7165e-horizon-tls-certs\") pod \"7dcfdf79-19b4-4ea4-9b16-67d79aa7165e\" (UID: \"7dcfdf79-19b4-4ea4-9b16-67d79aa7165e\") " Dec 02 09:45:48 crc kubenswrapper[4781]: I1202 09:45:48.539839 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p44t6\" (UniqueName: \"kubernetes.io/projected/7dcfdf79-19b4-4ea4-9b16-67d79aa7165e-kube-api-access-p44t6\") pod \"7dcfdf79-19b4-4ea4-9b16-67d79aa7165e\" (UID: \"7dcfdf79-19b4-4ea4-9b16-67d79aa7165e\") " Dec 02 09:45:48 crc kubenswrapper[4781]: I1202 09:45:48.540260 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7dcfdf79-19b4-4ea4-9b16-67d79aa7165e-logs\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:48 crc kubenswrapper[4781]: I1202 09:45:48.545648 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dcfdf79-19b4-4ea4-9b16-67d79aa7165e-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "7dcfdf79-19b4-4ea4-9b16-67d79aa7165e" (UID: "7dcfdf79-19b4-4ea4-9b16-67d79aa7165e"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:45:48 crc kubenswrapper[4781]: I1202 09:45:48.545770 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dcfdf79-19b4-4ea4-9b16-67d79aa7165e-kube-api-access-p44t6" (OuterVolumeSpecName: "kube-api-access-p44t6") pod "7dcfdf79-19b4-4ea4-9b16-67d79aa7165e" (UID: "7dcfdf79-19b4-4ea4-9b16-67d79aa7165e"). InnerVolumeSpecName "kube-api-access-p44t6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:45:48 crc kubenswrapper[4781]: I1202 09:45:48.569264 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7dcfdf79-19b4-4ea4-9b16-67d79aa7165e-config-data" (OuterVolumeSpecName: "config-data") pod "7dcfdf79-19b4-4ea4-9b16-67d79aa7165e" (UID: "7dcfdf79-19b4-4ea4-9b16-67d79aa7165e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:45:48 crc kubenswrapper[4781]: I1202 09:45:48.570668 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7dcfdf79-19b4-4ea4-9b16-67d79aa7165e-scripts" (OuterVolumeSpecName: "scripts") pod "7dcfdf79-19b4-4ea4-9b16-67d79aa7165e" (UID: "7dcfdf79-19b4-4ea4-9b16-67d79aa7165e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:45:48 crc kubenswrapper[4781]: I1202 09:45:48.572839 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dcfdf79-19b4-4ea4-9b16-67d79aa7165e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7dcfdf79-19b4-4ea4-9b16-67d79aa7165e" (UID: "7dcfdf79-19b4-4ea4-9b16-67d79aa7165e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:45:48 crc kubenswrapper[4781]: I1202 09:45:48.596279 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dcfdf79-19b4-4ea4-9b16-67d79aa7165e-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "7dcfdf79-19b4-4ea4-9b16-67d79aa7165e" (UID: "7dcfdf79-19b4-4ea4-9b16-67d79aa7165e"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:45:48 crc kubenswrapper[4781]: I1202 09:45:48.642059 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7dcfdf79-19b4-4ea4-9b16-67d79aa7165e-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:48 crc kubenswrapper[4781]: I1202 09:45:48.642104 4781 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dcfdf79-19b4-4ea4-9b16-67d79aa7165e-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:48 crc kubenswrapper[4781]: I1202 09:45:48.642119 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p44t6\" (UniqueName: \"kubernetes.io/projected/7dcfdf79-19b4-4ea4-9b16-67d79aa7165e-kube-api-access-p44t6\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:48 crc kubenswrapper[4781]: I1202 09:45:48.642132 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7dcfdf79-19b4-4ea4-9b16-67d79aa7165e-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:48 crc kubenswrapper[4781]: I1202 09:45:48.642144 4781 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7dcfdf79-19b4-4ea4-9b16-67d79aa7165e-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:48 crc kubenswrapper[4781]: I1202 09:45:48.642155 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dcfdf79-19b4-4ea4-9b16-67d79aa7165e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:49 crc kubenswrapper[4781]: I1202 09:45:49.108593 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67cff9c6-hg2l6" event={"ID":"7dcfdf79-19b4-4ea4-9b16-67d79aa7165e","Type":"ContainerDied","Data":"1ebbd24bae131584f2350f9b6736510ed7051f24c06f451acfb3476d5c35fd15"} Dec 02 09:45:49 crc kubenswrapper[4781]: I1202 09:45:49.108639 4781 scope.go:117] "RemoveContainer" containerID="e4f8d80de8d7fdd7aed53bd0df8818adc65f193e5a52808c1a72181a16906549" Dec 02 09:45:49 crc kubenswrapper[4781]: I1202 09:45:49.108641 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-67cff9c6-hg2l6" Dec 02 09:45:49 crc kubenswrapper[4781]: I1202 09:45:49.112736 4781 generic.go:334] "Generic (PLEG): container finished" podID="9418aedd-84eb-4ff9-89f8-831695e5471e" containerID="9dd76875a01226364271269d23abccb788961de68d9af76565e733a9a1496d2d" exitCode=0 Dec 02 09:45:49 crc kubenswrapper[4781]: I1202 09:45:49.112870 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-bq87j" event={"ID":"9418aedd-84eb-4ff9-89f8-831695e5471e","Type":"ContainerDied","Data":"9dd76875a01226364271269d23abccb788961de68d9af76565e733a9a1496d2d"} Dec 02 09:45:49 crc kubenswrapper[4781]: I1202 09:45:49.166715 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-67cff9c6-hg2l6"] Dec 02 09:45:49 crc kubenswrapper[4781]: I1202 09:45:49.176973 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-67cff9c6-hg2l6"] Dec 02 09:45:49 crc kubenswrapper[4781]: I1202 09:45:49.286534 4781 scope.go:117] "RemoveContainer" containerID="48b12156148830c711ba4ea9c2a408455c26ae2b9baec0f6f03c6f3d2a1699e3" Dec 02 09:45:49 crc kubenswrapper[4781]: I1202 09:45:49.428584 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-br5j5" Dec 02 09:45:49 crc kubenswrapper[4781]: I1202 09:45:49.511515 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dcfdf79-19b4-4ea4-9b16-67d79aa7165e" path="/var/lib/kubelet/pods/7dcfdf79-19b4-4ea4-9b16-67d79aa7165e/volumes" Dec 02 09:45:49 crc kubenswrapper[4781]: I1202 09:45:49.559757 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/51f721e2-e8fd-4ae1-89d9-fe7272e8246e-db-sync-config-data\") pod \"51f721e2-e8fd-4ae1-89d9-fe7272e8246e\" (UID: \"51f721e2-e8fd-4ae1-89d9-fe7272e8246e\") " Dec 02 09:45:49 crc kubenswrapper[4781]: I1202 09:45:49.559947 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51f721e2-e8fd-4ae1-89d9-fe7272e8246e-combined-ca-bundle\") pod \"51f721e2-e8fd-4ae1-89d9-fe7272e8246e\" (UID: \"51f721e2-e8fd-4ae1-89d9-fe7272e8246e\") " Dec 02 09:45:49 crc kubenswrapper[4781]: I1202 09:45:49.560115 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vq9zt\" (UniqueName: \"kubernetes.io/projected/51f721e2-e8fd-4ae1-89d9-fe7272e8246e-kube-api-access-vq9zt\") pod \"51f721e2-e8fd-4ae1-89d9-fe7272e8246e\" (UID: \"51f721e2-e8fd-4ae1-89d9-fe7272e8246e\") " Dec 02 09:45:49 crc kubenswrapper[4781]: I1202 09:45:49.579971 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51f721e2-e8fd-4ae1-89d9-fe7272e8246e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "51f721e2-e8fd-4ae1-89d9-fe7272e8246e" (UID: "51f721e2-e8fd-4ae1-89d9-fe7272e8246e"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:45:49 crc kubenswrapper[4781]: I1202 09:45:49.580022 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51f721e2-e8fd-4ae1-89d9-fe7272e8246e-kube-api-access-vq9zt" (OuterVolumeSpecName: "kube-api-access-vq9zt") pod "51f721e2-e8fd-4ae1-89d9-fe7272e8246e" (UID: "51f721e2-e8fd-4ae1-89d9-fe7272e8246e"). InnerVolumeSpecName "kube-api-access-vq9zt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:45:49 crc kubenswrapper[4781]: I1202 09:45:49.585241 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51f721e2-e8fd-4ae1-89d9-fe7272e8246e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "51f721e2-e8fd-4ae1-89d9-fe7272e8246e" (UID: "51f721e2-e8fd-4ae1-89d9-fe7272e8246e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:45:49 crc kubenswrapper[4781]: I1202 09:45:49.662239 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51f721e2-e8fd-4ae1-89d9-fe7272e8246e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:49 crc kubenswrapper[4781]: I1202 09:45:49.662276 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vq9zt\" (UniqueName: \"kubernetes.io/projected/51f721e2-e8fd-4ae1-89d9-fe7272e8246e-kube-api-access-vq9zt\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:49 crc kubenswrapper[4781]: I1202 09:45:49.662287 4781 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/51f721e2-e8fd-4ae1-89d9-fe7272e8246e-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.121181 4781 generic.go:334] "Generic (PLEG): container finished" podID="c8e968b8-6eb0-41d7-beed-8b2bf7006359" containerID="8bafab971d14d7d06b2d33414c80695d46381f1033247ba6a70d9c9f4e3f08d9" exitCode=0 Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.121283 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-mfnct" event={"ID":"c8e968b8-6eb0-41d7-beed-8b2bf7006359","Type":"ContainerDied","Data":"8bafab971d14d7d06b2d33414c80695d46381f1033247ba6a70d9c9f4e3f08d9"} Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.123372 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-br5j5" event={"ID":"51f721e2-e8fd-4ae1-89d9-fe7272e8246e","Type":"ContainerDied","Data":"a15abac7b995faeb5523e72dc55c2a15e6a85dcbb152674154c9d0429a767f15"} Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.123415 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a15abac7b995faeb5523e72dc55c2a15e6a85dcbb152674154c9d0429a767f15" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.123417 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-br5j5" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.363816 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-65bbfbf867-4wxpf"] Dec 02 09:45:50 crc kubenswrapper[4781]: E1202 09:45:50.364359 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27107564-9a0b-4a63-894d-4e0960abd435" containerName="registry-server" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.364377 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="27107564-9a0b-4a63-894d-4e0960abd435" containerName="registry-server" Dec 02 09:45:50 crc kubenswrapper[4781]: E1202 09:45:50.364397 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27107564-9a0b-4a63-894d-4e0960abd435" containerName="extract-utilities" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.364406 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="27107564-9a0b-4a63-894d-4e0960abd435" containerName="extract-utilities" Dec 02 09:45:50 crc kubenswrapper[4781]: E1202 09:45:50.364434 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dcfdf79-19b4-4ea4-9b16-67d79aa7165e" containerName="horizon-log" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.364443 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dcfdf79-19b4-4ea4-9b16-67d79aa7165e" containerName="horizon-log" Dec 02 09:45:50 crc kubenswrapper[4781]: E1202 09:45:50.364461 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27107564-9a0b-4a63-894d-4e0960abd435" containerName="extract-content" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.364471 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="27107564-9a0b-4a63-894d-4e0960abd435" containerName="extract-content" Dec 02 09:45:50 crc kubenswrapper[4781]: E1202 09:45:50.364487 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51f721e2-e8fd-4ae1-89d9-fe7272e8246e" containerName="barbican-db-sync" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.364496 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="51f721e2-e8fd-4ae1-89d9-fe7272e8246e" containerName="barbican-db-sync" Dec 02 09:45:50 crc kubenswrapper[4781]: E1202 09:45:50.364510 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dcfdf79-19b4-4ea4-9b16-67d79aa7165e" containerName="horizon" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.364519 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dcfdf79-19b4-4ea4-9b16-67d79aa7165e" containerName="horizon" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.364742 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="51f721e2-e8fd-4ae1-89d9-fe7272e8246e" containerName="barbican-db-sync" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.364761 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="27107564-9a0b-4a63-894d-4e0960abd435" containerName="registry-server" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.364790 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dcfdf79-19b4-4ea4-9b16-67d79aa7165e" containerName="horizon-log" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.364808 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dcfdf79-19b4-4ea4-9b16-67d79aa7165e" containerName="horizon" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.366057 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-65bbfbf867-4wxpf" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.373843 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-8phcb" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.374178 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.374442 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.380183 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5d776c757d-qgkmw"] Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.382020 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5d776c757d-qgkmw" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.387877 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.398244 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-65bbfbf867-4wxpf"] Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.416345 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5d776c757d-qgkmw"] Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.475202 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f86e03c1-1ef5-4d0a-9875-7bd123ebe7a1-config-data-custom\") pod \"barbican-keystone-listener-5d776c757d-qgkmw\" (UID: \"f86e03c1-1ef5-4d0a-9875-7bd123ebe7a1\") " pod="openstack/barbican-keystone-listener-5d776c757d-qgkmw" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.475276 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-557mf\" (UniqueName: \"kubernetes.io/projected/f86e03c1-1ef5-4d0a-9875-7bd123ebe7a1-kube-api-access-557mf\") pod \"barbican-keystone-listener-5d776c757d-qgkmw\" (UID: \"f86e03c1-1ef5-4d0a-9875-7bd123ebe7a1\") " pod="openstack/barbican-keystone-listener-5d776c757d-qgkmw" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.475335 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bac97a41-a2f4-46ee-b48c-216aeee03abc-config-data-custom\") pod \"barbican-worker-65bbfbf867-4wxpf\" (UID: \"bac97a41-a2f4-46ee-b48c-216aeee03abc\") " pod="openstack/barbican-worker-65bbfbf867-4wxpf" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.475363 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f86e03c1-1ef5-4d0a-9875-7bd123ebe7a1-logs\") pod \"barbican-keystone-listener-5d776c757d-qgkmw\" (UID: \"f86e03c1-1ef5-4d0a-9875-7bd123ebe7a1\") " pod="openstack/barbican-keystone-listener-5d776c757d-qgkmw" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.475388 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bac97a41-a2f4-46ee-b48c-216aeee03abc-config-data\") pod \"barbican-worker-65bbfbf867-4wxpf\" (UID: 
\"bac97a41-a2f4-46ee-b48c-216aeee03abc\") " pod="openstack/barbican-worker-65bbfbf867-4wxpf" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.475431 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bac97a41-a2f4-46ee-b48c-216aeee03abc-combined-ca-bundle\") pod \"barbican-worker-65bbfbf867-4wxpf\" (UID: \"bac97a41-a2f4-46ee-b48c-216aeee03abc\") " pod="openstack/barbican-worker-65bbfbf867-4wxpf" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.475469 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f86e03c1-1ef5-4d0a-9875-7bd123ebe7a1-combined-ca-bundle\") pod \"barbican-keystone-listener-5d776c757d-qgkmw\" (UID: \"f86e03c1-1ef5-4d0a-9875-7bd123ebe7a1\") " pod="openstack/barbican-keystone-listener-5d776c757d-qgkmw" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.475502 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlx2n\" (UniqueName: \"kubernetes.io/projected/bac97a41-a2f4-46ee-b48c-216aeee03abc-kube-api-access-jlx2n\") pod \"barbican-worker-65bbfbf867-4wxpf\" (UID: \"bac97a41-a2f4-46ee-b48c-216aeee03abc\") " pod="openstack/barbican-worker-65bbfbf867-4wxpf" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.475530 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bac97a41-a2f4-46ee-b48c-216aeee03abc-logs\") pod \"barbican-worker-65bbfbf867-4wxpf\" (UID: \"bac97a41-a2f4-46ee-b48c-216aeee03abc\") " pod="openstack/barbican-worker-65bbfbf867-4wxpf" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.475565 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f86e03c1-1ef5-4d0a-9875-7bd123ebe7a1-config-data\") pod \"barbican-keystone-listener-5d776c757d-qgkmw\" (UID: \"f86e03c1-1ef5-4d0a-9875-7bd123ebe7a1\") " pod="openstack/barbican-keystone-listener-5d776c757d-qgkmw" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.498507 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7c5b6578c5-9cvpk" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.510603 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7c5b6578c5-9cvpk" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.516176 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-p6vn9"] Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.521346 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-p6vn9" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.537989 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-bq87j" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.544025 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-p6vn9"] Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.609288 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-557mf\" (UniqueName: \"kubernetes.io/projected/f86e03c1-1ef5-4d0a-9875-7bd123ebe7a1-kube-api-access-557mf\") pod \"barbican-keystone-listener-5d776c757d-qgkmw\" (UID: \"f86e03c1-1ef5-4d0a-9875-7bd123ebe7a1\") " pod="openstack/barbican-keystone-listener-5d776c757d-qgkmw" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.609389 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bac97a41-a2f4-46ee-b48c-216aeee03abc-config-data-custom\") pod \"barbican-worker-65bbfbf867-4wxpf\" (UID: \"bac97a41-a2f4-46ee-b48c-216aeee03abc\") " pod="openstack/barbican-worker-65bbfbf867-4wxpf" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.609418 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f86e03c1-1ef5-4d0a-9875-7bd123ebe7a1-logs\") pod \"barbican-keystone-listener-5d776c757d-qgkmw\" (UID: \"f86e03c1-1ef5-4d0a-9875-7bd123ebe7a1\") " pod="openstack/barbican-keystone-listener-5d776c757d-qgkmw" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.609442 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bac97a41-a2f4-46ee-b48c-216aeee03abc-config-data\") pod \"barbican-worker-65bbfbf867-4wxpf\" (UID: \"bac97a41-a2f4-46ee-b48c-216aeee03abc\") " pod="openstack/barbican-worker-65bbfbf867-4wxpf" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.609573 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bac97a41-a2f4-46ee-b48c-216aeee03abc-combined-ca-bundle\") pod \"barbican-worker-65bbfbf867-4wxpf\" (UID: \"bac97a41-a2f4-46ee-b48c-216aeee03abc\") " pod="openstack/barbican-worker-65bbfbf867-4wxpf" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.609608 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f86e03c1-1ef5-4d0a-9875-7bd123ebe7a1-combined-ca-bundle\") pod \"barbican-keystone-listener-5d776c757d-qgkmw\" (UID: \"f86e03c1-1ef5-4d0a-9875-7bd123ebe7a1\") " pod="openstack/barbican-keystone-listener-5d776c757d-qgkmw" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.609641 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlx2n\" (UniqueName: \"kubernetes.io/projected/bac97a41-a2f4-46ee-b48c-216aeee03abc-kube-api-access-jlx2n\") pod \"barbican-worker-65bbfbf867-4wxpf\" (UID: \"bac97a41-a2f4-46ee-b48c-216aeee03abc\") " pod="openstack/barbican-worker-65bbfbf867-4wxpf" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.609669 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bac97a41-a2f4-46ee-b48c-216aeee03abc-logs\") pod \"barbican-worker-65bbfbf867-4wxpf\" (UID: \"bac97a41-a2f4-46ee-b48c-216aeee03abc\") " pod="openstack/barbican-worker-65bbfbf867-4wxpf" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.609717 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f86e03c1-1ef5-4d0a-9875-7bd123ebe7a1-config-data\") pod \"barbican-keystone-listener-5d776c757d-qgkmw\" (UID: \"f86e03c1-1ef5-4d0a-9875-7bd123ebe7a1\") " pod="openstack/barbican-keystone-listener-5d776c757d-qgkmw" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.609761 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f86e03c1-1ef5-4d0a-9875-7bd123ebe7a1-config-data-custom\") pod \"barbican-keystone-listener-5d776c757d-qgkmw\" (UID: \"f86e03c1-1ef5-4d0a-9875-7bd123ebe7a1\") " pod="openstack/barbican-keystone-listener-5d776c757d-qgkmw" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.618615 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bac97a41-a2f4-46ee-b48c-216aeee03abc-logs\") pod \"barbican-worker-65bbfbf867-4wxpf\" (UID: \"bac97a41-a2f4-46ee-b48c-216aeee03abc\") " pod="openstack/barbican-worker-65bbfbf867-4wxpf" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.619869 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f86e03c1-1ef5-4d0a-9875-7bd123ebe7a1-config-data-custom\") pod \"barbican-keystone-listener-5d776c757d-qgkmw\" (UID: \"f86e03c1-1ef5-4d0a-9875-7bd123ebe7a1\") " pod="openstack/barbican-keystone-listener-5d776c757d-qgkmw" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.620031 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f86e03c1-1ef5-4d0a-9875-7bd123ebe7a1-logs\") pod \"barbican-keystone-listener-5d776c757d-qgkmw\" (UID: \"f86e03c1-1ef5-4d0a-9875-7bd123ebe7a1\") " pod="openstack/barbican-keystone-listener-5d776c757d-qgkmw" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.625956 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f86e03c1-1ef5-4d0a-9875-7bd123ebe7a1-config-data\") pod \"barbican-keystone-listener-5d776c757d-qgkmw\" (UID: \"f86e03c1-1ef5-4d0a-9875-7bd123ebe7a1\") " pod="openstack/barbican-keystone-listener-5d776c757d-qgkmw" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.626489 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f86e03c1-1ef5-4d0a-9875-7bd123ebe7a1-combined-ca-bundle\") pod \"barbican-keystone-listener-5d776c757d-qgkmw\" (UID: \"f86e03c1-1ef5-4d0a-9875-7bd123ebe7a1\") " pod="openstack/barbican-keystone-listener-5d776c757d-qgkmw" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.628413 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bac97a41-a2f4-46ee-b48c-216aeee03abc-combined-ca-bundle\") pod \"barbican-worker-65bbfbf867-4wxpf\" (UID: \"bac97a41-a2f4-46ee-b48c-216aeee03abc\") " pod="openstack/barbican-worker-65bbfbf867-4wxpf" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.629780 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bac97a41-a2f4-46ee-b48c-216aeee03abc-config-data-custom\") pod \"barbican-worker-65bbfbf867-4wxpf\" (UID: \"bac97a41-a2f4-46ee-b48c-216aeee03abc\") " pod="openstack/barbican-worker-65bbfbf867-4wxpf" Dec 02 09:45:50 crc 
kubenswrapper[4781]: I1202 09:45:50.632778 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-557mf\" (UniqueName: \"kubernetes.io/projected/f86e03c1-1ef5-4d0a-9875-7bd123ebe7a1-kube-api-access-557mf\") pod \"barbican-keystone-listener-5d776c757d-qgkmw\" (UID: \"f86e03c1-1ef5-4d0a-9875-7bd123ebe7a1\") " pod="openstack/barbican-keystone-listener-5d776c757d-qgkmw" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.656658 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bac97a41-a2f4-46ee-b48c-216aeee03abc-config-data\") pod \"barbican-worker-65bbfbf867-4wxpf\" (UID: \"bac97a41-a2f4-46ee-b48c-216aeee03abc\") " pod="openstack/barbican-worker-65bbfbf867-4wxpf" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.663300 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlx2n\" (UniqueName: \"kubernetes.io/projected/bac97a41-a2f4-46ee-b48c-216aeee03abc-kube-api-access-jlx2n\") pod \"barbican-worker-65bbfbf867-4wxpf\" (UID: \"bac97a41-a2f4-46ee-b48c-216aeee03abc\") " pod="openstack/barbican-worker-65bbfbf867-4wxpf" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.711447 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9418aedd-84eb-4ff9-89f8-831695e5471e-etc-machine-id\") pod \"9418aedd-84eb-4ff9-89f8-831695e5471e\" (UID: \"9418aedd-84eb-4ff9-89f8-831695e5471e\") " Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.711623 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9418aedd-84eb-4ff9-89f8-831695e5471e-scripts\") pod \"9418aedd-84eb-4ff9-89f8-831695e5471e\" (UID: \"9418aedd-84eb-4ff9-89f8-831695e5471e\") " Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.711723 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9418aedd-84eb-4ff9-89f8-831695e5471e-db-sync-config-data\") pod \"9418aedd-84eb-4ff9-89f8-831695e5471e\" (UID: \"9418aedd-84eb-4ff9-89f8-831695e5471e\") " Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.711870 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9418aedd-84eb-4ff9-89f8-831695e5471e-combined-ca-bundle\") pod \"9418aedd-84eb-4ff9-89f8-831695e5471e\" (UID: \"9418aedd-84eb-4ff9-89f8-831695e5471e\") " Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.711913 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9418aedd-84eb-4ff9-89f8-831695e5471e-config-data\") pod \"9418aedd-84eb-4ff9-89f8-831695e5471e\" (UID: \"9418aedd-84eb-4ff9-89f8-831695e5471e\") " Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.712121 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4xvv\" (UniqueName: \"kubernetes.io/projected/9418aedd-84eb-4ff9-89f8-831695e5471e-kube-api-access-h4xvv\") pod \"9418aedd-84eb-4ff9-89f8-831695e5471e\" (UID: \"9418aedd-84eb-4ff9-89f8-831695e5471e\") " Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.712628 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/945f0634-f5c5-4471-abab-384d1c2a140c-dns-svc\") pod \"dnsmasq-dns-586bdc5f9-p6vn9\" (UID: \"945f0634-f5c5-4471-abab-384d1c2a140c\") " pod="openstack/dnsmasq-dns-586bdc5f9-p6vn9" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.712745 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/945f0634-f5c5-4471-abab-384d1c2a140c-config\") pod \"dnsmasq-dns-586bdc5f9-p6vn9\" (UID: \"945f0634-f5c5-4471-abab-384d1c2a140c\") " pod="openstack/dnsmasq-dns-586bdc5f9-p6vn9" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.713030 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/945f0634-f5c5-4471-abab-384d1c2a140c-ovsdbserver-sb\") pod \"dnsmasq-dns-586bdc5f9-p6vn9\" (UID: \"945f0634-f5c5-4471-abab-384d1c2a140c\") " pod="openstack/dnsmasq-dns-586bdc5f9-p6vn9" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.713091 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/945f0634-f5c5-4471-abab-384d1c2a140c-dns-swift-storage-0\") pod \"dnsmasq-dns-586bdc5f9-p6vn9\" (UID: \"945f0634-f5c5-4471-abab-384d1c2a140c\") " pod="openstack/dnsmasq-dns-586bdc5f9-p6vn9" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.713175 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/945f0634-f5c5-4471-abab-384d1c2a140c-ovsdbserver-nb\") pod \"dnsmasq-dns-586bdc5f9-p6vn9\" (UID: \"945f0634-f5c5-4471-abab-384d1c2a140c\") " pod="openstack/dnsmasq-dns-586bdc5f9-p6vn9" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.713303 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrc4h\" (UniqueName: \"kubernetes.io/projected/945f0634-f5c5-4471-abab-384d1c2a140c-kube-api-access-rrc4h\") pod \"dnsmasq-dns-586bdc5f9-p6vn9\" (UID: \"945f0634-f5c5-4471-abab-384d1c2a140c\") " pod="openstack/dnsmasq-dns-586bdc5f9-p6vn9" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.713635 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9418aedd-84eb-4ff9-89f8-831695e5471e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9418aedd-84eb-4ff9-89f8-831695e5471e" (UID: "9418aedd-84eb-4ff9-89f8-831695e5471e"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.720451 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-56b57b86b8-kf8n9"] Dec 02 09:45:50 crc kubenswrapper[4781]: E1202 09:45:50.720882 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9418aedd-84eb-4ff9-89f8-831695e5471e" containerName="cinder-db-sync" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.720896 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="9418aedd-84eb-4ff9-89f8-831695e5471e" containerName="cinder-db-sync" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.721123 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="9418aedd-84eb-4ff9-89f8-831695e5471e" containerName="cinder-db-sync" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.722234 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-56b57b86b8-kf8n9" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.723028 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9418aedd-84eb-4ff9-89f8-831695e5471e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "9418aedd-84eb-4ff9-89f8-831695e5471e" (UID: "9418aedd-84eb-4ff9-89f8-831695e5471e"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.729913 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.737308 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9418aedd-84eb-4ff9-89f8-831695e5471e-scripts" (OuterVolumeSpecName: "scripts") pod "9418aedd-84eb-4ff9-89f8-831695e5471e" (UID: "9418aedd-84eb-4ff9-89f8-831695e5471e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.739145 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9418aedd-84eb-4ff9-89f8-831695e5471e-kube-api-access-h4xvv" (OuterVolumeSpecName: "kube-api-access-h4xvv") pod "9418aedd-84eb-4ff9-89f8-831695e5471e" (UID: "9418aedd-84eb-4ff9-89f8-831695e5471e"). InnerVolumeSpecName "kube-api-access-h4xvv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.761072 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9418aedd-84eb-4ff9-89f8-831695e5471e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9418aedd-84eb-4ff9-89f8-831695e5471e" (UID: "9418aedd-84eb-4ff9-89f8-831695e5471e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.772007 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-56b57b86b8-kf8n9"] Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.803270 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9418aedd-84eb-4ff9-89f8-831695e5471e-config-data" (OuterVolumeSpecName: "config-data") pod "9418aedd-84eb-4ff9-89f8-831695e5471e" (UID: "9418aedd-84eb-4ff9-89f8-831695e5471e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.815129 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gw5b2\" (UniqueName: \"kubernetes.io/projected/22badd1d-b000-4870-86cd-1186fe2ed67d-kube-api-access-gw5b2\") pod \"barbican-api-56b57b86b8-kf8n9\" (UID: \"22badd1d-b000-4870-86cd-1186fe2ed67d\") " pod="openstack/barbican-api-56b57b86b8-kf8n9" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.815896 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/22badd1d-b000-4870-86cd-1186fe2ed67d-config-data-custom\") pod \"barbican-api-56b57b86b8-kf8n9\" (UID: \"22badd1d-b000-4870-86cd-1186fe2ed67d\") " pod="openstack/barbican-api-56b57b86b8-kf8n9" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.816031 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22badd1d-b000-4870-86cd-1186fe2ed67d-logs\") pod \"barbican-api-56b57b86b8-kf8n9\" (UID: \"22badd1d-b000-4870-86cd-1186fe2ed67d\") " pod="openstack/barbican-api-56b57b86b8-kf8n9" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.816084 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22badd1d-b000-4870-86cd-1186fe2ed67d-combined-ca-bundle\") pod \"barbican-api-56b57b86b8-kf8n9\" (UID: \"22badd1d-b000-4870-86cd-1186fe2ed67d\") " pod="openstack/barbican-api-56b57b86b8-kf8n9" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.816173 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/945f0634-f5c5-4471-abab-384d1c2a140c-ovsdbserver-sb\") pod \"dnsmasq-dns-586bdc5f9-p6vn9\" (UID: \"945f0634-f5c5-4471-abab-384d1c2a140c\") " pod="openstack/dnsmasq-dns-586bdc5f9-p6vn9" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.816194 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/945f0634-f5c5-4471-abab-384d1c2a140c-dns-swift-storage-0\") pod \"dnsmasq-dns-586bdc5f9-p6vn9\" (UID: \"945f0634-f5c5-4471-abab-384d1c2a140c\") " pod="openstack/dnsmasq-dns-586bdc5f9-p6vn9" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.816258 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/945f0634-f5c5-4471-abab-384d1c2a140c-ovsdbserver-nb\") pod \"dnsmasq-dns-586bdc5f9-p6vn9\" (UID: \"945f0634-f5c5-4471-abab-384d1c2a140c\") " pod="openstack/dnsmasq-dns-586bdc5f9-p6vn9" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.816319 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrc4h\" (UniqueName: \"kubernetes.io/projected/945f0634-f5c5-4471-abab-384d1c2a140c-kube-api-access-rrc4h\") pod \"dnsmasq-dns-586bdc5f9-p6vn9\" (UID: \"945f0634-f5c5-4471-abab-384d1c2a140c\") " pod="openstack/dnsmasq-dns-586bdc5f9-p6vn9" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.816353 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/945f0634-f5c5-4471-abab-384d1c2a140c-dns-svc\") pod \"dnsmasq-dns-586bdc5f9-p6vn9\" (UID: 
\"945f0634-f5c5-4471-abab-384d1c2a140c\") " pod="openstack/dnsmasq-dns-586bdc5f9-p6vn9" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.816402 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22badd1d-b000-4870-86cd-1186fe2ed67d-config-data\") pod \"barbican-api-56b57b86b8-kf8n9\" (UID: \"22badd1d-b000-4870-86cd-1186fe2ed67d\") " pod="openstack/barbican-api-56b57b86b8-kf8n9" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.816427 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/945f0634-f5c5-4471-abab-384d1c2a140c-config\") pod \"dnsmasq-dns-586bdc5f9-p6vn9\" (UID: \"945f0634-f5c5-4471-abab-384d1c2a140c\") " pod="openstack/dnsmasq-dns-586bdc5f9-p6vn9" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.816501 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9418aedd-84eb-4ff9-89f8-831695e5471e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.816513 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9418aedd-84eb-4ff9-89f8-831695e5471e-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.816544 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4xvv\" (UniqueName: \"kubernetes.io/projected/9418aedd-84eb-4ff9-89f8-831695e5471e-kube-api-access-h4xvv\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.816559 4781 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9418aedd-84eb-4ff9-89f8-831695e5471e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.816570 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9418aedd-84eb-4ff9-89f8-831695e5471e-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.816581 4781 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9418aedd-84eb-4ff9-89f8-831695e5471e-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.818260 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/945f0634-f5c5-4471-abab-384d1c2a140c-config\") pod \"dnsmasq-dns-586bdc5f9-p6vn9\" (UID: \"945f0634-f5c5-4471-abab-384d1c2a140c\") " pod="openstack/dnsmasq-dns-586bdc5f9-p6vn9" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.819110 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/945f0634-f5c5-4471-abab-384d1c2a140c-ovsdbserver-sb\") pod \"dnsmasq-dns-586bdc5f9-p6vn9\" (UID: \"945f0634-f5c5-4471-abab-384d1c2a140c\") " pod="openstack/dnsmasq-dns-586bdc5f9-p6vn9" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.820771 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/945f0634-f5c5-4471-abab-384d1c2a140c-dns-swift-storage-0\") pod \"dnsmasq-dns-586bdc5f9-p6vn9\" (UID: \"945f0634-f5c5-4471-abab-384d1c2a140c\") " 
pod="openstack/dnsmasq-dns-586bdc5f9-p6vn9" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.821555 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/945f0634-f5c5-4471-abab-384d1c2a140c-dns-svc\") pod \"dnsmasq-dns-586bdc5f9-p6vn9\" (UID: \"945f0634-f5c5-4471-abab-384d1c2a140c\") " pod="openstack/dnsmasq-dns-586bdc5f9-p6vn9" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.822366 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/945f0634-f5c5-4471-abab-384d1c2a140c-ovsdbserver-nb\") pod \"dnsmasq-dns-586bdc5f9-p6vn9\" (UID: \"945f0634-f5c5-4471-abab-384d1c2a140c\") " pod="openstack/dnsmasq-dns-586bdc5f9-p6vn9" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.822982 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-65bbfbf867-4wxpf" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.834797 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5d776c757d-qgkmw" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.841970 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrc4h\" (UniqueName: \"kubernetes.io/projected/945f0634-f5c5-4471-abab-384d1c2a140c-kube-api-access-rrc4h\") pod \"dnsmasq-dns-586bdc5f9-p6vn9\" (UID: \"945f0634-f5c5-4471-abab-384d1c2a140c\") " pod="openstack/dnsmasq-dns-586bdc5f9-p6vn9" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.879119 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-p6vn9" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.919007 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22badd1d-b000-4870-86cd-1186fe2ed67d-config-data\") pod \"barbican-api-56b57b86b8-kf8n9\" (UID: \"22badd1d-b000-4870-86cd-1186fe2ed67d\") " pod="openstack/barbican-api-56b57b86b8-kf8n9" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.919092 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gw5b2\" (UniqueName: \"kubernetes.io/projected/22badd1d-b000-4870-86cd-1186fe2ed67d-kube-api-access-gw5b2\") pod \"barbican-api-56b57b86b8-kf8n9\" (UID: \"22badd1d-b000-4870-86cd-1186fe2ed67d\") " pod="openstack/barbican-api-56b57b86b8-kf8n9" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.919144 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/22badd1d-b000-4870-86cd-1186fe2ed67d-config-data-custom\") pod \"barbican-api-56b57b86b8-kf8n9\" (UID: \"22badd1d-b000-4870-86cd-1186fe2ed67d\") " pod="openstack/barbican-api-56b57b86b8-kf8n9" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.919200 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22badd1d-b000-4870-86cd-1186fe2ed67d-logs\") pod \"barbican-api-56b57b86b8-kf8n9\" (UID: \"22badd1d-b000-4870-86cd-1186fe2ed67d\") " pod="openstack/barbican-api-56b57b86b8-kf8n9" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.919235 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/22badd1d-b000-4870-86cd-1186fe2ed67d-combined-ca-bundle\") pod \"barbican-api-56b57b86b8-kf8n9\" (UID: \"22badd1d-b000-4870-86cd-1186fe2ed67d\") " pod="openstack/barbican-api-56b57b86b8-kf8n9" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.921164 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22badd1d-b000-4870-86cd-1186fe2ed67d-logs\") pod \"barbican-api-56b57b86b8-kf8n9\" (UID: \"22badd1d-b000-4870-86cd-1186fe2ed67d\") " pod="openstack/barbican-api-56b57b86b8-kf8n9" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.925502 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22badd1d-b000-4870-86cd-1186fe2ed67d-combined-ca-bundle\") pod \"barbican-api-56b57b86b8-kf8n9\" (UID: \"22badd1d-b000-4870-86cd-1186fe2ed67d\") " pod="openstack/barbican-api-56b57b86b8-kf8n9" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.925936 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/22badd1d-b000-4870-86cd-1186fe2ed67d-config-data-custom\") pod \"barbican-api-56b57b86b8-kf8n9\" (UID: \"22badd1d-b000-4870-86cd-1186fe2ed67d\") " pod="openstack/barbican-api-56b57b86b8-kf8n9" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.927459 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22badd1d-b000-4870-86cd-1186fe2ed67d-config-data\") pod \"barbican-api-56b57b86b8-kf8n9\" (UID: \"22badd1d-b000-4870-86cd-1186fe2ed67d\") " pod="openstack/barbican-api-56b57b86b8-kf8n9" Dec 02 09:45:50 crc kubenswrapper[4781]: I1202 09:45:50.945477 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gw5b2\" (UniqueName: \"kubernetes.io/projected/22badd1d-b000-4870-86cd-1186fe2ed67d-kube-api-access-gw5b2\") pod \"barbican-api-56b57b86b8-kf8n9\" (UID: \"22badd1d-b000-4870-86cd-1186fe2ed67d\") " pod="openstack/barbican-api-56b57b86b8-kf8n9" Dec 02 09:45:51 crc kubenswrapper[4781]: I1202 09:45:51.097943 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-56b57b86b8-kf8n9" Dec 02 09:45:51 crc kubenswrapper[4781]: I1202 09:45:51.148705 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-bq87j" Dec 02 09:45:51 crc kubenswrapper[4781]: I1202 09:45:51.148728 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-bq87j" event={"ID":"9418aedd-84eb-4ff9-89f8-831695e5471e","Type":"ContainerDied","Data":"73cfe805e9a33f292e6d11a608d52bd590a3b4abc1f956f129f0e2fce57f5403"} Dec 02 09:45:51 crc kubenswrapper[4781]: I1202 09:45:51.148775 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73cfe805e9a33f292e6d11a608d52bd590a3b4abc1f956f129f0e2fce57f5403" Dec 02 09:45:51 crc kubenswrapper[4781]: I1202 09:45:51.414722 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-p6vn9"] Dec 02 09:45:51 crc kubenswrapper[4781]: I1202 09:45:51.421848 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-65bbfbf867-4wxpf"] Dec 02 09:45:51 crc kubenswrapper[4781]: I1202 09:45:51.548615 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5d776c757d-qgkmw"] Dec 02 09:45:51 crc kubenswrapper[4781]: E1202 09:45:51.648980 4781 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9418aedd_84eb_4ff9_89f8_831695e5471e.slice/crio-73cfe805e9a33f292e6d11a608d52bd590a3b4abc1f956f129f0e2fce57f5403\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9418aedd_84eb_4ff9_89f8_831695e5471e.slice\": RecentStats: unable to find data in memory cache]" Dec 02 09:45:51 crc kubenswrapper[4781]: I1202 09:45:51.660031 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 09:45:51 crc kubenswrapper[4781]: I1202 09:45:51.662603 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 09:45:51 crc kubenswrapper[4781]: I1202 09:45:51.673280 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 02 09:45:51 crc kubenswrapper[4781]: I1202 09:45:51.673547 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 02 09:45:51 crc kubenswrapper[4781]: I1202 09:45:51.673886 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-w4npc" Dec 02 09:45:51 crc kubenswrapper[4781]: I1202 09:45:51.674034 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 02 09:45:51 crc kubenswrapper[4781]: I1202 09:45:51.691148 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 09:45:51 crc kubenswrapper[4781]: I1202 09:45:51.783168 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-p6vn9"] Dec 02 09:45:51 crc kubenswrapper[4781]: W1202 09:45:51.821689 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22badd1d_b000_4870_86cd_1186fe2ed67d.slice/crio-69ee68c78ba6cbebf19b9143ef657384c5681395dafd1bd3dd6036a866728f06 WatchSource:0}: Error finding container 69ee68c78ba6cbebf19b9143ef657384c5681395dafd1bd3dd6036a866728f06: Status 404 returned error can't find the container with id 69ee68c78ba6cbebf19b9143ef657384c5681395dafd1bd3dd6036a866728f06 Dec 02 09:45:51 crc kubenswrapper[4781]: I1202 09:45:51.824468 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-56b57b86b8-kf8n9"] Dec 02 09:45:51 crc kubenswrapper[4781]: I1202 09:45:51.837046 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-795f4db4bc-ddg7b"] Dec 02 09:45:51 crc kubenswrapper[4781]: I1202 09:45:51.841570 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-795f4db4bc-ddg7b" Dec 02 09:45:51 crc kubenswrapper[4781]: I1202 09:45:51.843646 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-795f4db4bc-ddg7b"] Dec 02 09:45:51 crc kubenswrapper[4781]: I1202 09:45:51.865917 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2927c2fe-9023-4be6-a44a-6701f24c499a-config-data\") pod \"cinder-scheduler-0\" (UID: \"2927c2fe-9023-4be6-a44a-6701f24c499a\") " pod="openstack/cinder-scheduler-0" Dec 02 09:45:51 crc kubenswrapper[4781]: I1202 09:45:51.873744 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2927c2fe-9023-4be6-a44a-6701f24c499a-scripts\") pod \"cinder-scheduler-0\" (UID: \"2927c2fe-9023-4be6-a44a-6701f24c499a\") " pod="openstack/cinder-scheduler-0" Dec 02 09:45:51 crc kubenswrapper[4781]: I1202 09:45:51.874094 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2927c2fe-9023-4be6-a44a-6701f24c499a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2927c2fe-9023-4be6-a44a-6701f24c499a\") " pod="openstack/cinder-scheduler-0" Dec 02 09:45:51 crc kubenswrapper[4781]: I1202 09:45:51.874142 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2927c2fe-9023-4be6-a44a-6701f24c499a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2927c2fe-9023-4be6-a44a-6701f24c499a\") " pod="openstack/cinder-scheduler-0" Dec 02 09:45:51 crc kubenswrapper[4781]: I1202 09:45:51.874298 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2927c2fe-9023-4be6-a44a-6701f24c499a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2927c2fe-9023-4be6-a44a-6701f24c499a\") " pod="openstack/cinder-scheduler-0" Dec 02 09:45:51 crc kubenswrapper[4781]: I1202 09:45:51.879742 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrsc7\" (UniqueName: \"kubernetes.io/projected/2927c2fe-9023-4be6-a44a-6701f24c499a-kube-api-access-zrsc7\") pod \"cinder-scheduler-0\" (UID: \"2927c2fe-9023-4be6-a44a-6701f24c499a\") " pod="openstack/cinder-scheduler-0" Dec 02 09:45:51 crc kubenswrapper[4781]: I1202 09:45:51.916950 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-mfnct" Dec 02 09:45:51 crc kubenswrapper[4781]: I1202 09:45:51.919034 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 02 09:45:51 crc kubenswrapper[4781]: E1202 09:45:51.919513 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8e968b8-6eb0-41d7-beed-8b2bf7006359" containerName="neutron-db-sync" Dec 02 09:45:51 crc kubenswrapper[4781]: I1202 09:45:51.919536 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8e968b8-6eb0-41d7-beed-8b2bf7006359" containerName="neutron-db-sync" Dec 02 09:45:51 crc kubenswrapper[4781]: I1202 09:45:51.919744 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8e968b8-6eb0-41d7-beed-8b2bf7006359" containerName="neutron-db-sync" Dec 02 09:45:51 crc kubenswrapper[4781]: I1202 09:45:51.923854 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 02 09:45:51 crc kubenswrapper[4781]: I1202 09:45:51.927730 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 02 09:45:51 crc kubenswrapper[4781]: I1202 09:45:51.976692 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 02 09:45:51 crc kubenswrapper[4781]: I1202 09:45:51.984046 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x659g\" (UniqueName: \"kubernetes.io/projected/0b26b352-c249-411c-805f-8563bc99074a-kube-api-access-x659g\") pod \"dnsmasq-dns-795f4db4bc-ddg7b\" (UID: \"0b26b352-c249-411c-805f-8563bc99074a\") " pod="openstack/dnsmasq-dns-795f4db4bc-ddg7b" Dec 02 09:45:51 crc kubenswrapper[4781]: I1202 09:45:51.984148 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrsc7\" (UniqueName: \"kubernetes.io/projected/2927c2fe-9023-4be6-a44a-6701f24c499a-kube-api-access-zrsc7\") pod \"cinder-scheduler-0\" (UID: \"2927c2fe-9023-4be6-a44a-6701f24c499a\") " pod="openstack/cinder-scheduler-0" Dec 02 09:45:51 crc kubenswrapper[4781]: I1202 09:45:51.984210 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0b26b352-c249-411c-805f-8563bc99074a-ovsdbserver-sb\") pod \"dnsmasq-dns-795f4db4bc-ddg7b\" (UID: \"0b26b352-c249-411c-805f-8563bc99074a\") " pod="openstack/dnsmasq-dns-795f4db4bc-ddg7b" Dec 02 09:45:51 crc kubenswrapper[4781]: I1202 09:45:51.984243 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2927c2fe-9023-4be6-a44a-6701f24c499a-config-data\") pod \"cinder-scheduler-0\" (UID: \"2927c2fe-9023-4be6-a44a-6701f24c499a\") " pod="openstack/cinder-scheduler-0" Dec 02 09:45:51 crc kubenswrapper[4781]: I1202 09:45:51.984293 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2927c2fe-9023-4be6-a44a-6701f24c499a-scripts\") pod \"cinder-scheduler-0\" (UID: \"2927c2fe-9023-4be6-a44a-6701f24c499a\") " pod="openstack/cinder-scheduler-0" Dec 02 09:45:51 crc kubenswrapper[4781]: I1202 09:45:51.984360 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0b26b352-c249-411c-805f-8563bc99074a-dns-swift-storage-0\") pod \"dnsmasq-dns-795f4db4bc-ddg7b\" (UID: 
\"0b26b352-c249-411c-805f-8563bc99074a\") " pod="openstack/dnsmasq-dns-795f4db4bc-ddg7b" Dec 02 09:45:51 crc kubenswrapper[4781]: I1202 09:45:51.984456 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2927c2fe-9023-4be6-a44a-6701f24c499a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2927c2fe-9023-4be6-a44a-6701f24c499a\") " pod="openstack/cinder-scheduler-0" Dec 02 09:45:51 crc kubenswrapper[4781]: I1202 09:45:51.984506 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2927c2fe-9023-4be6-a44a-6701f24c499a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2927c2fe-9023-4be6-a44a-6701f24c499a\") " pod="openstack/cinder-scheduler-0" Dec 02 09:45:51 crc kubenswrapper[4781]: I1202 09:45:51.984543 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b26b352-c249-411c-805f-8563bc99074a-dns-svc\") pod \"dnsmasq-dns-795f4db4bc-ddg7b\" (UID: \"0b26b352-c249-411c-805f-8563bc99074a\") " pod="openstack/dnsmasq-dns-795f4db4bc-ddg7b" Dec 02 09:45:51 crc kubenswrapper[4781]: I1202 09:45:51.984586 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0b26b352-c249-411c-805f-8563bc99074a-ovsdbserver-nb\") pod \"dnsmasq-dns-795f4db4bc-ddg7b\" (UID: \"0b26b352-c249-411c-805f-8563bc99074a\") " pod="openstack/dnsmasq-dns-795f4db4bc-ddg7b" Dec 02 09:45:51 crc kubenswrapper[4781]: I1202 09:45:51.984627 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b26b352-c249-411c-805f-8563bc99074a-config\") pod \"dnsmasq-dns-795f4db4bc-ddg7b\" (UID: \"0b26b352-c249-411c-805f-8563bc99074a\") " pod="openstack/dnsmasq-dns-795f4db4bc-ddg7b" Dec 02 09:45:51 crc kubenswrapper[4781]: I1202 09:45:51.984654 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2927c2fe-9023-4be6-a44a-6701f24c499a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2927c2fe-9023-4be6-a44a-6701f24c499a\") " pod="openstack/cinder-scheduler-0" Dec 02 09:45:51 crc kubenswrapper[4781]: I1202 09:45:51.984836 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2927c2fe-9023-4be6-a44a-6701f24c499a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2927c2fe-9023-4be6-a44a-6701f24c499a\") " pod="openstack/cinder-scheduler-0" Dec 02 09:45:51 crc kubenswrapper[4781]: I1202 09:45:51.995470 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2927c2fe-9023-4be6-a44a-6701f24c499a-config-data\") pod \"cinder-scheduler-0\" (UID: \"2927c2fe-9023-4be6-a44a-6701f24c499a\") " pod="openstack/cinder-scheduler-0" Dec 02 09:45:51 crc kubenswrapper[4781]: I1202 09:45:51.997957 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2927c2fe-9023-4be6-a44a-6701f24c499a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2927c2fe-9023-4be6-a44a-6701f24c499a\") " pod="openstack/cinder-scheduler-0" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.002100 4781 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2927c2fe-9023-4be6-a44a-6701f24c499a-scripts\") pod \"cinder-scheduler-0\" (UID: \"2927c2fe-9023-4be6-a44a-6701f24c499a\") " pod="openstack/cinder-scheduler-0" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.011622 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2927c2fe-9023-4be6-a44a-6701f24c499a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2927c2fe-9023-4be6-a44a-6701f24c499a\") " pod="openstack/cinder-scheduler-0" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.013132 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrsc7\" (UniqueName: \"kubernetes.io/projected/2927c2fe-9023-4be6-a44a-6701f24c499a-kube-api-access-zrsc7\") pod \"cinder-scheduler-0\" (UID: \"2927c2fe-9023-4be6-a44a-6701f24c499a\") " pod="openstack/cinder-scheduler-0" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.046727 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.086214 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8e968b8-6eb0-41d7-beed-8b2bf7006359-combined-ca-bundle\") pod \"c8e968b8-6eb0-41d7-beed-8b2bf7006359\" (UID: \"c8e968b8-6eb0-41d7-beed-8b2bf7006359\") " Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.086442 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c8e968b8-6eb0-41d7-beed-8b2bf7006359-config\") pod \"c8e968b8-6eb0-41d7-beed-8b2bf7006359\" (UID: \"c8e968b8-6eb0-41d7-beed-8b2bf7006359\") " Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.086519 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgk85\" (UniqueName: \"kubernetes.io/projected/c8e968b8-6eb0-41d7-beed-8b2bf7006359-kube-api-access-wgk85\") pod \"c8e968b8-6eb0-41d7-beed-8b2bf7006359\" (UID: \"c8e968b8-6eb0-41d7-beed-8b2bf7006359\") " Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.086882 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x659g\" (UniqueName: \"kubernetes.io/projected/0b26b352-c249-411c-805f-8563bc99074a-kube-api-access-x659g\") pod \"dnsmasq-dns-795f4db4bc-ddg7b\" (UID: \"0b26b352-c249-411c-805f-8563bc99074a\") " pod="openstack/dnsmasq-dns-795f4db4bc-ddg7b" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.086955 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b315a4b-f4f2-4066-a945-f168e5f76f49-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0b315a4b-f4f2-4066-a945-f168e5f76f49\") " pod="openstack/cinder-api-0" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.087006 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b315a4b-f4f2-4066-a945-f168e5f76f49-scripts\") pod \"cinder-api-0\" (UID: \"0b315a4b-f4f2-4066-a945-f168e5f76f49\") " pod="openstack/cinder-api-0" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.087034 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0b26b352-c249-411c-805f-8563bc99074a-ovsdbserver-sb\") pod \"dnsmasq-dns-795f4db4bc-ddg7b\" (UID: \"0b26b352-c249-411c-805f-8563bc99074a\") " pod="openstack/dnsmasq-dns-795f4db4bc-ddg7b" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.087091 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0b315a4b-f4f2-4066-a945-f168e5f76f49-config-data-custom\") pod \"cinder-api-0\" (UID: \"0b315a4b-f4f2-4066-a945-f168e5f76f49\") " pod="openstack/cinder-api-0" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.087126 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0b26b352-c249-411c-805f-8563bc99074a-dns-swift-storage-0\") pod \"dnsmasq-dns-795f4db4bc-ddg7b\" (UID: \"0b26b352-c249-411c-805f-8563bc99074a\") " pod="openstack/dnsmasq-dns-795f4db4bc-ddg7b" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.087198 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b315a4b-f4f2-4066-a945-f168e5f76f49-logs\") pod \"cinder-api-0\" (UID: \"0b315a4b-f4f2-4066-a945-f168e5f76f49\") " pod="openstack/cinder-api-0" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.087261 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fqsl\" (UniqueName: \"kubernetes.io/projected/0b315a4b-f4f2-4066-a945-f168e5f76f49-kube-api-access-6fqsl\") pod \"cinder-api-0\" (UID: \"0b315a4b-f4f2-4066-a945-f168e5f76f49\") " pod="openstack/cinder-api-0" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.087314 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b26b352-c249-411c-805f-8563bc99074a-dns-svc\") pod \"dnsmasq-dns-795f4db4bc-ddg7b\" (UID: \"0b26b352-c249-411c-805f-8563bc99074a\") " pod="openstack/dnsmasq-dns-795f4db4bc-ddg7b" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.087346 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0b26b352-c249-411c-805f-8563bc99074a-ovsdbserver-nb\") pod \"dnsmasq-dns-795f4db4bc-ddg7b\" (UID: \"0b26b352-c249-411c-805f-8563bc99074a\") " pod="openstack/dnsmasq-dns-795f4db4bc-ddg7b" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.087397 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b26b352-c249-411c-805f-8563bc99074a-config\") pod \"dnsmasq-dns-795f4db4bc-ddg7b\" (UID: \"0b26b352-c249-411c-805f-8563bc99074a\") " pod="openstack/dnsmasq-dns-795f4db4bc-ddg7b" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.087426 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b315a4b-f4f2-4066-a945-f168e5f76f49-config-data\") pod \"cinder-api-0\" (UID: \"0b315a4b-f4f2-4066-a945-f168e5f76f49\") " pod="openstack/cinder-api-0" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.087663 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0b315a4b-f4f2-4066-a945-f168e5f76f49-etc-machine-id\") pod 
\"cinder-api-0\" (UID: \"0b315a4b-f4f2-4066-a945-f168e5f76f49\") " pod="openstack/cinder-api-0" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.090837 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b26b352-c249-411c-805f-8563bc99074a-dns-svc\") pod \"dnsmasq-dns-795f4db4bc-ddg7b\" (UID: \"0b26b352-c249-411c-805f-8563bc99074a\") " pod="openstack/dnsmasq-dns-795f4db4bc-ddg7b" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.091389 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0b26b352-c249-411c-805f-8563bc99074a-dns-swift-storage-0\") pod \"dnsmasq-dns-795f4db4bc-ddg7b\" (UID: \"0b26b352-c249-411c-805f-8563bc99074a\") " pod="openstack/dnsmasq-dns-795f4db4bc-ddg7b" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.092062 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b26b352-c249-411c-805f-8563bc99074a-config\") pod \"dnsmasq-dns-795f4db4bc-ddg7b\" (UID: \"0b26b352-c249-411c-805f-8563bc99074a\") " pod="openstack/dnsmasq-dns-795f4db4bc-ddg7b" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.092698 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0b26b352-c249-411c-805f-8563bc99074a-ovsdbserver-nb\") pod \"dnsmasq-dns-795f4db4bc-ddg7b\" (UID: \"0b26b352-c249-411c-805f-8563bc99074a\") " pod="openstack/dnsmasq-dns-795f4db4bc-ddg7b" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.099521 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8e968b8-6eb0-41d7-beed-8b2bf7006359-kube-api-access-wgk85" (OuterVolumeSpecName: "kube-api-access-wgk85") pod "c8e968b8-6eb0-41d7-beed-8b2bf7006359" (UID: "c8e968b8-6eb0-41d7-beed-8b2bf7006359"). InnerVolumeSpecName "kube-api-access-wgk85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.100833 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0b26b352-c249-411c-805f-8563bc99074a-ovsdbserver-sb\") pod \"dnsmasq-dns-795f4db4bc-ddg7b\" (UID: \"0b26b352-c249-411c-805f-8563bc99074a\") " pod="openstack/dnsmasq-dns-795f4db4bc-ddg7b" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.124437 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x659g\" (UniqueName: \"kubernetes.io/projected/0b26b352-c249-411c-805f-8563bc99074a-kube-api-access-x659g\") pod \"dnsmasq-dns-795f4db4bc-ddg7b\" (UID: \"0b26b352-c249-411c-805f-8563bc99074a\") " pod="openstack/dnsmasq-dns-795f4db4bc-ddg7b" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.124664 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8e968b8-6eb0-41d7-beed-8b2bf7006359-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c8e968b8-6eb0-41d7-beed-8b2bf7006359" (UID: "c8e968b8-6eb0-41d7-beed-8b2bf7006359"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.135016 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8e968b8-6eb0-41d7-beed-8b2bf7006359-config" (OuterVolumeSpecName: "config") pod "c8e968b8-6eb0-41d7-beed-8b2bf7006359" (UID: "c8e968b8-6eb0-41d7-beed-8b2bf7006359"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.166452 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5d776c757d-qgkmw" event={"ID":"f86e03c1-1ef5-4d0a-9875-7bd123ebe7a1","Type":"ContainerStarted","Data":"b010733d692a895be3e8e35f56a9a949ceab22967f43b5d831c4faf7c05a3f10"} Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.183428 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-mfnct" event={"ID":"c8e968b8-6eb0-41d7-beed-8b2bf7006359","Type":"ContainerDied","Data":"450668e11f874cde123e0ecc7db9220f415c2f862de9d544964ac18f52537166"} Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.183470 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="450668e11f874cde123e0ecc7db9220f415c2f862de9d544964ac18f52537166" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.183614 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-mfnct" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.189198 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56b57b86b8-kf8n9" event={"ID":"22badd1d-b000-4870-86cd-1186fe2ed67d","Type":"ContainerStarted","Data":"69ee68c78ba6cbebf19b9143ef657384c5681395dafd1bd3dd6036a866728f06"} Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.189260 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b315a4b-f4f2-4066-a945-f168e5f76f49-config-data\") pod \"cinder-api-0\" (UID: \"0b315a4b-f4f2-4066-a945-f168e5f76f49\") " pod="openstack/cinder-api-0" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.189819 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0b315a4b-f4f2-4066-a945-f168e5f76f49-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0b315a4b-f4f2-4066-a945-f168e5f76f49\") " pod="openstack/cinder-api-0" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.189959 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0b315a4b-f4f2-4066-a945-f168e5f76f49-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0b315a4b-f4f2-4066-a945-f168e5f76f49\") " pod="openstack/cinder-api-0" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.190066 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b315a4b-f4f2-4066-a945-f168e5f76f49-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0b315a4b-f4f2-4066-a945-f168e5f76f49\") " pod="openstack/cinder-api-0" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.190208 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b315a4b-f4f2-4066-a945-f168e5f76f49-scripts\") pod \"cinder-api-0\" (UID: \"0b315a4b-f4f2-4066-a945-f168e5f76f49\") " 
pod="openstack/cinder-api-0" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.190374 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-65bbfbf867-4wxpf" event={"ID":"bac97a41-a2f4-46ee-b48c-216aeee03abc","Type":"ContainerStarted","Data":"ef30fb2205646c2639528d61e7f19600f37980c6f959d06de735300c57e15798"} Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.190525 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0b315a4b-f4f2-4066-a945-f168e5f76f49-config-data-custom\") pod \"cinder-api-0\" (UID: \"0b315a4b-f4f2-4066-a945-f168e5f76f49\") " pod="openstack/cinder-api-0" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.190775 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b315a4b-f4f2-4066-a945-f168e5f76f49-logs\") pod \"cinder-api-0\" (UID: \"0b315a4b-f4f2-4066-a945-f168e5f76f49\") " pod="openstack/cinder-api-0" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.190951 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fqsl\" (UniqueName: \"kubernetes.io/projected/0b315a4b-f4f2-4066-a945-f168e5f76f49-kube-api-access-6fqsl\") pod \"cinder-api-0\" (UID: \"0b315a4b-f4f2-4066-a945-f168e5f76f49\") " pod="openstack/cinder-api-0" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.191164 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8e968b8-6eb0-41d7-beed-8b2bf7006359-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.191264 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c8e968b8-6eb0-41d7-beed-8b2bf7006359-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.191346 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgk85\" (UniqueName: \"kubernetes.io/projected/c8e968b8-6eb0-41d7-beed-8b2bf7006359-kube-api-access-wgk85\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.192321 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b315a4b-f4f2-4066-a945-f168e5f76f49-logs\") pod \"cinder-api-0\" (UID: \"0b315a4b-f4f2-4066-a945-f168e5f76f49\") " pod="openstack/cinder-api-0" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.192645 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-795f4db4bc-ddg7b" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.193149 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-p6vn9" event={"ID":"945f0634-f5c5-4471-abab-384d1c2a140c","Type":"ContainerStarted","Data":"9212d7d8ff55ba929c2e769cabdfc04fd8a31b0941647d2b79da1caef8fbedc8"} Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.193433 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b315a4b-f4f2-4066-a945-f168e5f76f49-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0b315a4b-f4f2-4066-a945-f168e5f76f49\") " pod="openstack/cinder-api-0" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.200731 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b315a4b-f4f2-4066-a945-f168e5f76f49-scripts\") pod \"cinder-api-0\" (UID: \"0b315a4b-f4f2-4066-a945-f168e5f76f49\") " pod="openstack/cinder-api-0" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.205141 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b315a4b-f4f2-4066-a945-f168e5f76f49-config-data\") pod \"cinder-api-0\" (UID: \"0b315a4b-f4f2-4066-a945-f168e5f76f49\") " pod="openstack/cinder-api-0" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.209975 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0b315a4b-f4f2-4066-a945-f168e5f76f49-config-data-custom\") pod \"cinder-api-0\" (UID: \"0b315a4b-f4f2-4066-a945-f168e5f76f49\") " pod="openstack/cinder-api-0" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.212217 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fqsl\" (UniqueName: \"kubernetes.io/projected/0b315a4b-f4f2-4066-a945-f168e5f76f49-kube-api-access-6fqsl\") pod \"cinder-api-0\" (UID: \"0b315a4b-f4f2-4066-a945-f168e5f76f49\") " pod="openstack/cinder-api-0" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.276277 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.363727 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-795f4db4bc-ddg7b"] Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.382248 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-lqtxt"] Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.384030 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-lqtxt" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.398375 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-lqtxt"] Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.496760 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4007cb57-06af-4119-bf9e-a2e2aa8509cd-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-lqtxt\" (UID: \"4007cb57-06af-4119-bf9e-a2e2aa8509cd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lqtxt" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.497061 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwgpg\" (UniqueName: \"kubernetes.io/projected/4007cb57-06af-4119-bf9e-a2e2aa8509cd-kube-api-access-pwgpg\") pod \"dnsmasq-dns-5c9776ccc5-lqtxt\" (UID: \"4007cb57-06af-4119-bf9e-a2e2aa8509cd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lqtxt" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.497118 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4007cb57-06af-4119-bf9e-a2e2aa8509cd-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-lqtxt\" (UID: \"4007cb57-06af-4119-bf9e-a2e2aa8509cd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lqtxt" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.497141 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4007cb57-06af-4119-bf9e-a2e2aa8509cd-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-lqtxt\" (UID: \"4007cb57-06af-4119-bf9e-a2e2aa8509cd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lqtxt" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.497166 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4007cb57-06af-4119-bf9e-a2e2aa8509cd-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-lqtxt\" (UID: \"4007cb57-06af-4119-bf9e-a2e2aa8509cd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lqtxt" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.497189 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4007cb57-06af-4119-bf9e-a2e2aa8509cd-config\") pod \"dnsmasq-dns-5c9776ccc5-lqtxt\" (UID: \"4007cb57-06af-4119-bf9e-a2e2aa8509cd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lqtxt" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.562568 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.598984 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4007cb57-06af-4119-bf9e-a2e2aa8509cd-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-lqtxt\" (UID: \"4007cb57-06af-4119-bf9e-a2e2aa8509cd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lqtxt" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.599029 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwgpg\" (UniqueName: \"kubernetes.io/projected/4007cb57-06af-4119-bf9e-a2e2aa8509cd-kube-api-access-pwgpg\") pod \"dnsmasq-dns-5c9776ccc5-lqtxt\" (UID: 
\"4007cb57-06af-4119-bf9e-a2e2aa8509cd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lqtxt" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.599103 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4007cb57-06af-4119-bf9e-a2e2aa8509cd-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-lqtxt\" (UID: \"4007cb57-06af-4119-bf9e-a2e2aa8509cd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lqtxt" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.599124 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4007cb57-06af-4119-bf9e-a2e2aa8509cd-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-lqtxt\" (UID: \"4007cb57-06af-4119-bf9e-a2e2aa8509cd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lqtxt" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.599151 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4007cb57-06af-4119-bf9e-a2e2aa8509cd-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-lqtxt\" (UID: \"4007cb57-06af-4119-bf9e-a2e2aa8509cd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lqtxt" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.599183 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4007cb57-06af-4119-bf9e-a2e2aa8509cd-config\") pod \"dnsmasq-dns-5c9776ccc5-lqtxt\" (UID: \"4007cb57-06af-4119-bf9e-a2e2aa8509cd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lqtxt" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.600381 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4007cb57-06af-4119-bf9e-a2e2aa8509cd-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-lqtxt\" (UID: \"4007cb57-06af-4119-bf9e-a2e2aa8509cd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lqtxt" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.600570 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4007cb57-06af-4119-bf9e-a2e2aa8509cd-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-lqtxt\" (UID: \"4007cb57-06af-4119-bf9e-a2e2aa8509cd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lqtxt" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.601082 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4007cb57-06af-4119-bf9e-a2e2aa8509cd-config\") pod \"dnsmasq-dns-5c9776ccc5-lqtxt\" (UID: \"4007cb57-06af-4119-bf9e-a2e2aa8509cd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lqtxt" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.604235 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4007cb57-06af-4119-bf9e-a2e2aa8509cd-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-lqtxt\" (UID: \"4007cb57-06af-4119-bf9e-a2e2aa8509cd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lqtxt" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.604773 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4007cb57-06af-4119-bf9e-a2e2aa8509cd-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-lqtxt\" (UID: \"4007cb57-06af-4119-bf9e-a2e2aa8509cd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lqtxt" Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 
09:45:52.618188 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwgpg\" (UniqueName: \"kubernetes.io/projected/4007cb57-06af-4119-bf9e-a2e2aa8509cd-kube-api-access-pwgpg\") pod \"dnsmasq-dns-5c9776ccc5-lqtxt\" (UID: \"4007cb57-06af-4119-bf9e-a2e2aa8509cd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-lqtxt"
Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.702086 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-77d7d4d6c6-nc7g8"]
Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.704012 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-77d7d4d6c6-nc7g8"
Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.707304 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.714662 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs"
Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.714835 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-ln2qp"
Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.714960 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.732052 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-77d7d4d6c6-nc7g8"]
Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.774436 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-lqtxt"
Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.806941 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1b2440b7-d6a5-426e-ac38-2f5fd531935b-config\") pod \"neutron-77d7d4d6c6-nc7g8\" (UID: \"1b2440b7-d6a5-426e-ac38-2f5fd531935b\") " pod="openstack/neutron-77d7d4d6c6-nc7g8"
Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.807002 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp9f2\" (UniqueName: \"kubernetes.io/projected/1b2440b7-d6a5-426e-ac38-2f5fd531935b-kube-api-access-pp9f2\") pod \"neutron-77d7d4d6c6-nc7g8\" (UID: \"1b2440b7-d6a5-426e-ac38-2f5fd531935b\") " pod="openstack/neutron-77d7d4d6c6-nc7g8"
Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.807033 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b2440b7-d6a5-426e-ac38-2f5fd531935b-ovndb-tls-certs\") pod \"neutron-77d7d4d6c6-nc7g8\" (UID: \"1b2440b7-d6a5-426e-ac38-2f5fd531935b\") " pod="openstack/neutron-77d7d4d6c6-nc7g8"
Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.807062 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b2440b7-d6a5-426e-ac38-2f5fd531935b-combined-ca-bundle\") pod \"neutron-77d7d4d6c6-nc7g8\" (UID: \"1b2440b7-d6a5-426e-ac38-2f5fd531935b\") " pod="openstack/neutron-77d7d4d6c6-nc7g8"
Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.807106 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1b2440b7-d6a5-426e-ac38-2f5fd531935b-httpd-config\") pod \"neutron-77d7d4d6c6-nc7g8\" (UID: \"1b2440b7-d6a5-426e-ac38-2f5fd531935b\") " pod="openstack/neutron-77d7d4d6c6-nc7g8"
Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.848731 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-795f4db4bc-ddg7b"]
Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.908341 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1b2440b7-d6a5-426e-ac38-2f5fd531935b-httpd-config\") pod \"neutron-77d7d4d6c6-nc7g8\" (UID: \"1b2440b7-d6a5-426e-ac38-2f5fd531935b\") " pod="openstack/neutron-77d7d4d6c6-nc7g8"
Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.908785 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1b2440b7-d6a5-426e-ac38-2f5fd531935b-config\") pod \"neutron-77d7d4d6c6-nc7g8\" (UID: \"1b2440b7-d6a5-426e-ac38-2f5fd531935b\") " pod="openstack/neutron-77d7d4d6c6-nc7g8"
Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.908827 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pp9f2\" (UniqueName: \"kubernetes.io/projected/1b2440b7-d6a5-426e-ac38-2f5fd531935b-kube-api-access-pp9f2\") pod \"neutron-77d7d4d6c6-nc7g8\" (UID: \"1b2440b7-d6a5-426e-ac38-2f5fd531935b\") " pod="openstack/neutron-77d7d4d6c6-nc7g8"
Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.908848 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b2440b7-d6a5-426e-ac38-2f5fd531935b-ovndb-tls-certs\") pod \"neutron-77d7d4d6c6-nc7g8\" (UID: \"1b2440b7-d6a5-426e-ac38-2f5fd531935b\") " pod="openstack/neutron-77d7d4d6c6-nc7g8"
Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.908873 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b2440b7-d6a5-426e-ac38-2f5fd531935b-combined-ca-bundle\") pod \"neutron-77d7d4d6c6-nc7g8\" (UID: \"1b2440b7-d6a5-426e-ac38-2f5fd531935b\") " pod="openstack/neutron-77d7d4d6c6-nc7g8"
Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.918640 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1b2440b7-d6a5-426e-ac38-2f5fd531935b-httpd-config\") pod \"neutron-77d7d4d6c6-nc7g8\" (UID: \"1b2440b7-d6a5-426e-ac38-2f5fd531935b\") " pod="openstack/neutron-77d7d4d6c6-nc7g8"
Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.919072 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b2440b7-d6a5-426e-ac38-2f5fd531935b-ovndb-tls-certs\") pod \"neutron-77d7d4d6c6-nc7g8\" (UID: \"1b2440b7-d6a5-426e-ac38-2f5fd531935b\") " pod="openstack/neutron-77d7d4d6c6-nc7g8"
Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.919147 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1b2440b7-d6a5-426e-ac38-2f5fd531935b-config\") pod \"neutron-77d7d4d6c6-nc7g8\" (UID: \"1b2440b7-d6a5-426e-ac38-2f5fd531935b\") " pod="openstack/neutron-77d7d4d6c6-nc7g8"
Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.926381 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b2440b7-d6a5-426e-ac38-2f5fd531935b-combined-ca-bundle\") pod \"neutron-77d7d4d6c6-nc7g8\" (UID: \"1b2440b7-d6a5-426e-ac38-2f5fd531935b\") " pod="openstack/neutron-77d7d4d6c6-nc7g8"
Dec 02 09:45:52 crc kubenswrapper[4781]: I1202 09:45:52.930160 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp9f2\" (UniqueName: \"kubernetes.io/projected/1b2440b7-d6a5-426e-ac38-2f5fd531935b-kube-api-access-pp9f2\") pod \"neutron-77d7d4d6c6-nc7g8\" (UID: \"1b2440b7-d6a5-426e-ac38-2f5fd531935b\") " pod="openstack/neutron-77d7d4d6c6-nc7g8"
Dec 02 09:45:53 crc kubenswrapper[4781]: I1202 09:45:53.017725 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Dec 02 09:45:53 crc kubenswrapper[4781]: I1202 09:45:53.029300 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-77d7d4d6c6-nc7g8"
Dec 02 09:45:53 crc kubenswrapper[4781]: I1202 09:45:53.217022 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-795f4db4bc-ddg7b" event={"ID":"0b26b352-c249-411c-805f-8563bc99074a","Type":"ContainerStarted","Data":"80e252f45073e012320cb6f6c16aa8413825ddb931b440b908cbe8913d0b01eb"}
Dec 02 09:45:53 crc kubenswrapper[4781]: I1202 09:45:53.222944 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2927c2fe-9023-4be6-a44a-6701f24c499a","Type":"ContainerStarted","Data":"6c44baf29ed84225970b39a0e80b27f597e83f878d44440311ddad541b405c0d"}
Dec 02 09:45:53 crc kubenswrapper[4781]: I1202 09:45:53.225412 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0b315a4b-f4f2-4066-a945-f168e5f76f49","Type":"ContainerStarted","Data":"6d86653b82b2c928643a0fa7db72afdc6b980f9fc09bcce930097858dacb94e4"}
Dec 02 09:45:53 crc kubenswrapper[4781]: I1202 09:45:53.294555 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-lqtxt"]
Dec 02 09:45:53 crc kubenswrapper[4781]: W1202 09:45:53.300111 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4007cb57_06af_4119_bf9e_a2e2aa8509cd.slice/crio-37e2ce29ae4b30ec8644dbb833531a95ede00720d3361708dcfb876227092abe WatchSource:0}: Error finding container 37e2ce29ae4b30ec8644dbb833531a95ede00720d3361708dcfb876227092abe: Status 404 returned error can't find the container with id 37e2ce29ae4b30ec8644dbb833531a95ede00720d3361708dcfb876227092abe
Dec 02 09:45:53 crc kubenswrapper[4781]: I1202 09:45:53.660567 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-77d7d4d6c6-nc7g8"]
Dec 02 09:45:53 crc kubenswrapper[4781]: W1202 09:45:53.668307 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b2440b7_d6a5_426e_ac38_2f5fd531935b.slice/crio-16ec681a6edfab8763b60f5c2859f4f379b359aa19021a312b58e8a9e5938c41 WatchSource:0}: Error finding container 16ec681a6edfab8763b60f5c2859f4f379b359aa19021a312b58e8a9e5938c41: Status 404 returned error can't find the container with id 16ec681a6edfab8763b60f5c2859f4f379b359aa19021a312b58e8a9e5938c41
event={"ID":"1b2440b7-d6a5-426e-ac38-2f5fd531935b","Type":"ContainerStarted","Data":"82211f6afee1c1af66fd6f8198a3ea506b21910b79cc1625c9807ff23dac2d98"} Dec 02 09:45:54 crc kubenswrapper[4781]: I1202 09:45:54.270982 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77d7d4d6c6-nc7g8" event={"ID":"1b2440b7-d6a5-426e-ac38-2f5fd531935b","Type":"ContainerStarted","Data":"16ec681a6edfab8763b60f5c2859f4f379b359aa19021a312b58e8a9e5938c41"} Dec 02 09:45:54 crc kubenswrapper[4781]: I1202 09:45:54.278777 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 02 09:45:54 crc kubenswrapper[4781]: I1202 09:45:54.280471 4781 generic.go:334] "Generic (PLEG): container finished" podID="945f0634-f5c5-4471-abab-384d1c2a140c" containerID="dfd518dbd8a81595dd86b70d2ea5083d61f7da85873d40d99b771e9627aa115d" exitCode=0 Dec 02 09:45:54 crc kubenswrapper[4781]: I1202 09:45:54.280679 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-p6vn9" event={"ID":"945f0634-f5c5-4471-abab-384d1c2a140c","Type":"ContainerDied","Data":"dfd518dbd8a81595dd86b70d2ea5083d61f7da85873d40d99b771e9627aa115d"} Dec 02 09:45:54 crc kubenswrapper[4781]: I1202 09:45:54.293591 4781 generic.go:334] "Generic (PLEG): container finished" podID="4007cb57-06af-4119-bf9e-a2e2aa8509cd" containerID="2f87f7693723109ba430fcd9f33232231cceb212e84d31e5fc709dc6e794e5ae" exitCode=0 Dec 02 09:45:54 crc kubenswrapper[4781]: I1202 09:45:54.293873 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-lqtxt" event={"ID":"4007cb57-06af-4119-bf9e-a2e2aa8509cd","Type":"ContainerDied","Data":"2f87f7693723109ba430fcd9f33232231cceb212e84d31e5fc709dc6e794e5ae"} Dec 02 09:45:54 crc kubenswrapper[4781]: I1202 09:45:54.293933 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-lqtxt" event={"ID":"4007cb57-06af-4119-bf9e-a2e2aa8509cd","Type":"ContainerStarted","Data":"37e2ce29ae4b30ec8644dbb833531a95ede00720d3361708dcfb876227092abe"} Dec 02 09:45:54 crc kubenswrapper[4781]: I1202 09:45:54.316615 4781 generic.go:334] "Generic (PLEG): container finished" podID="0b26b352-c249-411c-805f-8563bc99074a" containerID="93421a2cf45d7ea85c37d00b13d45c44fc7324840cd99750e14606b07ed73812" exitCode=0 Dec 02 09:45:54 crc kubenswrapper[4781]: I1202 09:45:54.316898 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-795f4db4bc-ddg7b" event={"ID":"0b26b352-c249-411c-805f-8563bc99074a","Type":"ContainerDied","Data":"93421a2cf45d7ea85c37d00b13d45c44fc7324840cd99750e14606b07ed73812"} Dec 02 09:45:54 crc kubenswrapper[4781]: I1202 09:45:54.384581 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56b57b86b8-kf8n9" event={"ID":"22badd1d-b000-4870-86cd-1186fe2ed67d","Type":"ContainerStarted","Data":"ffba489bcf4c5dc5f768fc71cdf74d43834db93c6d3619c4e7b163a911b91cbe"} Dec 02 09:45:54 crc kubenswrapper[4781]: I1202 09:45:54.384661 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-56b57b86b8-kf8n9" Dec 02 09:45:54 crc kubenswrapper[4781]: I1202 09:45:54.384678 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-56b57b86b8-kf8n9" Dec 02 09:45:54 crc kubenswrapper[4781]: I1202 09:45:54.384687 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56b57b86b8-kf8n9" 
event={"ID":"22badd1d-b000-4870-86cd-1186fe2ed67d","Type":"ContainerStarted","Data":"a30683938bcc417eaf9b442f5163f4eddd5b0105f1c21bde059fbe53fc627c5e"} Dec 02 09:45:54 crc kubenswrapper[4781]: I1202 09:45:54.384712 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0b315a4b-f4f2-4066-a945-f168e5f76f49","Type":"ContainerStarted","Data":"bce80c8ee8ecd873b5bb1cfe04026e3371558e5130da8c9be2f4e69c218047d0"} Dec 02 09:45:54 crc kubenswrapper[4781]: I1202 09:45:54.427180 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-56b57b86b8-kf8n9" podStartSLOduration=4.427153254 podStartE2EDuration="4.427153254s" podCreationTimestamp="2025-12-02 09:45:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:45:54.41572731 +0000 UTC m=+1517.239601209" watchObservedRunningTime="2025-12-02 09:45:54.427153254 +0000 UTC m=+1517.251027133" Dec 02 09:45:54 crc kubenswrapper[4781]: I1202 09:45:54.869656 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-p6vn9" Dec 02 09:45:54 crc kubenswrapper[4781]: I1202 09:45:54.901619 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-795f4db4bc-ddg7b" Dec 02 09:45:54 crc kubenswrapper[4781]: I1202 09:45:54.963303 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/945f0634-f5c5-4471-abab-384d1c2a140c-ovsdbserver-nb\") pod \"945f0634-f5c5-4471-abab-384d1c2a140c\" (UID: \"945f0634-f5c5-4471-abab-384d1c2a140c\") " Dec 02 09:45:54 crc kubenswrapper[4781]: I1202 09:45:54.963406 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrc4h\" (UniqueName: \"kubernetes.io/projected/945f0634-f5c5-4471-abab-384d1c2a140c-kube-api-access-rrc4h\") pod \"945f0634-f5c5-4471-abab-384d1c2a140c\" (UID: \"945f0634-f5c5-4471-abab-384d1c2a140c\") " Dec 02 09:45:54 crc kubenswrapper[4781]: I1202 09:45:54.963436 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/945f0634-f5c5-4471-abab-384d1c2a140c-ovsdbserver-sb\") pod \"945f0634-f5c5-4471-abab-384d1c2a140c\" (UID: \"945f0634-f5c5-4471-abab-384d1c2a140c\") " Dec 02 09:45:54 crc kubenswrapper[4781]: I1202 09:45:54.963454 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/945f0634-f5c5-4471-abab-384d1c2a140c-dns-svc\") pod \"945f0634-f5c5-4471-abab-384d1c2a140c\" (UID: \"945f0634-f5c5-4471-abab-384d1c2a140c\") " Dec 02 09:45:54 crc kubenswrapper[4781]: I1202 09:45:54.963544 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/945f0634-f5c5-4471-abab-384d1c2a140c-config\") pod \"945f0634-f5c5-4471-abab-384d1c2a140c\" (UID: \"945f0634-f5c5-4471-abab-384d1c2a140c\") " Dec 02 09:45:54 crc kubenswrapper[4781]: I1202 09:45:54.963588 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/945f0634-f5c5-4471-abab-384d1c2a140c-dns-swift-storage-0\") pod \"945f0634-f5c5-4471-abab-384d1c2a140c\" (UID: \"945f0634-f5c5-4471-abab-384d1c2a140c\") " Dec 02 09:45:54 crc 
kubenswrapper[4781]: I1202 09:45:54.968255 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/945f0634-f5c5-4471-abab-384d1c2a140c-kube-api-access-rrc4h" (OuterVolumeSpecName: "kube-api-access-rrc4h") pod "945f0634-f5c5-4471-abab-384d1c2a140c" (UID: "945f0634-f5c5-4471-abab-384d1c2a140c"). InnerVolumeSpecName "kube-api-access-rrc4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:45:54 crc kubenswrapper[4781]: I1202 09:45:54.990428 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/945f0634-f5c5-4471-abab-384d1c2a140c-config" (OuterVolumeSpecName: "config") pod "945f0634-f5c5-4471-abab-384d1c2a140c" (UID: "945f0634-f5c5-4471-abab-384d1c2a140c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:45:54 crc kubenswrapper[4781]: I1202 09:45:54.999794 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/945f0634-f5c5-4471-abab-384d1c2a140c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "945f0634-f5c5-4471-abab-384d1c2a140c" (UID: "945f0634-f5c5-4471-abab-384d1c2a140c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:45:55 crc kubenswrapper[4781]: I1202 09:45:55.013342 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/945f0634-f5c5-4471-abab-384d1c2a140c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "945f0634-f5c5-4471-abab-384d1c2a140c" (UID: "945f0634-f5c5-4471-abab-384d1c2a140c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:45:55 crc kubenswrapper[4781]: I1202 09:45:55.029676 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/945f0634-f5c5-4471-abab-384d1c2a140c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "945f0634-f5c5-4471-abab-384d1c2a140c" (UID: "945f0634-f5c5-4471-abab-384d1c2a140c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:45:55 crc kubenswrapper[4781]: I1202 09:45:55.040854 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/945f0634-f5c5-4471-abab-384d1c2a140c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "945f0634-f5c5-4471-abab-384d1c2a140c" (UID: "945f0634-f5c5-4471-abab-384d1c2a140c"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:45:55 crc kubenswrapper[4781]: I1202 09:45:55.078897 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0b26b352-c249-411c-805f-8563bc99074a-ovsdbserver-nb\") pod \"0b26b352-c249-411c-805f-8563bc99074a\" (UID: \"0b26b352-c249-411c-805f-8563bc99074a\") " Dec 02 09:45:55 crc kubenswrapper[4781]: I1202 09:45:55.079200 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0b26b352-c249-411c-805f-8563bc99074a-dns-swift-storage-0\") pod \"0b26b352-c249-411c-805f-8563bc99074a\" (UID: \"0b26b352-c249-411c-805f-8563bc99074a\") " Dec 02 09:45:55 crc kubenswrapper[4781]: I1202 09:45:55.079243 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0b26b352-c249-411c-805f-8563bc99074a-ovsdbserver-sb\") pod \"0b26b352-c249-411c-805f-8563bc99074a\" (UID: \"0b26b352-c249-411c-805f-8563bc99074a\") " Dec 02 09:45:55 crc kubenswrapper[4781]: I1202 09:45:55.079389 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x659g\" (UniqueName: \"kubernetes.io/projected/0b26b352-c249-411c-805f-8563bc99074a-kube-api-access-x659g\") pod \"0b26b352-c249-411c-805f-8563bc99074a\" (UID: \"0b26b352-c249-411c-805f-8563bc99074a\") " Dec 02 09:45:55 crc kubenswrapper[4781]: I1202 09:45:55.079413 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b26b352-c249-411c-805f-8563bc99074a-dns-svc\") pod \"0b26b352-c249-411c-805f-8563bc99074a\" (UID: \"0b26b352-c249-411c-805f-8563bc99074a\") " Dec 02 09:45:55 crc kubenswrapper[4781]: I1202 09:45:55.079455 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b26b352-c249-411c-805f-8563bc99074a-config\") pod \"0b26b352-c249-411c-805f-8563bc99074a\" (UID: \"0b26b352-c249-411c-805f-8563bc99074a\") " Dec 02 09:45:55 crc kubenswrapper[4781]: I1202 09:45:55.080299 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/945f0634-f5c5-4471-abab-384d1c2a140c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:55 crc kubenswrapper[4781]: I1202 09:45:55.080324 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrc4h\" (UniqueName: \"kubernetes.io/projected/945f0634-f5c5-4471-abab-384d1c2a140c-kube-api-access-rrc4h\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:55 crc kubenswrapper[4781]: I1202 09:45:55.080336 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/945f0634-f5c5-4471-abab-384d1c2a140c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:55 crc kubenswrapper[4781]: I1202 09:45:55.080345 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/945f0634-f5c5-4471-abab-384d1c2a140c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:55 crc kubenswrapper[4781]: I1202 09:45:55.080354 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/945f0634-f5c5-4471-abab-384d1c2a140c-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:55 crc kubenswrapper[4781]: I1202 
09:45:55.080364 4781 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/945f0634-f5c5-4471-abab-384d1c2a140c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:55 crc kubenswrapper[4781]: I1202 09:45:55.087758 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b26b352-c249-411c-805f-8563bc99074a-kube-api-access-x659g" (OuterVolumeSpecName: "kube-api-access-x659g") pod "0b26b352-c249-411c-805f-8563bc99074a" (UID: "0b26b352-c249-411c-805f-8563bc99074a"). InnerVolumeSpecName "kube-api-access-x659g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:45:55 crc kubenswrapper[4781]: I1202 09:45:55.110433 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b26b352-c249-411c-805f-8563bc99074a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0b26b352-c249-411c-805f-8563bc99074a" (UID: "0b26b352-c249-411c-805f-8563bc99074a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:45:55 crc kubenswrapper[4781]: I1202 09:45:55.119357 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b26b352-c249-411c-805f-8563bc99074a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0b26b352-c249-411c-805f-8563bc99074a" (UID: "0b26b352-c249-411c-805f-8563bc99074a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:45:55 crc kubenswrapper[4781]: I1202 09:45:55.123492 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b26b352-c249-411c-805f-8563bc99074a-config" (OuterVolumeSpecName: "config") pod "0b26b352-c249-411c-805f-8563bc99074a" (UID: "0b26b352-c249-411c-805f-8563bc99074a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:45:55 crc kubenswrapper[4781]: I1202 09:45:55.131152 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b26b352-c249-411c-805f-8563bc99074a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0b26b352-c249-411c-805f-8563bc99074a" (UID: "0b26b352-c249-411c-805f-8563bc99074a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:45:55 crc kubenswrapper[4781]: I1202 09:45:55.146537 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b26b352-c249-411c-805f-8563bc99074a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0b26b352-c249-411c-805f-8563bc99074a" (UID: "0b26b352-c249-411c-805f-8563bc99074a"). InnerVolumeSpecName "dns-svc". 
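The UnmountVolume/TearDown entries above are the reverse path: when the old dnsmasq pods are deleted, each volume is torn down first, and only afterwards does the reconciler record it as detached (the reconciler_common.go:293 "Volume detached" lines with an empty DevicePath). A compact sketch of that ordering, using a plain map as an illustrative stand-in for the kubelet's actual-state-of-world cache:

    package main

    import "fmt"

    // unmountAll sketches the teardown ordering visible above: UnmountVolume
    // starts, TearDown succeeds, and only then is the volume reported
    // detached. Names here are illustrative, not the kubelet's real API.
    func unmountAll(mounted map[string]bool, volumes []string) {
    	for _, name := range volumes {
    		fmt.Printf("operationExecutor.UnmountVolume started for volume %q\n", name)
    		fmt.Printf("UnmountVolume.TearDown succeeded for volume %q\n", name)
    		delete(mounted, name) // state updated only after a successful TearDown
    		fmt.Printf("Volume detached for volume %q on node %q DevicePath %q\n", name, "crc", "")
    	}
    }

    func main() {
    	mounted := map[string]bool{"dns-svc": true, "config": true}
    	unmountAll(mounted, []string{"dns-svc", "config"})
    }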
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:45:55 crc kubenswrapper[4781]: I1202 09:45:55.182080 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0b26b352-c249-411c-805f-8563bc99074a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:55 crc kubenswrapper[4781]: I1202 09:45:55.182112 4781 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0b26b352-c249-411c-805f-8563bc99074a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:55 crc kubenswrapper[4781]: I1202 09:45:55.182121 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0b26b352-c249-411c-805f-8563bc99074a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:55 crc kubenswrapper[4781]: I1202 09:45:55.182129 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x659g\" (UniqueName: \"kubernetes.io/projected/0b26b352-c249-411c-805f-8563bc99074a-kube-api-access-x659g\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:55 crc kubenswrapper[4781]: I1202 09:45:55.182140 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b26b352-c249-411c-805f-8563bc99074a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:55 crc kubenswrapper[4781]: I1202 09:45:55.182148 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b26b352-c249-411c-805f-8563bc99074a-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:55 crc kubenswrapper[4781]: I1202 09:45:55.404335 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77d7d4d6c6-nc7g8" event={"ID":"1b2440b7-d6a5-426e-ac38-2f5fd531935b","Type":"ContainerStarted","Data":"9053393ea9f95fe2fac898b4da31f97b12804d5409b12558ee2677ee81437aec"} Dec 02 09:45:55 crc kubenswrapper[4781]: I1202 09:45:55.405454 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-77d7d4d6c6-nc7g8" Dec 02 09:45:55 crc kubenswrapper[4781]: I1202 09:45:55.408952 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-p6vn9" Dec 02 09:45:55 crc kubenswrapper[4781]: I1202 09:45:55.408966 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-p6vn9" event={"ID":"945f0634-f5c5-4471-abab-384d1c2a140c","Type":"ContainerDied","Data":"9212d7d8ff55ba929c2e769cabdfc04fd8a31b0941647d2b79da1caef8fbedc8"} Dec 02 09:45:55 crc kubenswrapper[4781]: I1202 09:45:55.409008 4781 scope.go:117] "RemoveContainer" containerID="dfd518dbd8a81595dd86b70d2ea5083d61f7da85873d40d99b771e9627aa115d" Dec 02 09:45:55 crc kubenswrapper[4781]: I1202 09:45:55.420437 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-lqtxt" event={"ID":"4007cb57-06af-4119-bf9e-a2e2aa8509cd","Type":"ContainerStarted","Data":"f541f64d2be699a6fa47b38e1efc60aa3cc5fc6eeda0d14bf712ec07ee964e40"} Dec 02 09:45:55 crc kubenswrapper[4781]: I1202 09:45:55.420750 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-lqtxt" Dec 02 09:45:55 crc kubenswrapper[4781]: I1202 09:45:55.424291 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-795f4db4bc-ddg7b" Dec 02 09:45:55 crc kubenswrapper[4781]: I1202 09:45:55.425177 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-795f4db4bc-ddg7b" event={"ID":"0b26b352-c249-411c-805f-8563bc99074a","Type":"ContainerDied","Data":"80e252f45073e012320cb6f6c16aa8413825ddb931b440b908cbe8913d0b01eb"} Dec 02 09:45:55 crc kubenswrapper[4781]: I1202 09:45:55.428451 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-77d7d4d6c6-nc7g8" podStartSLOduration=3.428432808 podStartE2EDuration="3.428432808s" podCreationTimestamp="2025-12-02 09:45:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:45:55.422766797 +0000 UTC m=+1518.246640676" watchObservedRunningTime="2025-12-02 09:45:55.428432808 +0000 UTC m=+1518.252306687" Dec 02 09:45:55 crc kubenswrapper[4781]: I1202 09:45:55.442817 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="0b315a4b-f4f2-4066-a945-f168e5f76f49" containerName="cinder-api-log" containerID="cri-o://bce80c8ee8ecd873b5bb1cfe04026e3371558e5130da8c9be2f4e69c218047d0" gracePeriod=30 Dec 02 09:45:55 crc kubenswrapper[4781]: I1202 09:45:55.443083 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0b315a4b-f4f2-4066-a945-f168e5f76f49","Type":"ContainerStarted","Data":"f80180dae2516740b30eb1766347514f955f8f0a5ea9546d1800e493d1735c9b"} Dec 02 09:45:55 crc kubenswrapper[4781]: I1202 09:45:55.443131 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 02 09:45:55 crc kubenswrapper[4781]: I1202 09:45:55.443158 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="0b315a4b-f4f2-4066-a945-f168e5f76f49" containerName="cinder-api" containerID="cri-o://f80180dae2516740b30eb1766347514f955f8f0a5ea9546d1800e493d1735c9b" gracePeriod=30 Dec 02 09:45:55 crc kubenswrapper[4781]: I1202 09:45:55.453963 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-lqtxt" podStartSLOduration=3.453940626 podStartE2EDuration="3.453940626s" podCreationTimestamp="2025-12-02 09:45:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:45:55.453552226 +0000 UTC m=+1518.277426105" watchObservedRunningTime="2025-12-02 09:45:55.453940626 +0000 UTC m=+1518.277814505" Dec 02 09:45:55 crc kubenswrapper[4781]: I1202 09:45:55.506735 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.506718599 podStartE2EDuration="4.506718599s" podCreationTimestamp="2025-12-02 09:45:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:45:55.485286099 +0000 UTC m=+1518.309159988" watchObservedRunningTime="2025-12-02 09:45:55.506718599 +0000 UTC m=+1518.330592478" Dec 02 09:45:55 crc kubenswrapper[4781]: I1202 09:45:55.588010 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-795f4db4bc-ddg7b"] Dec 02 09:45:55 crc kubenswrapper[4781]: I1202 09:45:55.602326 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-795f4db4bc-ddg7b"] Dec 02 
09:45:55 crc kubenswrapper[4781]: I1202 09:45:55.661995 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-p6vn9"] Dec 02 09:45:55 crc kubenswrapper[4781]: I1202 09:45:55.674393 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-p6vn9"] Dec 02 09:45:56 crc kubenswrapper[4781]: I1202 09:45:56.345704 4781 scope.go:117] "RemoveContainer" containerID="93421a2cf45d7ea85c37d00b13d45c44fc7324840cd99750e14606b07ed73812" Dec 02 09:45:56 crc kubenswrapper[4781]: I1202 09:45:56.463232 4781 generic.go:334] "Generic (PLEG): container finished" podID="0b315a4b-f4f2-4066-a945-f168e5f76f49" containerID="f80180dae2516740b30eb1766347514f955f8f0a5ea9546d1800e493d1735c9b" exitCode=0 Dec 02 09:45:56 crc kubenswrapper[4781]: I1202 09:45:56.463287 4781 generic.go:334] "Generic (PLEG): container finished" podID="0b315a4b-f4f2-4066-a945-f168e5f76f49" containerID="bce80c8ee8ecd873b5bb1cfe04026e3371558e5130da8c9be2f4e69c218047d0" exitCode=143 Dec 02 09:45:56 crc kubenswrapper[4781]: I1202 09:45:56.463363 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0b315a4b-f4f2-4066-a945-f168e5f76f49","Type":"ContainerDied","Data":"f80180dae2516740b30eb1766347514f955f8f0a5ea9546d1800e493d1735c9b"} Dec 02 09:45:56 crc kubenswrapper[4781]: I1202 09:45:56.463391 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0b315a4b-f4f2-4066-a945-f168e5f76f49","Type":"ContainerDied","Data":"bce80c8ee8ecd873b5bb1cfe04026e3371558e5130da8c9be2f4e69c218047d0"} Dec 02 09:45:56 crc kubenswrapper[4781]: I1202 09:45:56.477942 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2927c2fe-9023-4be6-a44a-6701f24c499a","Type":"ContainerStarted","Data":"b96f5c817e9dd75d8e2b5ac9940d95cebe28fb7655e48cf280ad05627084a444"} Dec 02 09:45:56 crc kubenswrapper[4781]: I1202 09:45:56.742027 4781 util.go:48] "No ready sandbox for pod can be found. 
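The cinder-api-0 exit codes above follow the usual 128+signal convention: the cinder-api container shut down cleanly within its 30-second grace period (exitCode=0), while cinder-api-log was terminated by the SIGTERM sent at kill time and reports exitCode=143. A quick check of that arithmetic:

    package main

    import (
    	"fmt"
    	"syscall"
    )

    func main() {
    	// Containers killed by a signal conventionally report 128 + signo,
    	// so SIGTERM (15) shows up in the log above as exitCode=143.
    	fmt.Println(128 + int(syscall.SIGTERM)) // 143
    }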
Need to start a new one" pod="openstack/cinder-api-0" Dec 02 09:45:56 crc kubenswrapper[4781]: I1202 09:45:56.931751 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0b315a4b-f4f2-4066-a945-f168e5f76f49-etc-machine-id\") pod \"0b315a4b-f4f2-4066-a945-f168e5f76f49\" (UID: \"0b315a4b-f4f2-4066-a945-f168e5f76f49\") " Dec 02 09:45:56 crc kubenswrapper[4781]: I1202 09:45:56.932075 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fqsl\" (UniqueName: \"kubernetes.io/projected/0b315a4b-f4f2-4066-a945-f168e5f76f49-kube-api-access-6fqsl\") pod \"0b315a4b-f4f2-4066-a945-f168e5f76f49\" (UID: \"0b315a4b-f4f2-4066-a945-f168e5f76f49\") " Dec 02 09:45:56 crc kubenswrapper[4781]: I1202 09:45:56.932103 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b315a4b-f4f2-4066-a945-f168e5f76f49-config-data\") pod \"0b315a4b-f4f2-4066-a945-f168e5f76f49\" (UID: \"0b315a4b-f4f2-4066-a945-f168e5f76f49\") " Dec 02 09:45:56 crc kubenswrapper[4781]: I1202 09:45:56.932220 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0b315a4b-f4f2-4066-a945-f168e5f76f49-config-data-custom\") pod \"0b315a4b-f4f2-4066-a945-f168e5f76f49\" (UID: \"0b315a4b-f4f2-4066-a945-f168e5f76f49\") " Dec 02 09:45:56 crc kubenswrapper[4781]: I1202 09:45:56.932339 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b315a4b-f4f2-4066-a945-f168e5f76f49-combined-ca-bundle\") pod \"0b315a4b-f4f2-4066-a945-f168e5f76f49\" (UID: \"0b315a4b-f4f2-4066-a945-f168e5f76f49\") " Dec 02 09:45:56 crc kubenswrapper[4781]: I1202 09:45:56.932368 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b315a4b-f4f2-4066-a945-f168e5f76f49-logs\") pod \"0b315a4b-f4f2-4066-a945-f168e5f76f49\" (UID: \"0b315a4b-f4f2-4066-a945-f168e5f76f49\") " Dec 02 09:45:56 crc kubenswrapper[4781]: I1202 09:45:56.932465 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b315a4b-f4f2-4066-a945-f168e5f76f49-scripts\") pod \"0b315a4b-f4f2-4066-a945-f168e5f76f49\" (UID: \"0b315a4b-f4f2-4066-a945-f168e5f76f49\") " Dec 02 09:45:56 crc kubenswrapper[4781]: I1202 09:45:56.933234 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b315a4b-f4f2-4066-a945-f168e5f76f49-logs" (OuterVolumeSpecName: "logs") pod "0b315a4b-f4f2-4066-a945-f168e5f76f49" (UID: "0b315a4b-f4f2-4066-a945-f168e5f76f49"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:45:56 crc kubenswrapper[4781]: I1202 09:45:56.933538 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0b315a4b-f4f2-4066-a945-f168e5f76f49-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "0b315a4b-f4f2-4066-a945-f168e5f76f49" (UID: "0b315a4b-f4f2-4066-a945-f168e5f76f49"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 09:45:56 crc kubenswrapper[4781]: I1202 09:45:56.940073 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b315a4b-f4f2-4066-a945-f168e5f76f49-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0b315a4b-f4f2-4066-a945-f168e5f76f49" (UID: "0b315a4b-f4f2-4066-a945-f168e5f76f49"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:45:56 crc kubenswrapper[4781]: I1202 09:45:56.942652 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b315a4b-f4f2-4066-a945-f168e5f76f49-scripts" (OuterVolumeSpecName: "scripts") pod "0b315a4b-f4f2-4066-a945-f168e5f76f49" (UID: "0b315a4b-f4f2-4066-a945-f168e5f76f49"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:45:56 crc kubenswrapper[4781]: I1202 09:45:56.943122 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b315a4b-f4f2-4066-a945-f168e5f76f49-kube-api-access-6fqsl" (OuterVolumeSpecName: "kube-api-access-6fqsl") pod "0b315a4b-f4f2-4066-a945-f168e5f76f49" (UID: "0b315a4b-f4f2-4066-a945-f168e5f76f49"). InnerVolumeSpecName "kube-api-access-6fqsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:45:56 crc kubenswrapper[4781]: I1202 09:45:56.979408 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b315a4b-f4f2-4066-a945-f168e5f76f49-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0b315a4b-f4f2-4066-a945-f168e5f76f49" (UID: "0b315a4b-f4f2-4066-a945-f168e5f76f49"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.008721 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b315a4b-f4f2-4066-a945-f168e5f76f49-config-data" (OuterVolumeSpecName: "config-data") pod "0b315a4b-f4f2-4066-a945-f168e5f76f49" (UID: "0b315a4b-f4f2-4066-a945-f168e5f76f49"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.035513 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b315a4b-f4f2-4066-a945-f168e5f76f49-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.035550 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b315a4b-f4f2-4066-a945-f168e5f76f49-logs\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.035560 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b315a4b-f4f2-4066-a945-f168e5f76f49-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.035570 4781 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0b315a4b-f4f2-4066-a945-f168e5f76f49-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.035580 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fqsl\" (UniqueName: \"kubernetes.io/projected/0b315a4b-f4f2-4066-a945-f168e5f76f49-kube-api-access-6fqsl\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.035591 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b315a4b-f4f2-4066-a945-f168e5f76f49-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.035599 4781 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0b315a4b-f4f2-4066-a945-f168e5f76f49-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.227231 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.227624 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="269490ed-f426-4577-b60d-695881e29aac" containerName="glance-log" containerID="cri-o://e0960176bdda8e2ba82d90641ba566c3b6c5c7a6ec61704be70fbd7a1daf1d81" gracePeriod=30 Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.227827 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="269490ed-f426-4577-b60d-695881e29aac" containerName="glance-httpd" containerID="cri-o://a974bce8f6f1af663c342580c805ddbaa6097b40912021df63435e438216c135" gracePeriod=30 Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.499000 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-6q5kh"] Dec 02 09:45:57 crc kubenswrapper[4781]: E1202 09:45:57.500631 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="945f0634-f5c5-4471-abab-384d1c2a140c" containerName="init" Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.500740 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="945f0634-f5c5-4471-abab-384d1c2a140c" containerName="init" Dec 02 09:45:57 crc kubenswrapper[4781]: E1202 09:45:57.500838 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b315a4b-f4f2-4066-a945-f168e5f76f49" containerName="cinder-api" Dec 02 09:45:57 crc 
kubenswrapper[4781]: I1202 09:45:57.500916 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b315a4b-f4f2-4066-a945-f168e5f76f49" containerName="cinder-api" Dec 02 09:45:57 crc kubenswrapper[4781]: E1202 09:45:57.501090 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b315a4b-f4f2-4066-a945-f168e5f76f49" containerName="cinder-api-log" Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.501192 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b315a4b-f4f2-4066-a945-f168e5f76f49" containerName="cinder-api-log" Dec 02 09:45:57 crc kubenswrapper[4781]: E1202 09:45:57.501288 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b26b352-c249-411c-805f-8563bc99074a" containerName="init" Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.501360 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b26b352-c249-411c-805f-8563bc99074a" containerName="init" Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.501674 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b315a4b-f4f2-4066-a945-f168e5f76f49" containerName="cinder-api-log" Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.501766 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b315a4b-f4f2-4066-a945-f168e5f76f49" containerName="cinder-api" Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.501855 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b26b352-c249-411c-805f-8563bc99074a" containerName="init" Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.501957 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="945f0634-f5c5-4471-abab-384d1c2a140c" containerName="init" Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.503280 4781 util.go:30] "No sandbox for pod can be found. 
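The cpu_manager/memory_manager RemoveStaleState entries above show the kubelet's resource managers dropping per-container assignments belonging to pods that no longer exist (the old dnsmasq init containers and the deleted cinder-api-0), so those resources can be reassigned. A rough sketch of that cleanup, with illustrative types rather than the managers' real state structs:

    package main

    import "fmt"

    type key struct{ podUID, container string }

    // removeStaleState drops assignments for any pod that is no longer
    // active, mirroring the "RemoveStaleState: removing container" and
    // "Deleted CPUSet assignment" entries above.
    func removeStaleState(assignments map[key]string, activePods map[string]bool) {
    	for k := range assignments {
    		if !activePods[k.podUID] {
    			fmt.Printf("RemoveStaleState: removing container %q of pod %q\n", k.container, k.podUID)
    			delete(assignments, k)
    		}
    	}
    }

    func main() {
    	a := map[key]string{
    		{"0b315a4b-f4f2-4066-a945-f168e5f76f49", "cinder-api"}: "cpuset:0-3",
    	}
    	removeStaleState(a, map[string]bool{}) // pod was deleted, state dropped
    	fmt.Println(len(a))                    // 0
    }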
Need to start a new one" pod="openstack/nova-api-db-create-6q5kh" Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.525356 4781 generic.go:334] "Generic (PLEG): container finished" podID="269490ed-f426-4577-b60d-695881e29aac" containerID="e0960176bdda8e2ba82d90641ba566c3b6c5c7a6ec61704be70fbd7a1daf1d81" exitCode=143 Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.588596 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b26b352-c249-411c-805f-8563bc99074a" path="/var/lib/kubelet/pods/0b26b352-c249-411c-805f-8563bc99074a/volumes" Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.596564 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="945f0634-f5c5-4471-abab-384d1c2a140c" path="/var/lib/kubelet/pods/945f0634-f5c5-4471-abab-384d1c2a140c/volumes" Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.597312 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"269490ed-f426-4577-b60d-695881e29aac","Type":"ContainerDied","Data":"e0960176bdda8e2ba82d90641ba566c3b6c5c7a6ec61704be70fbd7a1daf1d81"} Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.606065 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2927c2fe-9023-4be6-a44a-6701f24c499a","Type":"ContainerStarted","Data":"271eaafcaef3b1e096ae33b029d7ad759e5eca9f1a9a61fd5beeda21269a5651"} Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.606133 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-6q5kh"] Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.606157 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"7a424309-12e4-42f9-ba35-d61f1f6c7b44","Type":"ContainerStarted","Data":"0e4650d083955466d18b626e4be3c0ac3e90df298ae06bd450e7bb60ffd182af"} Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.606169 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-65bbfbf867-4wxpf" event={"ID":"bac97a41-a2f4-46ee-b48c-216aeee03abc","Type":"ContainerStarted","Data":"6f506b51202f6ca6d497cdfb7c233a111612ba10e1cf91d209dc29fa5548cf1a"} Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.606181 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-65bbfbf867-4wxpf" event={"ID":"bac97a41-a2f4-46ee-b48c-216aeee03abc","Type":"ContainerStarted","Data":"4c7b0fa500d297a53b8e565892f48c0e0baea9d68d1732d019d8af0d7f4548ee"} Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.613587 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.613891 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0b315a4b-f4f2-4066-a945-f168e5f76f49","Type":"ContainerDied","Data":"6d86653b82b2c928643a0fa7db72afdc6b980f9fc09bcce930097858dacb94e4"} Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.614140 4781 scope.go:117] "RemoveContainer" containerID="f80180dae2516740b30eb1766347514f955f8f0a5ea9546d1800e493d1735c9b" Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.616109 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-dd8rn"] Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.618847 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-dd8rn" Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.647701 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5d776c757d-qgkmw" event={"ID":"f86e03c1-1ef5-4d0a-9875-7bd123ebe7a1","Type":"ContainerStarted","Data":"fd8fb06e13b36a35d2dd4d45e07817f87086555f70635737c70229909ea855c9"} Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.648102 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5d776c757d-qgkmw" event={"ID":"f86e03c1-1ef5-4d0a-9875-7bd123ebe7a1","Type":"ContainerStarted","Data":"4d112655da2a2cbffdd766bc30cad1c1df32263e497e2a70293b80cc93241302"} Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.653853 4781 scope.go:117] "RemoveContainer" containerID="bce80c8ee8ecd873b5bb1cfe04026e3371558e5130da8c9be2f4e69c218047d0" Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.654799 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6140504f-30c8-4ec8-9659-fc4f2795c6a2-operator-scripts\") pod \"nova-api-db-create-6q5kh\" (UID: \"6140504f-30c8-4ec8-9659-fc4f2795c6a2\") " pod="openstack/nova-api-db-create-6q5kh" Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.655202 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkl9w\" (UniqueName: \"kubernetes.io/projected/6140504f-30c8-4ec8-9659-fc4f2795c6a2-kube-api-access-hkl9w\") pod \"nova-api-db-create-6q5kh\" (UID: \"6140504f-30c8-4ec8-9659-fc4f2795c6a2\") " pod="openstack/nova-api-db-create-6q5kh" Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.681003 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-8af6-account-create-update-8c6xl"] Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.682236 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-8af6-account-create-update-8c6xl" Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.684621 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.725587 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-dd8rn"] Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.746733 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-8af6-account-create-update-8c6xl"] Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.764864 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkl9w\" (UniqueName: \"kubernetes.io/projected/6140504f-30c8-4ec8-9659-fc4f2795c6a2-kube-api-access-hkl9w\") pod \"nova-api-db-create-6q5kh\" (UID: \"6140504f-30c8-4ec8-9659-fc4f2795c6a2\") " pod="openstack/nova-api-db-create-6q5kh" Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.765038 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2718c51-467e-407a-99b1-266956fcacfa-operator-scripts\") pod \"nova-cell0-db-create-dd8rn\" (UID: \"e2718c51-467e-407a-99b1-266956fcacfa\") " pod="openstack/nova-cell0-db-create-dd8rn" Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.765152 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqn2x\" (UniqueName: \"kubernetes.io/projected/e2718c51-467e-407a-99b1-266956fcacfa-kube-api-access-wqn2x\") pod \"nova-cell0-db-create-dd8rn\" (UID: \"e2718c51-467e-407a-99b1-266956fcacfa\") " pod="openstack/nova-cell0-db-create-dd8rn" Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.765173 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6140504f-30c8-4ec8-9659-fc4f2795c6a2-operator-scripts\") pod \"nova-api-db-create-6q5kh\" (UID: \"6140504f-30c8-4ec8-9659-fc4f2795c6a2\") " pod="openstack/nova-api-db-create-6q5kh" Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.767792 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6140504f-30c8-4ec8-9659-fc4f2795c6a2-operator-scripts\") pod \"nova-api-db-create-6q5kh\" (UID: \"6140504f-30c8-4ec8-9659-fc4f2795c6a2\") " pod="openstack/nova-api-db-create-6q5kh" Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.791969 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.252820013 podStartE2EDuration="37.791947114s" podCreationTimestamp="2025-12-02 09:45:20 +0000 UTC" firstStartedPulling="2025-12-02 09:45:20.938975366 +0000 UTC m=+1483.762849245" lastFinishedPulling="2025-12-02 09:45:56.478102467 +0000 UTC m=+1519.301976346" observedRunningTime="2025-12-02 09:45:57.67300408 +0000 UTC m=+1520.496877979" watchObservedRunningTime="2025-12-02 09:45:57.791947114 +0000 UTC m=+1520.615820993" Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.792031 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkl9w\" (UniqueName: \"kubernetes.io/projected/6140504f-30c8-4ec8-9659-fc4f2795c6a2-kube-api-access-hkl9w\") pod \"nova-api-db-create-6q5kh\" (UID: \"6140504f-30c8-4ec8-9659-fc4f2795c6a2\") " 
pod="openstack/nova-api-db-create-6q5kh" Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.804216 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5d776c757d-qgkmw" podStartSLOduration=2.989637909 podStartE2EDuration="7.804195579s" podCreationTimestamp="2025-12-02 09:45:50 +0000 UTC" firstStartedPulling="2025-12-02 09:45:51.545671151 +0000 UTC m=+1514.369545020" lastFinishedPulling="2025-12-02 09:45:56.360228811 +0000 UTC m=+1519.184102690" observedRunningTime="2025-12-02 09:45:57.701309762 +0000 UTC m=+1520.525183641" watchObservedRunningTime="2025-12-02 09:45:57.804195579 +0000 UTC m=+1520.628069458" Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.831717 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-6q5kh" Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.851451 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-65bbfbf867-4wxpf" podStartSLOduration=3.029808737 podStartE2EDuration="7.851426255s" podCreationTimestamp="2025-12-02 09:45:50 +0000 UTC" firstStartedPulling="2025-12-02 09:45:51.540284378 +0000 UTC m=+1514.364158257" lastFinishedPulling="2025-12-02 09:45:56.361901896 +0000 UTC m=+1519.185775775" observedRunningTime="2025-12-02 09:45:57.741254534 +0000 UTC m=+1520.565128413" watchObservedRunningTime="2025-12-02 09:45:57.851426255 +0000 UTC m=+1520.675300144" Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.852734 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.7552507 podStartE2EDuration="6.85272494s" podCreationTimestamp="2025-12-02 09:45:51 +0000 UTC" firstStartedPulling="2025-12-02 09:45:52.589609358 +0000 UTC m=+1515.413483227" lastFinishedPulling="2025-12-02 09:45:54.687083588 +0000 UTC m=+1517.510957467" observedRunningTime="2025-12-02 09:45:57.76626214 +0000 UTC m=+1520.590136019" watchObservedRunningTime="2025-12-02 09:45:57.85272494 +0000 UTC m=+1520.676598819" Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.866332 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2718c51-467e-407a-99b1-266956fcacfa-operator-scripts\") pod \"nova-cell0-db-create-dd8rn\" (UID: \"e2718c51-467e-407a-99b1-266956fcacfa\") " pod="openstack/nova-cell0-db-create-dd8rn" Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.866375 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq959\" (UniqueName: \"kubernetes.io/projected/94a18aa9-0687-4427-942c-d343a3f0c5f4-kube-api-access-pq959\") pod \"nova-api-8af6-account-create-update-8c6xl\" (UID: \"94a18aa9-0687-4427-942c-d343a3f0c5f4\") " pod="openstack/nova-api-8af6-account-create-update-8c6xl" Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.866462 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqn2x\" (UniqueName: \"kubernetes.io/projected/e2718c51-467e-407a-99b1-266956fcacfa-kube-api-access-wqn2x\") pod \"nova-cell0-db-create-dd8rn\" (UID: \"e2718c51-467e-407a-99b1-266956fcacfa\") " pod="openstack/nova-cell0-db-create-dd8rn" Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.866548 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/94a18aa9-0687-4427-942c-d343a3f0c5f4-operator-scripts\") pod \"nova-api-8af6-account-create-update-8c6xl\" (UID: \"94a18aa9-0687-4427-942c-d343a3f0c5f4\") " pod="openstack/nova-api-8af6-account-create-update-8c6xl" Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.867333 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2718c51-467e-407a-99b1-266956fcacfa-operator-scripts\") pod \"nova-cell0-db-create-dd8rn\" (UID: \"e2718c51-467e-407a-99b1-266956fcacfa\") " pod="openstack/nova-cell0-db-create-dd8rn" Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.899300 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqn2x\" (UniqueName: \"kubernetes.io/projected/e2718c51-467e-407a-99b1-266956fcacfa-kube-api-access-wqn2x\") pod \"nova-cell0-db-create-dd8rn\" (UID: \"e2718c51-467e-407a-99b1-266956fcacfa\") " pod="openstack/nova-cell0-db-create-dd8rn" Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.971076 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq959\" (UniqueName: \"kubernetes.io/projected/94a18aa9-0687-4427-942c-d343a3f0c5f4-kube-api-access-pq959\") pod \"nova-api-8af6-account-create-update-8c6xl\" (UID: \"94a18aa9-0687-4427-942c-d343a3f0c5f4\") " pod="openstack/nova-api-8af6-account-create-update-8c6xl" Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.971247 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94a18aa9-0687-4427-942c-d343a3f0c5f4-operator-scripts\") pod \"nova-api-8af6-account-create-update-8c6xl\" (UID: \"94a18aa9-0687-4427-942c-d343a3f0c5f4\") " pod="openstack/nova-api-8af6-account-create-update-8c6xl" Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.971908 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94a18aa9-0687-4427-942c-d343a3f0c5f4-operator-scripts\") pod \"nova-api-8af6-account-create-update-8c6xl\" (UID: \"94a18aa9-0687-4427-942c-d343a3f0c5f4\") " pod="openstack/nova-api-8af6-account-create-update-8c6xl" Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.974140 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-dd8rn" Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.974644 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.988421 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-7h6r2"] Dec 02 09:45:57 crc kubenswrapper[4781]: I1202 09:45:57.989636 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-7h6r2" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.002107 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.019996 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-1694-account-create-update-9dbhz"] Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.022555 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-1694-account-create-update-9dbhz" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.030113 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.033656 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-7h6r2"] Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.034984 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq959\" (UniqueName: \"kubernetes.io/projected/94a18aa9-0687-4427-942c-d343a3f0c5f4-kube-api-access-pq959\") pod \"nova-api-8af6-account-create-update-8c6xl\" (UID: \"94a18aa9-0687-4427-942c-d343a3f0c5f4\") " pod="openstack/nova-api-8af6-account-create-update-8c6xl" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.045199 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-1694-account-create-update-9dbhz"] Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.093093 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.096030 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.102426 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.102544 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.102766 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.132123 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.150980 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-5a93-account-create-update-48m59"] Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.175132 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-5a93-account-create-update-48m59" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.183247 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.189551 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1bda83d-ff07-4b0c-a978-a22373590d0f-operator-scripts\") pod \"nova-cell0-1694-account-create-update-9dbhz\" (UID: \"a1bda83d-ff07-4b0c-a978-a22373590d0f\") " pod="openstack/nova-cell0-1694-account-create-update-9dbhz" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.189728 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a89db488-2448-4a65-82ee-08472553ad38-operator-scripts\") pod \"nova-cell1-db-create-7h6r2\" (UID: \"a89db488-2448-4a65-82ee-08472553ad38\") " pod="openstack/nova-cell1-db-create-7h6r2" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.189766 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2dlb\" (UniqueName: \"kubernetes.io/projected/a89db488-2448-4a65-82ee-08472553ad38-kube-api-access-q2dlb\") pod \"nova-cell1-db-create-7h6r2\" (UID: \"a89db488-2448-4a65-82ee-08472553ad38\") " pod="openstack/nova-cell1-db-create-7h6r2" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.189936 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nprzk\" (UniqueName: \"kubernetes.io/projected/a1bda83d-ff07-4b0c-a978-a22373590d0f-kube-api-access-nprzk\") pod \"nova-cell0-1694-account-create-update-9dbhz\" (UID: \"a1bda83d-ff07-4b0c-a978-a22373590d0f\") " pod="openstack/nova-cell0-1694-account-create-update-9dbhz" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.222868 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-5a93-account-create-update-48m59"] Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.296662 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e2d3890-4772-476e-9850-fdb32111b87a-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"4e2d3890-4772-476e-9850-fdb32111b87a\") " pod="openstack/cinder-api-0" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.296753 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1bda83d-ff07-4b0c-a978-a22373590d0f-operator-scripts\") pod \"nova-cell0-1694-account-create-update-9dbhz\" (UID: \"a1bda83d-ff07-4b0c-a978-a22373590d0f\") " pod="openstack/nova-cell0-1694-account-create-update-9dbhz" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.296834 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4e2d3890-4772-476e-9850-fdb32111b87a-config-data-custom\") pod \"cinder-api-0\" (UID: \"4e2d3890-4772-476e-9850-fdb32111b87a\") " pod="openstack/cinder-api-0" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.296964 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4e2d3890-4772-476e-9850-fdb32111b87a-config-data\") pod \"cinder-api-0\" (UID: \"4e2d3890-4772-476e-9850-fdb32111b87a\") " pod="openstack/cinder-api-0" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.297056 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a89db488-2448-4a65-82ee-08472553ad38-operator-scripts\") pod \"nova-cell1-db-create-7h6r2\" (UID: \"a89db488-2448-4a65-82ee-08472553ad38\") " pod="openstack/nova-cell1-db-create-7h6r2" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.297119 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2dlb\" (UniqueName: \"kubernetes.io/projected/a89db488-2448-4a65-82ee-08472553ad38-kube-api-access-q2dlb\") pod \"nova-cell1-db-create-7h6r2\" (UID: \"a89db488-2448-4a65-82ee-08472553ad38\") " pod="openstack/nova-cell1-db-create-7h6r2" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.297142 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e2d3890-4772-476e-9850-fdb32111b87a-public-tls-certs\") pod \"cinder-api-0\" (UID: \"4e2d3890-4772-476e-9850-fdb32111b87a\") " pod="openstack/cinder-api-0" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.297206 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e2d3890-4772-476e-9850-fdb32111b87a-logs\") pod \"cinder-api-0\" (UID: \"4e2d3890-4772-476e-9850-fdb32111b87a\") " pod="openstack/cinder-api-0" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.297224 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqzqm\" (UniqueName: \"kubernetes.io/projected/4e2d3890-4772-476e-9850-fdb32111b87a-kube-api-access-kqzqm\") pod \"cinder-api-0\" (UID: \"4e2d3890-4772-476e-9850-fdb32111b87a\") " pod="openstack/cinder-api-0" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.297249 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e2d3890-4772-476e-9850-fdb32111b87a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4e2d3890-4772-476e-9850-fdb32111b87a\") " pod="openstack/cinder-api-0" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.297281 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e2d3890-4772-476e-9850-fdb32111b87a-scripts\") pod \"cinder-api-0\" (UID: \"4e2d3890-4772-476e-9850-fdb32111b87a\") " pod="openstack/cinder-api-0" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.297302 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9416e5ad-d0d8-404a-8fe8-8a5648030542-operator-scripts\") pod \"nova-cell1-5a93-account-create-update-48m59\" (UID: \"9416e5ad-d0d8-404a-8fe8-8a5648030542\") " pod="openstack/nova-cell1-5a93-account-create-update-48m59" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.297319 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4e2d3890-4772-476e-9850-fdb32111b87a-etc-machine-id\") pod 
\"cinder-api-0\" (UID: \"4e2d3890-4772-476e-9850-fdb32111b87a\") " pod="openstack/cinder-api-0" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.297419 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2r48\" (UniqueName: \"kubernetes.io/projected/9416e5ad-d0d8-404a-8fe8-8a5648030542-kube-api-access-m2r48\") pod \"nova-cell1-5a93-account-create-update-48m59\" (UID: \"9416e5ad-d0d8-404a-8fe8-8a5648030542\") " pod="openstack/nova-cell1-5a93-account-create-update-48m59" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.297445 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nprzk\" (UniqueName: \"kubernetes.io/projected/a1bda83d-ff07-4b0c-a978-a22373590d0f-kube-api-access-nprzk\") pod \"nova-cell0-1694-account-create-update-9dbhz\" (UID: \"a1bda83d-ff07-4b0c-a978-a22373590d0f\") " pod="openstack/nova-cell0-1694-account-create-update-9dbhz" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.297580 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1bda83d-ff07-4b0c-a978-a22373590d0f-operator-scripts\") pod \"nova-cell0-1694-account-create-update-9dbhz\" (UID: \"a1bda83d-ff07-4b0c-a978-a22373590d0f\") " pod="openstack/nova-cell0-1694-account-create-update-9dbhz" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.297966 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a89db488-2448-4a65-82ee-08472553ad38-operator-scripts\") pod \"nova-cell1-db-create-7h6r2\" (UID: \"a89db488-2448-4a65-82ee-08472553ad38\") " pod="openstack/nova-cell1-db-create-7h6r2" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.313853 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2dlb\" (UniqueName: \"kubernetes.io/projected/a89db488-2448-4a65-82ee-08472553ad38-kube-api-access-q2dlb\") pod \"nova-cell1-db-create-7h6r2\" (UID: \"a89db488-2448-4a65-82ee-08472553ad38\") " pod="openstack/nova-cell1-db-create-7h6r2" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.316337 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nprzk\" (UniqueName: \"kubernetes.io/projected/a1bda83d-ff07-4b0c-a978-a22373590d0f-kube-api-access-nprzk\") pod \"nova-cell0-1694-account-create-update-9dbhz\" (UID: \"a1bda83d-ff07-4b0c-a978-a22373590d0f\") " pod="openstack/nova-cell0-1694-account-create-update-9dbhz" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.324828 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-8af6-account-create-update-8c6xl" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.399255 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2r48\" (UniqueName: \"kubernetes.io/projected/9416e5ad-d0d8-404a-8fe8-8a5648030542-kube-api-access-m2r48\") pod \"nova-cell1-5a93-account-create-update-48m59\" (UID: \"9416e5ad-d0d8-404a-8fe8-8a5648030542\") " pod="openstack/nova-cell1-5a93-account-create-update-48m59" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.399315 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e2d3890-4772-476e-9850-fdb32111b87a-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"4e2d3890-4772-476e-9850-fdb32111b87a\") " pod="openstack/cinder-api-0" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.399362 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4e2d3890-4772-476e-9850-fdb32111b87a-config-data-custom\") pod \"cinder-api-0\" (UID: \"4e2d3890-4772-476e-9850-fdb32111b87a\") " pod="openstack/cinder-api-0" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.399395 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e2d3890-4772-476e-9850-fdb32111b87a-config-data\") pod \"cinder-api-0\" (UID: \"4e2d3890-4772-476e-9850-fdb32111b87a\") " pod="openstack/cinder-api-0" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.399428 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e2d3890-4772-476e-9850-fdb32111b87a-public-tls-certs\") pod \"cinder-api-0\" (UID: \"4e2d3890-4772-476e-9850-fdb32111b87a\") " pod="openstack/cinder-api-0" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.399453 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e2d3890-4772-476e-9850-fdb32111b87a-logs\") pod \"cinder-api-0\" (UID: \"4e2d3890-4772-476e-9850-fdb32111b87a\") " pod="openstack/cinder-api-0" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.399468 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqzqm\" (UniqueName: \"kubernetes.io/projected/4e2d3890-4772-476e-9850-fdb32111b87a-kube-api-access-kqzqm\") pod \"cinder-api-0\" (UID: \"4e2d3890-4772-476e-9850-fdb32111b87a\") " pod="openstack/cinder-api-0" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.399484 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e2d3890-4772-476e-9850-fdb32111b87a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4e2d3890-4772-476e-9850-fdb32111b87a\") " pod="openstack/cinder-api-0" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.399503 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e2d3890-4772-476e-9850-fdb32111b87a-scripts\") pod \"cinder-api-0\" (UID: \"4e2d3890-4772-476e-9850-fdb32111b87a\") " pod="openstack/cinder-api-0" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.399519 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/9416e5ad-d0d8-404a-8fe8-8a5648030542-operator-scripts\") pod \"nova-cell1-5a93-account-create-update-48m59\" (UID: \"9416e5ad-d0d8-404a-8fe8-8a5648030542\") " pod="openstack/nova-cell1-5a93-account-create-update-48m59" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.399534 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4e2d3890-4772-476e-9850-fdb32111b87a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4e2d3890-4772-476e-9850-fdb32111b87a\") " pod="openstack/cinder-api-0" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.399612 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4e2d3890-4772-476e-9850-fdb32111b87a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4e2d3890-4772-476e-9850-fdb32111b87a\") " pod="openstack/cinder-api-0" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.400468 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e2d3890-4772-476e-9850-fdb32111b87a-logs\") pod \"cinder-api-0\" (UID: \"4e2d3890-4772-476e-9850-fdb32111b87a\") " pod="openstack/cinder-api-0" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.403287 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e2d3890-4772-476e-9850-fdb32111b87a-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"4e2d3890-4772-476e-9850-fdb32111b87a\") " pod="openstack/cinder-api-0" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.403624 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9416e5ad-d0d8-404a-8fe8-8a5648030542-operator-scripts\") pod \"nova-cell1-5a93-account-create-update-48m59\" (UID: \"9416e5ad-d0d8-404a-8fe8-8a5648030542\") " pod="openstack/nova-cell1-5a93-account-create-update-48m59" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.405528 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e2d3890-4772-476e-9850-fdb32111b87a-public-tls-certs\") pod \"cinder-api-0\" (UID: \"4e2d3890-4772-476e-9850-fdb32111b87a\") " pod="openstack/cinder-api-0" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.404973 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e2d3890-4772-476e-9850-fdb32111b87a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4e2d3890-4772-476e-9850-fdb32111b87a\") " pod="openstack/cinder-api-0" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.406706 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e2d3890-4772-476e-9850-fdb32111b87a-scripts\") pod \"cinder-api-0\" (UID: \"4e2d3890-4772-476e-9850-fdb32111b87a\") " pod="openstack/cinder-api-0" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.407171 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e2d3890-4772-476e-9850-fdb32111b87a-config-data\") pod \"cinder-api-0\" (UID: \"4e2d3890-4772-476e-9850-fdb32111b87a\") " pod="openstack/cinder-api-0" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.407355 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4e2d3890-4772-476e-9850-fdb32111b87a-config-data-custom\") pod \"cinder-api-0\" (UID: \"4e2d3890-4772-476e-9850-fdb32111b87a\") " pod="openstack/cinder-api-0" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.407655 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-7h6r2" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.422064 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2r48\" (UniqueName: \"kubernetes.io/projected/9416e5ad-d0d8-404a-8fe8-8a5648030542-kube-api-access-m2r48\") pod \"nova-cell1-5a93-account-create-update-48m59\" (UID: \"9416e5ad-d0d8-404a-8fe8-8a5648030542\") " pod="openstack/nova-cell1-5a93-account-create-update-48m59" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.424654 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqzqm\" (UniqueName: \"kubernetes.io/projected/4e2d3890-4772-476e-9850-fdb32111b87a-kube-api-access-kqzqm\") pod \"cinder-api-0\" (UID: \"4e2d3890-4772-476e-9850-fdb32111b87a\") " pod="openstack/cinder-api-0" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.430405 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-1694-account-create-update-9dbhz" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.445543 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.473914 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-6q5kh"] Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.538778 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-9cbdc4d89-pkh64"] Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.544310 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9cbdc4d89-pkh64" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.548017 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.549116 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.556340 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-9cbdc4d89-pkh64"] Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.564110 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-5a93-account-create-update-48m59" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.662737 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-6q5kh" event={"ID":"6140504f-30c8-4ec8-9659-fc4f2795c6a2","Type":"ContainerStarted","Data":"eacdec5868e09b055c3fb5495fdbc212386a23e358266720b94e90ba73ce7d23"} Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.704981 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f074040a-1272-492e-b149-3a0a6cc89efd-internal-tls-certs\") pod \"neutron-9cbdc4d89-pkh64\" (UID: \"f074040a-1272-492e-b149-3a0a6cc89efd\") " pod="openstack/neutron-9cbdc4d89-pkh64" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.705024 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94ggh\" (UniqueName: \"kubernetes.io/projected/f074040a-1272-492e-b149-3a0a6cc89efd-kube-api-access-94ggh\") pod \"neutron-9cbdc4d89-pkh64\" (UID: \"f074040a-1272-492e-b149-3a0a6cc89efd\") " pod="openstack/neutron-9cbdc4d89-pkh64" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.705121 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f074040a-1272-492e-b149-3a0a6cc89efd-httpd-config\") pod \"neutron-9cbdc4d89-pkh64\" (UID: \"f074040a-1272-492e-b149-3a0a6cc89efd\") " pod="openstack/neutron-9cbdc4d89-pkh64" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.705145 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f074040a-1272-492e-b149-3a0a6cc89efd-combined-ca-bundle\") pod \"neutron-9cbdc4d89-pkh64\" (UID: \"f074040a-1272-492e-b149-3a0a6cc89efd\") " pod="openstack/neutron-9cbdc4d89-pkh64" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.705163 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f074040a-1272-492e-b149-3a0a6cc89efd-ovndb-tls-certs\") pod \"neutron-9cbdc4d89-pkh64\" (UID: \"f074040a-1272-492e-b149-3a0a6cc89efd\") " pod="openstack/neutron-9cbdc4d89-pkh64" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.705219 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f074040a-1272-492e-b149-3a0a6cc89efd-public-tls-certs\") pod \"neutron-9cbdc4d89-pkh64\" (UID: \"f074040a-1272-492e-b149-3a0a6cc89efd\") " pod="openstack/neutron-9cbdc4d89-pkh64" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.705253 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f074040a-1272-492e-b149-3a0a6cc89efd-config\") pod \"neutron-9cbdc4d89-pkh64\" (UID: \"f074040a-1272-492e-b149-3a0a6cc89efd\") " pod="openstack/neutron-9cbdc4d89-pkh64" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.806733 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f074040a-1272-492e-b149-3a0a6cc89efd-public-tls-certs\") pod \"neutron-9cbdc4d89-pkh64\" (UID: \"f074040a-1272-492e-b149-3a0a6cc89efd\") " 
pod="openstack/neutron-9cbdc4d89-pkh64" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.808044 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f074040a-1272-492e-b149-3a0a6cc89efd-config\") pod \"neutron-9cbdc4d89-pkh64\" (UID: \"f074040a-1272-492e-b149-3a0a6cc89efd\") " pod="openstack/neutron-9cbdc4d89-pkh64" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.808130 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f074040a-1272-492e-b149-3a0a6cc89efd-internal-tls-certs\") pod \"neutron-9cbdc4d89-pkh64\" (UID: \"f074040a-1272-492e-b149-3a0a6cc89efd\") " pod="openstack/neutron-9cbdc4d89-pkh64" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.808171 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94ggh\" (UniqueName: \"kubernetes.io/projected/f074040a-1272-492e-b149-3a0a6cc89efd-kube-api-access-94ggh\") pod \"neutron-9cbdc4d89-pkh64\" (UID: \"f074040a-1272-492e-b149-3a0a6cc89efd\") " pod="openstack/neutron-9cbdc4d89-pkh64" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.808330 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f074040a-1272-492e-b149-3a0a6cc89efd-httpd-config\") pod \"neutron-9cbdc4d89-pkh64\" (UID: \"f074040a-1272-492e-b149-3a0a6cc89efd\") " pod="openstack/neutron-9cbdc4d89-pkh64" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.808346 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f074040a-1272-492e-b149-3a0a6cc89efd-combined-ca-bundle\") pod \"neutron-9cbdc4d89-pkh64\" (UID: \"f074040a-1272-492e-b149-3a0a6cc89efd\") " pod="openstack/neutron-9cbdc4d89-pkh64" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.808361 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f074040a-1272-492e-b149-3a0a6cc89efd-ovndb-tls-certs\") pod \"neutron-9cbdc4d89-pkh64\" (UID: \"f074040a-1272-492e-b149-3a0a6cc89efd\") " pod="openstack/neutron-9cbdc4d89-pkh64" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.813808 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f074040a-1272-492e-b149-3a0a6cc89efd-config\") pod \"neutron-9cbdc4d89-pkh64\" (UID: \"f074040a-1272-492e-b149-3a0a6cc89efd\") " pod="openstack/neutron-9cbdc4d89-pkh64" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.820086 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f074040a-1272-492e-b149-3a0a6cc89efd-internal-tls-certs\") pod \"neutron-9cbdc4d89-pkh64\" (UID: \"f074040a-1272-492e-b149-3a0a6cc89efd\") " pod="openstack/neutron-9cbdc4d89-pkh64" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.821606 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f074040a-1272-492e-b149-3a0a6cc89efd-combined-ca-bundle\") pod \"neutron-9cbdc4d89-pkh64\" (UID: \"f074040a-1272-492e-b149-3a0a6cc89efd\") " pod="openstack/neutron-9cbdc4d89-pkh64" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.821783 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f074040a-1272-492e-b149-3a0a6cc89efd-public-tls-certs\") pod \"neutron-9cbdc4d89-pkh64\" (UID: \"f074040a-1272-492e-b149-3a0a6cc89efd\") " pod="openstack/neutron-9cbdc4d89-pkh64" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.822254 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f074040a-1272-492e-b149-3a0a6cc89efd-httpd-config\") pod \"neutron-9cbdc4d89-pkh64\" (UID: \"f074040a-1272-492e-b149-3a0a6cc89efd\") " pod="openstack/neutron-9cbdc4d89-pkh64" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.823488 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f074040a-1272-492e-b149-3a0a6cc89efd-ovndb-tls-certs\") pod \"neutron-9cbdc4d89-pkh64\" (UID: \"f074040a-1272-492e-b149-3a0a6cc89efd\") " pod="openstack/neutron-9cbdc4d89-pkh64" Dec 02 09:45:58 crc kubenswrapper[4781]: I1202 09:45:58.835656 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94ggh\" (UniqueName: \"kubernetes.io/projected/f074040a-1272-492e-b149-3a0a6cc89efd-kube-api-access-94ggh\") pod \"neutron-9cbdc4d89-pkh64\" (UID: \"f074040a-1272-492e-b149-3a0a6cc89efd\") " pod="openstack/neutron-9cbdc4d89-pkh64" Dec 02 09:45:59 crc kubenswrapper[4781]: I1202 09:45:59.012623 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9cbdc4d89-pkh64" Dec 02 09:45:59 crc kubenswrapper[4781]: I1202 09:45:59.200230 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-dd8rn"] Dec 02 09:45:59 crc kubenswrapper[4781]: W1202 09:45:59.209484 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2718c51_467e_407a_99b1_266956fcacfa.slice/crio-40d8da9ad00f4baa9b2e9ea35e1afdf1faa774e31bea74e9a6ac56dbc832733b WatchSource:0}: Error finding container 40d8da9ad00f4baa9b2e9ea35e1afdf1faa774e31bea74e9a6ac56dbc832733b: Status 404 returned error can't find the container with id 40d8da9ad00f4baa9b2e9ea35e1afdf1faa774e31bea74e9a6ac56dbc832733b Dec 02 09:45:59 crc kubenswrapper[4781]: I1202 09:45:59.539250 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b315a4b-f4f2-4066-a945-f168e5f76f49" path="/var/lib/kubelet/pods/0b315a4b-f4f2-4066-a945-f168e5f76f49/volumes" Dec 02 09:45:59 crc kubenswrapper[4781]: I1202 09:45:59.558563 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 09:45:59 crc kubenswrapper[4781]: I1202 09:45:59.559173 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8a44acc8-6e3c-4643-9b69-34f54b53c188" containerName="glance-log" containerID="cri-o://2626694bb2556bccbe5290d2d9b12d69989c1d612b985be1232da4748cb26470" gracePeriod=30 Dec 02 09:45:59 crc kubenswrapper[4781]: I1202 09:45:59.559702 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8a44acc8-6e3c-4643-9b69-34f54b53c188" containerName="glance-httpd" containerID="cri-o://3ccee59f51f4b55106b87ba1b23d1a23f762a2b56edb990a2065c1a6fc1bcfdb" gracePeriod=30 Dec 02 09:45:59 crc kubenswrapper[4781]: I1202 09:45:59.706246 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-8af6-account-create-update-8c6xl"] Dec 
02 09:45:59 crc kubenswrapper[4781]: I1202 09:45:59.716400 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-dd8rn" event={"ID":"e2718c51-467e-407a-99b1-266956fcacfa","Type":"ContainerStarted","Data":"31948cc1c7fdc3245b056d8db87415ad886b621d820d68ecbbf81b0725c2cd37"} Dec 02 09:45:59 crc kubenswrapper[4781]: I1202 09:45:59.716437 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-dd8rn" event={"ID":"e2718c51-467e-407a-99b1-266956fcacfa","Type":"ContainerStarted","Data":"40d8da9ad00f4baa9b2e9ea35e1afdf1faa774e31bea74e9a6ac56dbc832733b"} Dec 02 09:45:59 crc kubenswrapper[4781]: I1202 09:45:59.728527 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-5a93-account-create-update-48m59"] Dec 02 09:45:59 crc kubenswrapper[4781]: I1202 09:45:59.767810 4781 generic.go:334] "Generic (PLEG): container finished" podID="6140504f-30c8-4ec8-9659-fc4f2795c6a2" containerID="af8d6eab68d6b452cb1be0960bab4eb06d889f44fddc6f4cfbb8650f6a84a808" exitCode=0 Dec 02 09:45:59 crc kubenswrapper[4781]: I1202 09:45:59.769291 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-6q5kh" event={"ID":"6140504f-30c8-4ec8-9659-fc4f2795c6a2","Type":"ContainerDied","Data":"af8d6eab68d6b452cb1be0960bab4eb06d889f44fddc6f4cfbb8650f6a84a808"} Dec 02 09:45:59 crc kubenswrapper[4781]: I1202 09:45:59.773693 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 02 09:45:59 crc kubenswrapper[4781]: I1202 09:45:59.798650 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-1694-account-create-update-9dbhz"] Dec 02 09:45:59 crc kubenswrapper[4781]: I1202 09:45:59.859008 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-7h6r2"] Dec 02 09:46:00 crc kubenswrapper[4781]: I1202 09:46:00.052519 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-9cbdc4d89-pkh64"] Dec 02 09:46:00 crc kubenswrapper[4781]: I1202 09:46:00.708194 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zh57t"] Dec 02 09:46:00 crc kubenswrapper[4781]: I1202 09:46:00.710554 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zh57t" Dec 02 09:46:00 crc kubenswrapper[4781]: I1202 09:46:00.734962 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zh57t"] Dec 02 09:46:00 crc kubenswrapper[4781]: I1202 09:46:00.798968 4781 generic.go:334] "Generic (PLEG): container finished" podID="269490ed-f426-4577-b60d-695881e29aac" containerID="a974bce8f6f1af663c342580c805ddbaa6097b40912021df63435e438216c135" exitCode=0 Dec 02 09:46:00 crc kubenswrapper[4781]: I1202 09:46:00.799019 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"269490ed-f426-4577-b60d-695881e29aac","Type":"ContainerDied","Data":"a974bce8f6f1af663c342580c805ddbaa6097b40912021df63435e438216c135"} Dec 02 09:46:00 crc kubenswrapper[4781]: I1202 09:46:00.810766 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9cbdc4d89-pkh64" event={"ID":"f074040a-1272-492e-b149-3a0a6cc89efd","Type":"ContainerStarted","Data":"22da6113d3bab7d55bc9cb2b49a1dcd063f2c8952383b16ccba516f459b7a8f9"} Dec 02 09:46:00 crc kubenswrapper[4781]: I1202 09:46:00.810811 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9cbdc4d89-pkh64" event={"ID":"f074040a-1272-492e-b149-3a0a6cc89efd","Type":"ContainerStarted","Data":"a919b84cc7f089f409b928d4e2e41de25911879d40a329113a979c5b7a231340"} Dec 02 09:46:00 crc kubenswrapper[4781]: I1202 09:46:00.825780 4781 generic.go:334] "Generic (PLEG): container finished" podID="9416e5ad-d0d8-404a-8fe8-8a5648030542" containerID="e8726982453fea62e8096c6eef8a1e6229c53551f4c77463f1662e601f13f38e" exitCode=0 Dec 02 09:46:00 crc kubenswrapper[4781]: I1202 09:46:00.825875 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-5a93-account-create-update-48m59" event={"ID":"9416e5ad-d0d8-404a-8fe8-8a5648030542","Type":"ContainerDied","Data":"e8726982453fea62e8096c6eef8a1e6229c53551f4c77463f1662e601f13f38e"} Dec 02 09:46:00 crc kubenswrapper[4781]: I1202 09:46:00.825900 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-5a93-account-create-update-48m59" event={"ID":"9416e5ad-d0d8-404a-8fe8-8a5648030542","Type":"ContainerStarted","Data":"b0b4cc38f09ef881f05c474352969ce611aaec046b43ea68a31d9eef8d73369f"} Dec 02 09:46:00 crc kubenswrapper[4781]: I1202 09:46:00.828345 4781 generic.go:334] "Generic (PLEG): container finished" podID="a89db488-2448-4a65-82ee-08472553ad38" containerID="0c34967ec78eebb883230e5b26ffff48d377040317caae0756d0210120bfa3df" exitCode=0 Dec 02 09:46:00 crc kubenswrapper[4781]: I1202 09:46:00.828402 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-7h6r2" event={"ID":"a89db488-2448-4a65-82ee-08472553ad38","Type":"ContainerDied","Data":"0c34967ec78eebb883230e5b26ffff48d377040317caae0756d0210120bfa3df"} Dec 02 09:46:00 crc kubenswrapper[4781]: I1202 09:46:00.828424 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-7h6r2" event={"ID":"a89db488-2448-4a65-82ee-08472553ad38","Type":"ContainerStarted","Data":"a8c19460121577cd3ca1b78869fc5d2d4bf21485151a50de54f6354e85812daf"} Dec 02 09:46:00 crc kubenswrapper[4781]: I1202 09:46:00.831268 4781 generic.go:334] "Generic (PLEG): container finished" podID="e2718c51-467e-407a-99b1-266956fcacfa" containerID="31948cc1c7fdc3245b056d8db87415ad886b621d820d68ecbbf81b0725c2cd37" exitCode=0 Dec 02 09:46:00 crc kubenswrapper[4781]: I1202 09:46:00.831505 4781 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-dd8rn" event={"ID":"e2718c51-467e-407a-99b1-266956fcacfa","Type":"ContainerDied","Data":"31948cc1c7fdc3245b056d8db87415ad886b621d820d68ecbbf81b0725c2cd37"} Dec 02 09:46:00 crc kubenswrapper[4781]: I1202 09:46:00.845976 4781 generic.go:334] "Generic (PLEG): container finished" podID="8a44acc8-6e3c-4643-9b69-34f54b53c188" containerID="2626694bb2556bccbe5290d2d9b12d69989c1d612b985be1232da4748cb26470" exitCode=143 Dec 02 09:46:00 crc kubenswrapper[4781]: I1202 09:46:00.846065 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8a44acc8-6e3c-4643-9b69-34f54b53c188","Type":"ContainerDied","Data":"2626694bb2556bccbe5290d2d9b12d69989c1d612b985be1232da4748cb26470"} Dec 02 09:46:00 crc kubenswrapper[4781]: I1202 09:46:00.853937 4781 generic.go:334] "Generic (PLEG): container finished" podID="a1bda83d-ff07-4b0c-a978-a22373590d0f" containerID="f98b7dc49839073c5670111481c853628d81568b06c050e36f0febeb821fa3ef" exitCode=0 Dec 02 09:46:00 crc kubenswrapper[4781]: I1202 09:46:00.854008 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-1694-account-create-update-9dbhz" event={"ID":"a1bda83d-ff07-4b0c-a978-a22373590d0f","Type":"ContainerDied","Data":"f98b7dc49839073c5670111481c853628d81568b06c050e36f0febeb821fa3ef"} Dec 02 09:46:00 crc kubenswrapper[4781]: I1202 09:46:00.854033 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-1694-account-create-update-9dbhz" event={"ID":"a1bda83d-ff07-4b0c-a978-a22373590d0f","Type":"ContainerStarted","Data":"525c1d009935ae9c2769e367a4a7628dda0d4527e8d139b31dba9e3673c88a59"} Dec 02 09:46:00 crc kubenswrapper[4781]: I1202 09:46:00.870152 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6592\" (UniqueName: \"kubernetes.io/projected/5a492eac-186c-4c73-9f70-59f989ea3169-kube-api-access-c6592\") pod \"redhat-operators-zh57t\" (UID: \"5a492eac-186c-4c73-9f70-59f989ea3169\") " pod="openshift-marketplace/redhat-operators-zh57t" Dec 02 09:46:00 crc kubenswrapper[4781]: I1202 09:46:00.870248 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a492eac-186c-4c73-9f70-59f989ea3169-catalog-content\") pod \"redhat-operators-zh57t\" (UID: \"5a492eac-186c-4c73-9f70-59f989ea3169\") " pod="openshift-marketplace/redhat-operators-zh57t" Dec 02 09:46:00 crc kubenswrapper[4781]: I1202 09:46:00.870339 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a492eac-186c-4c73-9f70-59f989ea3169-utilities\") pod \"redhat-operators-zh57t\" (UID: \"5a492eac-186c-4c73-9f70-59f989ea3169\") " pod="openshift-marketplace/redhat-operators-zh57t" Dec 02 09:46:00 crc kubenswrapper[4781]: I1202 09:46:00.871075 4781 generic.go:334] "Generic (PLEG): container finished" podID="94a18aa9-0687-4427-942c-d343a3f0c5f4" containerID="0e9d31f865f2281ef6607d410d491ab19970c696b96d06a61b5d1f515e6d68da" exitCode=0 Dec 02 09:46:00 crc kubenswrapper[4781]: I1202 09:46:00.871243 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8af6-account-create-update-8c6xl" event={"ID":"94a18aa9-0687-4427-942c-d343a3f0c5f4","Type":"ContainerDied","Data":"0e9d31f865f2281ef6607d410d491ab19970c696b96d06a61b5d1f515e6d68da"} Dec 02 09:46:00 
crc kubenswrapper[4781]: I1202 09:46:00.871267 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8af6-account-create-update-8c6xl" event={"ID":"94a18aa9-0687-4427-942c-d343a3f0c5f4","Type":"ContainerStarted","Data":"c2335889bd92343307f04ceecb6409ed1177593079a416c619db54cec4d2da2e"} Dec 02 09:46:00 crc kubenswrapper[4781]: I1202 09:46:00.875881 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4e2d3890-4772-476e-9850-fdb32111b87a","Type":"ContainerStarted","Data":"cfc7ba1b4520888d19bb3acd801fc3e4d29b4bf9f70b3c91d1e01c6d11971e97"} Dec 02 09:46:00 crc kubenswrapper[4781]: I1202 09:46:00.971882 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a492eac-186c-4c73-9f70-59f989ea3169-catalog-content\") pod \"redhat-operators-zh57t\" (UID: \"5a492eac-186c-4c73-9f70-59f989ea3169\") " pod="openshift-marketplace/redhat-operators-zh57t" Dec 02 09:46:00 crc kubenswrapper[4781]: I1202 09:46:00.971997 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a492eac-186c-4c73-9f70-59f989ea3169-utilities\") pod \"redhat-operators-zh57t\" (UID: \"5a492eac-186c-4c73-9f70-59f989ea3169\") " pod="openshift-marketplace/redhat-operators-zh57t" Dec 02 09:46:00 crc kubenswrapper[4781]: I1202 09:46:00.972111 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6592\" (UniqueName: \"kubernetes.io/projected/5a492eac-186c-4c73-9f70-59f989ea3169-kube-api-access-c6592\") pod \"redhat-operators-zh57t\" (UID: \"5a492eac-186c-4c73-9f70-59f989ea3169\") " pod="openshift-marketplace/redhat-operators-zh57t" Dec 02 09:46:00 crc kubenswrapper[4781]: I1202 09:46:00.973663 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a492eac-186c-4c73-9f70-59f989ea3169-utilities\") pod \"redhat-operators-zh57t\" (UID: \"5a492eac-186c-4c73-9f70-59f989ea3169\") " pod="openshift-marketplace/redhat-operators-zh57t" Dec 02 09:46:00 crc kubenswrapper[4781]: I1202 09:46:00.974957 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a492eac-186c-4c73-9f70-59f989ea3169-catalog-content\") pod \"redhat-operators-zh57t\" (UID: \"5a492eac-186c-4c73-9f70-59f989ea3169\") " pod="openshift-marketplace/redhat-operators-zh57t" Dec 02 09:46:00 crc kubenswrapper[4781]: I1202 09:46:00.999240 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6592\" (UniqueName: \"kubernetes.io/projected/5a492eac-186c-4c73-9f70-59f989ea3169-kube-api-access-c6592\") pod \"redhat-operators-zh57t\" (UID: \"5a492eac-186c-4c73-9f70-59f989ea3169\") " pod="openshift-marketplace/redhat-operators-zh57t" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.190031 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.246409 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zh57t" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.269441 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-f677cfc78-nzcxt"] Dec 02 09:46:01 crc kubenswrapper[4781]: E1202 09:46:01.269995 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="269490ed-f426-4577-b60d-695881e29aac" containerName="glance-log" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.270011 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="269490ed-f426-4577-b60d-695881e29aac" containerName="glance-log" Dec 02 09:46:01 crc kubenswrapper[4781]: E1202 09:46:01.270044 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="269490ed-f426-4577-b60d-695881e29aac" containerName="glance-httpd" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.270051 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="269490ed-f426-4577-b60d-695881e29aac" containerName="glance-httpd" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.270302 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="269490ed-f426-4577-b60d-695881e29aac" containerName="glance-httpd" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.270338 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="269490ed-f426-4577-b60d-695881e29aac" containerName="glance-log" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.274866 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-f677cfc78-nzcxt" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.277464 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/269490ed-f426-4577-b60d-695881e29aac-combined-ca-bundle\") pod \"269490ed-f426-4577-b60d-695881e29aac\" (UID: \"269490ed-f426-4577-b60d-695881e29aac\") " Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.277521 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"269490ed-f426-4577-b60d-695881e29aac\" (UID: \"269490ed-f426-4577-b60d-695881e29aac\") " Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.277580 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqjjd\" (UniqueName: \"kubernetes.io/projected/269490ed-f426-4577-b60d-695881e29aac-kube-api-access-zqjjd\") pod \"269490ed-f426-4577-b60d-695881e29aac\" (UID: \"269490ed-f426-4577-b60d-695881e29aac\") " Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.277610 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/269490ed-f426-4577-b60d-695881e29aac-config-data\") pod \"269490ed-f426-4577-b60d-695881e29aac\" (UID: \"269490ed-f426-4577-b60d-695881e29aac\") " Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.277696 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/269490ed-f426-4577-b60d-695881e29aac-httpd-run\") pod \"269490ed-f426-4577-b60d-695881e29aac\" (UID: \"269490ed-f426-4577-b60d-695881e29aac\") " Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.277732 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/269490ed-f426-4577-b60d-695881e29aac-logs\") pod \"269490ed-f426-4577-b60d-695881e29aac\" (UID: \"269490ed-f426-4577-b60d-695881e29aac\") " Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.277760 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/269490ed-f426-4577-b60d-695881e29aac-public-tls-certs\") pod \"269490ed-f426-4577-b60d-695881e29aac\" (UID: \"269490ed-f426-4577-b60d-695881e29aac\") " Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.277868 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/269490ed-f426-4577-b60d-695881e29aac-scripts\") pod \"269490ed-f426-4577-b60d-695881e29aac\" (UID: \"269490ed-f426-4577-b60d-695881e29aac\") " Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.278166 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.280886 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/269490ed-f426-4577-b60d-695881e29aac-logs" (OuterVolumeSpecName: "logs") pod "269490ed-f426-4577-b60d-695881e29aac" (UID: "269490ed-f426-4577-b60d-695881e29aac"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.281092 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/269490ed-f426-4577-b60d-695881e29aac-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "269490ed-f426-4577-b60d-695881e29aac" (UID: "269490ed-f426-4577-b60d-695881e29aac"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.281748 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.307262 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/269490ed-f426-4577-b60d-695881e29aac-kube-api-access-zqjjd" (OuterVolumeSpecName: "kube-api-access-zqjjd") pod "269490ed-f426-4577-b60d-695881e29aac" (UID: "269490ed-f426-4577-b60d-695881e29aac"). InnerVolumeSpecName "kube-api-access-zqjjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.310071 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/269490ed-f426-4577-b60d-695881e29aac-scripts" (OuterVolumeSpecName: "scripts") pod "269490ed-f426-4577-b60d-695881e29aac" (UID: "269490ed-f426-4577-b60d-695881e29aac"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.321323 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "269490ed-f426-4577-b60d-695881e29aac" (UID: "269490ed-f426-4577-b60d-695881e29aac"). InnerVolumeSpecName "local-storage08-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.359025 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-f677cfc78-nzcxt"] Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.383879 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/17137e34-c042-4c3b-b11b-3e743e2a00b5-public-tls-certs\") pod \"barbican-api-f677cfc78-nzcxt\" (UID: \"17137e34-c042-4c3b-b11b-3e743e2a00b5\") " pod="openstack/barbican-api-f677cfc78-nzcxt" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.383957 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/17137e34-c042-4c3b-b11b-3e743e2a00b5-config-data-custom\") pod \"barbican-api-f677cfc78-nzcxt\" (UID: \"17137e34-c042-4c3b-b11b-3e743e2a00b5\") " pod="openstack/barbican-api-f677cfc78-nzcxt" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.384018 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17137e34-c042-4c3b-b11b-3e743e2a00b5-config-data\") pod \"barbican-api-f677cfc78-nzcxt\" (UID: \"17137e34-c042-4c3b-b11b-3e743e2a00b5\") " pod="openstack/barbican-api-f677cfc78-nzcxt" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.384047 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qbkm\" (UniqueName: \"kubernetes.io/projected/17137e34-c042-4c3b-b11b-3e743e2a00b5-kube-api-access-6qbkm\") pod \"barbican-api-f677cfc78-nzcxt\" (UID: \"17137e34-c042-4c3b-b11b-3e743e2a00b5\") " pod="openstack/barbican-api-f677cfc78-nzcxt" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.384094 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17137e34-c042-4c3b-b11b-3e743e2a00b5-logs\") pod \"barbican-api-f677cfc78-nzcxt\" (UID: \"17137e34-c042-4c3b-b11b-3e743e2a00b5\") " pod="openstack/barbican-api-f677cfc78-nzcxt" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.384116 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/17137e34-c042-4c3b-b11b-3e743e2a00b5-internal-tls-certs\") pod \"barbican-api-f677cfc78-nzcxt\" (UID: \"17137e34-c042-4c3b-b11b-3e743e2a00b5\") " pod="openstack/barbican-api-f677cfc78-nzcxt" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.384195 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17137e34-c042-4c3b-b11b-3e743e2a00b5-combined-ca-bundle\") pod \"barbican-api-f677cfc78-nzcxt\" (UID: \"17137e34-c042-4c3b-b11b-3e743e2a00b5\") " pod="openstack/barbican-api-f677cfc78-nzcxt" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.384250 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqjjd\" (UniqueName: \"kubernetes.io/projected/269490ed-f426-4577-b60d-695881e29aac-kube-api-access-zqjjd\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.384265 4781 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/269490ed-f426-4577-b60d-695881e29aac-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.384277 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/269490ed-f426-4577-b60d-695881e29aac-logs\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.384288 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/269490ed-f426-4577-b60d-695881e29aac-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.384309 4781 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.459718 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/269490ed-f426-4577-b60d-695881e29aac-config-data" (OuterVolumeSpecName: "config-data") pod "269490ed-f426-4577-b60d-695881e29aac" (UID: "269490ed-f426-4577-b60d-695881e29aac"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.460524 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/269490ed-f426-4577-b60d-695881e29aac-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "269490ed-f426-4577-b60d-695881e29aac" (UID: "269490ed-f426-4577-b60d-695881e29aac"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.466165 4781 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.466666 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/269490ed-f426-4577-b60d-695881e29aac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "269490ed-f426-4577-b60d-695881e29aac" (UID: "269490ed-f426-4577-b60d-695881e29aac"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.496529 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/17137e34-c042-4c3b-b11b-3e743e2a00b5-public-tls-certs\") pod \"barbican-api-f677cfc78-nzcxt\" (UID: \"17137e34-c042-4c3b-b11b-3e743e2a00b5\") " pod="openstack/barbican-api-f677cfc78-nzcxt" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.496604 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/17137e34-c042-4c3b-b11b-3e743e2a00b5-config-data-custom\") pod \"barbican-api-f677cfc78-nzcxt\" (UID: \"17137e34-c042-4c3b-b11b-3e743e2a00b5\") " pod="openstack/barbican-api-f677cfc78-nzcxt" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.496721 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17137e34-c042-4c3b-b11b-3e743e2a00b5-config-data\") pod \"barbican-api-f677cfc78-nzcxt\" (UID: \"17137e34-c042-4c3b-b11b-3e743e2a00b5\") " pod="openstack/barbican-api-f677cfc78-nzcxt" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.496776 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qbkm\" (UniqueName: \"kubernetes.io/projected/17137e34-c042-4c3b-b11b-3e743e2a00b5-kube-api-access-6qbkm\") pod \"barbican-api-f677cfc78-nzcxt\" (UID: \"17137e34-c042-4c3b-b11b-3e743e2a00b5\") " pod="openstack/barbican-api-f677cfc78-nzcxt" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.496852 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17137e34-c042-4c3b-b11b-3e743e2a00b5-logs\") pod \"barbican-api-f677cfc78-nzcxt\" (UID: \"17137e34-c042-4c3b-b11b-3e743e2a00b5\") " pod="openstack/barbican-api-f677cfc78-nzcxt" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.496879 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/17137e34-c042-4c3b-b11b-3e743e2a00b5-internal-tls-certs\") pod \"barbican-api-f677cfc78-nzcxt\" (UID: \"17137e34-c042-4c3b-b11b-3e743e2a00b5\") " pod="openstack/barbican-api-f677cfc78-nzcxt" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.497048 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17137e34-c042-4c3b-b11b-3e743e2a00b5-combined-ca-bundle\") pod \"barbican-api-f677cfc78-nzcxt\" (UID: \"17137e34-c042-4c3b-b11b-3e743e2a00b5\") " pod="openstack/barbican-api-f677cfc78-nzcxt" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.497146 4781 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/269490ed-f426-4577-b60d-695881e29aac-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.497162 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/269490ed-f426-4577-b60d-695881e29aac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.497174 4781 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" 
DevicePath \"\"" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.497186 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/269490ed-f426-4577-b60d-695881e29aac-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.503578 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17137e34-c042-4c3b-b11b-3e743e2a00b5-logs\") pod \"barbican-api-f677cfc78-nzcxt\" (UID: \"17137e34-c042-4c3b-b11b-3e743e2a00b5\") " pod="openstack/barbican-api-f677cfc78-nzcxt" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.511442 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17137e34-c042-4c3b-b11b-3e743e2a00b5-combined-ca-bundle\") pod \"barbican-api-f677cfc78-nzcxt\" (UID: \"17137e34-c042-4c3b-b11b-3e743e2a00b5\") " pod="openstack/barbican-api-f677cfc78-nzcxt" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.521509 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17137e34-c042-4c3b-b11b-3e743e2a00b5-config-data\") pod \"barbican-api-f677cfc78-nzcxt\" (UID: \"17137e34-c042-4c3b-b11b-3e743e2a00b5\") " pod="openstack/barbican-api-f677cfc78-nzcxt" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.518351 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/17137e34-c042-4c3b-b11b-3e743e2a00b5-config-data-custom\") pod \"barbican-api-f677cfc78-nzcxt\" (UID: \"17137e34-c042-4c3b-b11b-3e743e2a00b5\") " pod="openstack/barbican-api-f677cfc78-nzcxt" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.551214 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/17137e34-c042-4c3b-b11b-3e743e2a00b5-internal-tls-certs\") pod \"barbican-api-f677cfc78-nzcxt\" (UID: \"17137e34-c042-4c3b-b11b-3e743e2a00b5\") " pod="openstack/barbican-api-f677cfc78-nzcxt" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.551299 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/17137e34-c042-4c3b-b11b-3e743e2a00b5-public-tls-certs\") pod \"barbican-api-f677cfc78-nzcxt\" (UID: \"17137e34-c042-4c3b-b11b-3e743e2a00b5\") " pod="openstack/barbican-api-f677cfc78-nzcxt" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.565000 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qbkm\" (UniqueName: \"kubernetes.io/projected/17137e34-c042-4c3b-b11b-3e743e2a00b5-kube-api-access-6qbkm\") pod \"barbican-api-f677cfc78-nzcxt\" (UID: \"17137e34-c042-4c3b-b11b-3e743e2a00b5\") " pod="openstack/barbican-api-f677cfc78-nzcxt" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.620411 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-f677cfc78-nzcxt" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.656468 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-dd8rn" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.702225 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqn2x\" (UniqueName: \"kubernetes.io/projected/e2718c51-467e-407a-99b1-266956fcacfa-kube-api-access-wqn2x\") pod \"e2718c51-467e-407a-99b1-266956fcacfa\" (UID: \"e2718c51-467e-407a-99b1-266956fcacfa\") " Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.702393 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2718c51-467e-407a-99b1-266956fcacfa-operator-scripts\") pod \"e2718c51-467e-407a-99b1-266956fcacfa\" (UID: \"e2718c51-467e-407a-99b1-266956fcacfa\") " Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.706364 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2718c51-467e-407a-99b1-266956fcacfa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e2718c51-467e-407a-99b1-266956fcacfa" (UID: "e2718c51-467e-407a-99b1-266956fcacfa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.715107 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2718c51-467e-407a-99b1-266956fcacfa-kube-api-access-wqn2x" (OuterVolumeSpecName: "kube-api-access-wqn2x") pod "e2718c51-467e-407a-99b1-266956fcacfa" (UID: "e2718c51-467e-407a-99b1-266956fcacfa"). InnerVolumeSpecName "kube-api-access-wqn2x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.807631 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqn2x\" (UniqueName: \"kubernetes.io/projected/e2718c51-467e-407a-99b1-266956fcacfa-kube-api-access-wqn2x\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.807674 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2718c51-467e-407a-99b1-266956fcacfa-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.921803 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-6q5kh" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.932747 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4e2d3890-4772-476e-9850-fdb32111b87a","Type":"ContainerStarted","Data":"aad502759669794dd1d9475f0fe8158b04d471a73e7bc50c9ae552be5aceb7ec"} Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.951298 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-dd8rn" event={"ID":"e2718c51-467e-407a-99b1-266956fcacfa","Type":"ContainerDied","Data":"40d8da9ad00f4baa9b2e9ea35e1afdf1faa774e31bea74e9a6ac56dbc832733b"} Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.951348 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40d8da9ad00f4baa9b2e9ea35e1afdf1faa774e31bea74e9a6ac56dbc832733b" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.951434 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-dd8rn" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.958071 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9cbdc4d89-pkh64" event={"ID":"f074040a-1272-492e-b149-3a0a6cc89efd","Type":"ContainerStarted","Data":"bcb4a0b2840b9296a1624da38913c0e9f2e1eb809f7ddfa5fcb40642df8c8950"} Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.959455 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-9cbdc4d89-pkh64" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.978810 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-6q5kh" event={"ID":"6140504f-30c8-4ec8-9659-fc4f2795c6a2","Type":"ContainerDied","Data":"eacdec5868e09b055c3fb5495fdbc212386a23e358266720b94e90ba73ce7d23"} Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.978869 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eacdec5868e09b055c3fb5495fdbc212386a23e358266720b94e90ba73ce7d23" Dec 02 09:46:01 crc kubenswrapper[4781]: I1202 09:46:01.978897 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-6q5kh" Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.004207 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-9cbdc4d89-pkh64" podStartSLOduration=4.004186143 podStartE2EDuration="4.004186143s" podCreationTimestamp="2025-12-02 09:45:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:46:01.995103572 +0000 UTC m=+1524.818977451" watchObservedRunningTime="2025-12-02 09:46:02.004186143 +0000 UTC m=+1524.828060022" Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.014538 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6140504f-30c8-4ec8-9659-fc4f2795c6a2-operator-scripts\") pod \"6140504f-30c8-4ec8-9659-fc4f2795c6a2\" (UID: \"6140504f-30c8-4ec8-9659-fc4f2795c6a2\") " Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.014813 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkl9w\" (UniqueName: \"kubernetes.io/projected/6140504f-30c8-4ec8-9659-fc4f2795c6a2-kube-api-access-hkl9w\") pod \"6140504f-30c8-4ec8-9659-fc4f2795c6a2\" (UID: \"6140504f-30c8-4ec8-9659-fc4f2795c6a2\") " Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.015755 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6140504f-30c8-4ec8-9659-fc4f2795c6a2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6140504f-30c8-4ec8-9659-fc4f2795c6a2" (UID: "6140504f-30c8-4ec8-9659-fc4f2795c6a2"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.025684 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"269490ed-f426-4577-b60d-695881e29aac","Type":"ContainerDied","Data":"90a5675073e712ecefae3610b578c1e9effbdc04122f19049db08c76066ec5bb"} Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.025750 4781 scope.go:117] "RemoveContainer" containerID="a974bce8f6f1af663c342580c805ddbaa6097b40912021df63435e438216c135" Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.025909 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.039011 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6140504f-30c8-4ec8-9659-fc4f2795c6a2-kube-api-access-hkl9w" (OuterVolumeSpecName: "kube-api-access-hkl9w") pod "6140504f-30c8-4ec8-9659-fc4f2795c6a2" (UID: "6140504f-30c8-4ec8-9659-fc4f2795c6a2"). InnerVolumeSpecName "kube-api-access-hkl9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.069316 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.102745 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.117811 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkl9w\" (UniqueName: \"kubernetes.io/projected/6140504f-30c8-4ec8-9659-fc4f2795c6a2-kube-api-access-hkl9w\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.117849 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6140504f-30c8-4ec8-9659-fc4f2795c6a2-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.179501 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.230192 4781 scope.go:117] "RemoveContainer" containerID="e0960176bdda8e2ba82d90641ba566c3b6c5c7a6ec61704be70fbd7a1daf1d81" Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.252102 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zh57t"] Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.287622 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 09:46:02 crc kubenswrapper[4781]: E1202 09:46:02.288060 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2718c51-467e-407a-99b1-266956fcacfa" containerName="mariadb-database-create" Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.288076 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2718c51-467e-407a-99b1-266956fcacfa" containerName="mariadb-database-create" Dec 02 09:46:02 crc kubenswrapper[4781]: E1202 09:46:02.288116 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6140504f-30c8-4ec8-9659-fc4f2795c6a2" containerName="mariadb-database-create" Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.288122 4781 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6140504f-30c8-4ec8-9659-fc4f2795c6a2" containerName="mariadb-database-create" Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.288315 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="6140504f-30c8-4ec8-9659-fc4f2795c6a2" containerName="mariadb-database-create" Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.288329 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2718c51-467e-407a-99b1-266956fcacfa" containerName="mariadb-database-create" Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.289344 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.295799 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.295857 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.341337 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.435986 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-f677cfc78-nzcxt"] Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.442133 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34\") " pod="openstack/glance-default-external-api-0" Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.442205 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34-scripts\") pod \"glance-default-external-api-0\" (UID: \"2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34\") " pod="openstack/glance-default-external-api-0" Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.442242 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34\") " pod="openstack/glance-default-external-api-0" Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.442276 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q47qh\" (UniqueName: \"kubernetes.io/projected/2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34-kube-api-access-q47qh\") pod \"glance-default-external-api-0\" (UID: \"2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34\") " pod="openstack/glance-default-external-api-0" Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.442330 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34\") " pod="openstack/glance-default-external-api-0" Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.442392 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34-logs\") pod \"glance-default-external-api-0\" (UID: \"2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34\") " pod="openstack/glance-default-external-api-0" Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.442419 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34\") " pod="openstack/glance-default-external-api-0" Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.442492 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34-config-data\") pod \"glance-default-external-api-0\" (UID: \"2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34\") " pod="openstack/glance-default-external-api-0" Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.549823 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34-logs\") pod \"glance-default-external-api-0\" (UID: \"2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34\") " pod="openstack/glance-default-external-api-0" Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.549874 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34\") " pod="openstack/glance-default-external-api-0" Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.549916 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34-config-data\") pod \"glance-default-external-api-0\" (UID: \"2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34\") " pod="openstack/glance-default-external-api-0" Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.549982 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34\") " pod="openstack/glance-default-external-api-0" Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.550021 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34-scripts\") pod \"glance-default-external-api-0\" (UID: \"2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34\") " pod="openstack/glance-default-external-api-0" Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.550042 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34\") " pod="openstack/glance-default-external-api-0" Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.550065 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q47qh\" (UniqueName: 
\"kubernetes.io/projected/2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34-kube-api-access-q47qh\") pod \"glance-default-external-api-0\" (UID: \"2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34\") " pod="openstack/glance-default-external-api-0" Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.550100 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34\") " pod="openstack/glance-default-external-api-0" Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.550741 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34\") " pod="openstack/glance-default-external-api-0" Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.550739 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34-logs\") pod \"glance-default-external-api-0\" (UID: \"2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34\") " pod="openstack/glance-default-external-api-0" Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.552070 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.559180 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34\") " pod="openstack/glance-default-external-api-0" Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.563605 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34\") " pod="openstack/glance-default-external-api-0" Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.564941 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34-scripts\") pod \"glance-default-external-api-0\" (UID: \"2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34\") " pod="openstack/glance-default-external-api-0" Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.575376 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34-config-data\") pod \"glance-default-external-api-0\" (UID: \"2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34\") " pod="openstack/glance-default-external-api-0" Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.585765 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q47qh\" (UniqueName: \"kubernetes.io/projected/2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34-kube-api-access-q47qh\") pod 
\"glance-default-external-api-0\" (UID: \"2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34\") " pod="openstack/glance-default-external-api-0" Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.640836 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34\") " pod="openstack/glance-default-external-api-0" Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.784026 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-lqtxt" Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.849538 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.902921 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.976975 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-2wf99"] Dec 02 09:46:02 crc kubenswrapper[4781]: I1202 09:46:02.977551 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-2wf99" podUID="74200d1e-84a7-4f23-b858-e24dcb955dbb" containerName="dnsmasq-dns" containerID="cri-o://8d83e5fd74e7de7d8cc031d27e869372a76d4e0f82ffc074e799d9426b66906b" gracePeriod=10 Dec 02 09:46:03 crc kubenswrapper[4781]: I1202 09:46:03.070511 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-f677cfc78-nzcxt" event={"ID":"17137e34-c042-4c3b-b11b-3e743e2a00b5","Type":"ContainerStarted","Data":"3b74e2be975bae20d778d6214c1c19a815fa1b767dce868f284a540fbc98297e"} Dec 02 09:46:03 crc kubenswrapper[4781]: I1202 09:46:03.075711 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-5a93-account-create-update-48m59" event={"ID":"9416e5ad-d0d8-404a-8fe8-8a5648030542","Type":"ContainerDied","Data":"b0b4cc38f09ef881f05c474352969ce611aaec046b43ea68a31d9eef8d73369f"} Dec 02 09:46:03 crc kubenswrapper[4781]: I1202 09:46:03.075750 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0b4cc38f09ef881f05c474352969ce611aaec046b43ea68a31d9eef8d73369f" Dec 02 09:46:03 crc kubenswrapper[4781]: I1202 09:46:03.122495 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zh57t" event={"ID":"5a492eac-186c-4c73-9f70-59f989ea3169","Type":"ContainerStarted","Data":"a71347466b2b6547c76ea2ca5398b465bfabcfea7d5c39a988a00fcbfeb31c46"} Dec 02 09:46:03 crc kubenswrapper[4781]: I1202 09:46:03.248680 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 09:46:03 crc kubenswrapper[4781]: E1202 09:46:03.318440 4781 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2718c51_467e_407a_99b1_266956fcacfa.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6140504f_30c8_4ec8_9659_fc4f2795c6a2.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6140504f_30c8_4ec8_9659_fc4f2795c6a2.slice/crio-eacdec5868e09b055c3fb5495fdbc212386a23e358266720b94e90ba73ce7d23\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a44acc8_6e3c_4643_9b69_34f54b53c188.slice/crio-conmon-3ccee59f51f4b55106b87ba1b23d1a23f762a2b56edb990a2065c1a6fc1bcfdb.scope\": RecentStats: unable to find data in memory cache]" Dec 02 09:46:03 crc kubenswrapper[4781]: I1202 09:46:03.442561 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-5a93-account-create-update-48m59" Dec 02 09:46:03 crc kubenswrapper[4781]: I1202 09:46:03.491189 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-7h6r2" Dec 02 09:46:03 crc kubenswrapper[4781]: I1202 09:46:03.502478 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-8af6-account-create-update-8c6xl" Dec 02 09:46:03 crc kubenswrapper[4781]: I1202 09:46:03.523914 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-1694-account-create-update-9dbhz" Dec 02 09:46:03 crc kubenswrapper[4781]: I1202 09:46:03.562826 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="269490ed-f426-4577-b60d-695881e29aac" path="/var/lib/kubelet/pods/269490ed-f426-4577-b60d-695881e29aac/volumes" Dec 02 09:46:03 crc kubenswrapper[4781]: I1202 09:46:03.600704 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pq959\" (UniqueName: \"kubernetes.io/projected/94a18aa9-0687-4427-942c-d343a3f0c5f4-kube-api-access-pq959\") pod \"94a18aa9-0687-4427-942c-d343a3f0c5f4\" (UID: \"94a18aa9-0687-4427-942c-d343a3f0c5f4\") " Dec 02 09:46:03 crc kubenswrapper[4781]: I1202 09:46:03.600766 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9416e5ad-d0d8-404a-8fe8-8a5648030542-operator-scripts\") pod \"9416e5ad-d0d8-404a-8fe8-8a5648030542\" (UID: \"9416e5ad-d0d8-404a-8fe8-8a5648030542\") " Dec 02 09:46:03 crc kubenswrapper[4781]: I1202 09:46:03.601023 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a89db488-2448-4a65-82ee-08472553ad38-operator-scripts\") pod \"a89db488-2448-4a65-82ee-08472553ad38\" (UID: \"a89db488-2448-4a65-82ee-08472553ad38\") " Dec 02 09:46:03 crc kubenswrapper[4781]: I1202 09:46:03.601066 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2r48\" (UniqueName: \"kubernetes.io/projected/9416e5ad-d0d8-404a-8fe8-8a5648030542-kube-api-access-m2r48\") pod \"9416e5ad-d0d8-404a-8fe8-8a5648030542\" (UID: \"9416e5ad-d0d8-404a-8fe8-8a5648030542\") " Dec 02 09:46:03 crc kubenswrapper[4781]: I1202 09:46:03.601105 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94a18aa9-0687-4427-942c-d343a3f0c5f4-operator-scripts\") pod \"94a18aa9-0687-4427-942c-d343a3f0c5f4\" (UID: \"94a18aa9-0687-4427-942c-d343a3f0c5f4\") " Dec 02 09:46:03 crc kubenswrapper[4781]: I1202 09:46:03.601158 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2dlb\" (UniqueName: 
\"kubernetes.io/projected/a89db488-2448-4a65-82ee-08472553ad38-kube-api-access-q2dlb\") pod \"a89db488-2448-4a65-82ee-08472553ad38\" (UID: \"a89db488-2448-4a65-82ee-08472553ad38\") " Dec 02 09:46:03 crc kubenswrapper[4781]: I1202 09:46:03.609972 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a89db488-2448-4a65-82ee-08472553ad38-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a89db488-2448-4a65-82ee-08472553ad38" (UID: "a89db488-2448-4a65-82ee-08472553ad38"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:46:03 crc kubenswrapper[4781]: I1202 09:46:03.610494 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9416e5ad-d0d8-404a-8fe8-8a5648030542-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9416e5ad-d0d8-404a-8fe8-8a5648030542" (UID: "9416e5ad-d0d8-404a-8fe8-8a5648030542"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:46:03 crc kubenswrapper[4781]: I1202 09:46:03.616079 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94a18aa9-0687-4427-942c-d343a3f0c5f4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "94a18aa9-0687-4427-942c-d343a3f0c5f4" (UID: "94a18aa9-0687-4427-942c-d343a3f0c5f4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:46:03 crc kubenswrapper[4781]: I1202 09:46:03.623735 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94a18aa9-0687-4427-942c-d343a3f0c5f4-kube-api-access-pq959" (OuterVolumeSpecName: "kube-api-access-pq959") pod "94a18aa9-0687-4427-942c-d343a3f0c5f4" (UID: "94a18aa9-0687-4427-942c-d343a3f0c5f4"). InnerVolumeSpecName "kube-api-access-pq959". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:46:03 crc kubenswrapper[4781]: I1202 09:46:03.629819 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9416e5ad-d0d8-404a-8fe8-8a5648030542-kube-api-access-m2r48" (OuterVolumeSpecName: "kube-api-access-m2r48") pod "9416e5ad-d0d8-404a-8fe8-8a5648030542" (UID: "9416e5ad-d0d8-404a-8fe8-8a5648030542"). InnerVolumeSpecName "kube-api-access-m2r48". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:46:03 crc kubenswrapper[4781]: I1202 09:46:03.686269 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a89db488-2448-4a65-82ee-08472553ad38-kube-api-access-q2dlb" (OuterVolumeSpecName: "kube-api-access-q2dlb") pod "a89db488-2448-4a65-82ee-08472553ad38" (UID: "a89db488-2448-4a65-82ee-08472553ad38"). InnerVolumeSpecName "kube-api-access-q2dlb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:46:03 crc kubenswrapper[4781]: I1202 09:46:03.707614 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1bda83d-ff07-4b0c-a978-a22373590d0f-operator-scripts\") pod \"a1bda83d-ff07-4b0c-a978-a22373590d0f\" (UID: \"a1bda83d-ff07-4b0c-a978-a22373590d0f\") " Dec 02 09:46:03 crc kubenswrapper[4781]: I1202 09:46:03.707907 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nprzk\" (UniqueName: \"kubernetes.io/projected/a1bda83d-ff07-4b0c-a978-a22373590d0f-kube-api-access-nprzk\") pod \"a1bda83d-ff07-4b0c-a978-a22373590d0f\" (UID: \"a1bda83d-ff07-4b0c-a978-a22373590d0f\") " Dec 02 09:46:03 crc kubenswrapper[4781]: I1202 09:46:03.708493 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a89db488-2448-4a65-82ee-08472553ad38-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:03 crc kubenswrapper[4781]: I1202 09:46:03.708520 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2r48\" (UniqueName: \"kubernetes.io/projected/9416e5ad-d0d8-404a-8fe8-8a5648030542-kube-api-access-m2r48\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:03 crc kubenswrapper[4781]: I1202 09:46:03.708534 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94a18aa9-0687-4427-942c-d343a3f0c5f4-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:03 crc kubenswrapper[4781]: I1202 09:46:03.708559 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2dlb\" (UniqueName: \"kubernetes.io/projected/a89db488-2448-4a65-82ee-08472553ad38-kube-api-access-q2dlb\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:03 crc kubenswrapper[4781]: I1202 09:46:03.708572 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pq959\" (UniqueName: \"kubernetes.io/projected/94a18aa9-0687-4427-942c-d343a3f0c5f4-kube-api-access-pq959\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:03 crc kubenswrapper[4781]: I1202 09:46:03.708585 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9416e5ad-d0d8-404a-8fe8-8a5648030542-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:03 crc kubenswrapper[4781]: I1202 09:46:03.709473 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1bda83d-ff07-4b0c-a978-a22373590d0f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a1bda83d-ff07-4b0c-a978-a22373590d0f" (UID: "a1bda83d-ff07-4b0c-a978-a22373590d0f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:46:03 crc kubenswrapper[4781]: I1202 09:46:03.744470 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1bda83d-ff07-4b0c-a978-a22373590d0f-kube-api-access-nprzk" (OuterVolumeSpecName: "kube-api-access-nprzk") pod "a1bda83d-ff07-4b0c-a978-a22373590d0f" (UID: "a1bda83d-ff07-4b0c-a978-a22373590d0f"). InnerVolumeSpecName "kube-api-access-nprzk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:46:03 crc kubenswrapper[4781]: I1202 09:46:03.811803 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nprzk\" (UniqueName: \"kubernetes.io/projected/a1bda83d-ff07-4b0c-a978-a22373590d0f-kube-api-access-nprzk\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:03 crc kubenswrapper[4781]: I1202 09:46:03.812841 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1bda83d-ff07-4b0c-a978-a22373590d0f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.157162 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.175408 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-1694-account-create-update-9dbhz" event={"ID":"a1bda83d-ff07-4b0c-a978-a22373590d0f","Type":"ContainerDied","Data":"525c1d009935ae9c2769e367a4a7628dda0d4527e8d139b31dba9e3673c88a59"} Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.175483 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="525c1d009935ae9c2769e367a4a7628dda0d4527e8d139b31dba9e3673c88a59" Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.175643 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-1694-account-create-update-9dbhz" Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.184303 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-f677cfc78-nzcxt" event={"ID":"17137e34-c042-4c3b-b11b-3e743e2a00b5","Type":"ContainerStarted","Data":"648c522c9545c7f3e19fb4b628bcedf184cfa0065aee547946f8815ef02374ce"} Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.230513 4781 generic.go:334] "Generic (PLEG): container finished" podID="74200d1e-84a7-4f23-b858-e24dcb955dbb" containerID="8d83e5fd74e7de7d8cc031d27e869372a76d4e0f82ffc074e799d9426b66906b" exitCode=0 Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.230674 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-2wf99" event={"ID":"74200d1e-84a7-4f23-b858-e24dcb955dbb","Type":"ContainerDied","Data":"8d83e5fd74e7de7d8cc031d27e869372a76d4e0f82ffc074e799d9426b66906b"} Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.231539 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-2wf99" event={"ID":"74200d1e-84a7-4f23-b858-e24dcb955dbb","Type":"ContainerDied","Data":"628e0066d922a147919476fa6a3bdc58a006c2c170b8f46068319d6a19cceba0"} Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.231628 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="628e0066d922a147919476fa6a3bdc58a006c2c170b8f46068319d6a19cceba0" Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.255328 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.258288 4781 generic.go:334] "Generic (PLEG): container finished" podID="8a44acc8-6e3c-4643-9b69-34f54b53c188" containerID="3ccee59f51f4b55106b87ba1b23d1a23f762a2b56edb990a2065c1a6fc1bcfdb" exitCode=0 Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.258377 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8a44acc8-6e3c-4643-9b69-34f54b53c188","Type":"ContainerDied","Data":"3ccee59f51f4b55106b87ba1b23d1a23f762a2b56edb990a2065c1a6fc1bcfdb"} Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.258410 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8a44acc8-6e3c-4643-9b69-34f54b53c188","Type":"ContainerDied","Data":"3baf14ffbdb20ebdb7786666412a2b63c3a0fb9260c5bf2f8181549180613ef8"} Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.258432 4781 scope.go:117] "RemoveContainer" containerID="3ccee59f51f4b55106b87ba1b23d1a23f762a2b56edb990a2065c1a6fc1bcfdb" Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.277933 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-2wf99" Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.278262 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4e2d3890-4772-476e-9850-fdb32111b87a","Type":"ContainerStarted","Data":"cf4ca05e6c06ec83b8881f69b275d2e2c9bd01e0d956c9a69eefec7bdb199e1a"} Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.278685 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.307967 4781 generic.go:334] "Generic (PLEG): container finished" podID="5a492eac-186c-4c73-9f70-59f989ea3169" containerID="3ed58ad9d1a447eac383cdca6a5079aae04ebfc3236873e91bccc56237476d07" exitCode=0 Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.308089 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zh57t" event={"ID":"5a492eac-186c-4c73-9f70-59f989ea3169","Type":"ContainerDied","Data":"3ed58ad9d1a447eac383cdca6a5079aae04ebfc3236873e91bccc56237476d07"} Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.333631 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34","Type":"ContainerStarted","Data":"06f416e161e2a5bb77c9d49c07358141d17cdc83acfb8a8d795a85b6c3b2888c"} Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.356626 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-7h6r2" event={"ID":"a89db488-2448-4a65-82ee-08472553ad38","Type":"ContainerDied","Data":"a8c19460121577cd3ca1b78869fc5d2d4bf21485151a50de54f6354e85812daf"} Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.356666 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8c19460121577cd3ca1b78869fc5d2d4bf21485151a50de54f6354e85812daf" Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.356749 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-7h6r2" Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.367063 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8af6-account-create-update-8c6xl" event={"ID":"94a18aa9-0687-4427-942c-d343a3f0c5f4","Type":"ContainerDied","Data":"c2335889bd92343307f04ceecb6409ed1177593079a416c619db54cec4d2da2e"} Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.367119 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2335889bd92343307f04ceecb6409ed1177593079a416c619db54cec4d2da2e" Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.367215 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-8af6-account-create-update-8c6xl" Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.368182 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="2927c2fe-9023-4be6-a44a-6701f24c499a" containerName="cinder-scheduler" containerID="cri-o://b96f5c817e9dd75d8e2b5ac9940d95cebe28fb7655e48cf280ad05627084a444" gracePeriod=30 Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.368312 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-5a93-account-create-update-48m59" Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.370691 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="2927c2fe-9023-4be6-a44a-6701f24c499a" containerName="probe" containerID="cri-o://271eaafcaef3b1e096ae33b029d7ad759e5eca9f1a9a61fd5beeda21269a5651" gracePeriod=30 Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.420503 4781 scope.go:117] "RemoveContainer" containerID="2626694bb2556bccbe5290d2d9b12d69989c1d612b985be1232da4748cb26470" Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.433425 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74200d1e-84a7-4f23-b858-e24dcb955dbb-ovsdbserver-sb\") pod \"74200d1e-84a7-4f23-b858-e24dcb955dbb\" (UID: \"74200d1e-84a7-4f23-b858-e24dcb955dbb\") " Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.433495 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a44acc8-6e3c-4643-9b69-34f54b53c188-internal-tls-certs\") pod \"8a44acc8-6e3c-4643-9b69-34f54b53c188\" (UID: \"8a44acc8-6e3c-4643-9b69-34f54b53c188\") " Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.433521 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a44acc8-6e3c-4643-9b69-34f54b53c188-httpd-run\") pod \"8a44acc8-6e3c-4643-9b69-34f54b53c188\" (UID: \"8a44acc8-6e3c-4643-9b69-34f54b53c188\") " Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.433553 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74200d1e-84a7-4f23-b858-e24dcb955dbb-dns-svc\") pod \"74200d1e-84a7-4f23-b858-e24dcb955dbb\" (UID: \"74200d1e-84a7-4f23-b858-e24dcb955dbb\") " Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.433571 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a44acc8-6e3c-4643-9b69-34f54b53c188-logs\") pod 
\"8a44acc8-6e3c-4643-9b69-34f54b53c188\" (UID: \"8a44acc8-6e3c-4643-9b69-34f54b53c188\") " Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.433594 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7lv5\" (UniqueName: \"kubernetes.io/projected/8a44acc8-6e3c-4643-9b69-34f54b53c188-kube-api-access-c7lv5\") pod \"8a44acc8-6e3c-4643-9b69-34f54b53c188\" (UID: \"8a44acc8-6e3c-4643-9b69-34f54b53c188\") " Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.433642 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlfcs\" (UniqueName: \"kubernetes.io/projected/74200d1e-84a7-4f23-b858-e24dcb955dbb-kube-api-access-wlfcs\") pod \"74200d1e-84a7-4f23-b858-e24dcb955dbb\" (UID: \"74200d1e-84a7-4f23-b858-e24dcb955dbb\") " Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.433693 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a44acc8-6e3c-4643-9b69-34f54b53c188-combined-ca-bundle\") pod \"8a44acc8-6e3c-4643-9b69-34f54b53c188\" (UID: \"8a44acc8-6e3c-4643-9b69-34f54b53c188\") " Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.433717 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74200d1e-84a7-4f23-b858-e24dcb955dbb-ovsdbserver-nb\") pod \"74200d1e-84a7-4f23-b858-e24dcb955dbb\" (UID: \"74200d1e-84a7-4f23-b858-e24dcb955dbb\") " Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.433733 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"8a44acc8-6e3c-4643-9b69-34f54b53c188\" (UID: \"8a44acc8-6e3c-4643-9b69-34f54b53c188\") " Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.433802 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a44acc8-6e3c-4643-9b69-34f54b53c188-config-data\") pod \"8a44acc8-6e3c-4643-9b69-34f54b53c188\" (UID: \"8a44acc8-6e3c-4643-9b69-34f54b53c188\") " Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.433841 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a44acc8-6e3c-4643-9b69-34f54b53c188-scripts\") pod \"8a44acc8-6e3c-4643-9b69-34f54b53c188\" (UID: \"8a44acc8-6e3c-4643-9b69-34f54b53c188\") " Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.433867 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/74200d1e-84a7-4f23-b858-e24dcb955dbb-dns-swift-storage-0\") pod \"74200d1e-84a7-4f23-b858-e24dcb955dbb\" (UID: \"74200d1e-84a7-4f23-b858-e24dcb955dbb\") " Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.433885 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74200d1e-84a7-4f23-b858-e24dcb955dbb-config\") pod \"74200d1e-84a7-4f23-b858-e24dcb955dbb\" (UID: \"74200d1e-84a7-4f23-b858-e24dcb955dbb\") " Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.435402 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=7.435375569 podStartE2EDuration="7.435375569s" podCreationTimestamp="2025-12-02 09:45:57 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:46:04.356691286 +0000 UTC m=+1527.180565165" watchObservedRunningTime="2025-12-02 09:46:04.435375569 +0000 UTC m=+1527.259249448" Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.442978 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a44acc8-6e3c-4643-9b69-34f54b53c188-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8a44acc8-6e3c-4643-9b69-34f54b53c188" (UID: "8a44acc8-6e3c-4643-9b69-34f54b53c188"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.450492 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74200d1e-84a7-4f23-b858-e24dcb955dbb-kube-api-access-wlfcs" (OuterVolumeSpecName: "kube-api-access-wlfcs") pod "74200d1e-84a7-4f23-b858-e24dcb955dbb" (UID: "74200d1e-84a7-4f23-b858-e24dcb955dbb"). InnerVolumeSpecName "kube-api-access-wlfcs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.450906 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a44acc8-6e3c-4643-9b69-34f54b53c188-logs" (OuterVolumeSpecName: "logs") pod "8a44acc8-6e3c-4643-9b69-34f54b53c188" (UID: "8a44acc8-6e3c-4643-9b69-34f54b53c188"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.467872 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a44acc8-6e3c-4643-9b69-34f54b53c188-scripts" (OuterVolumeSpecName: "scripts") pod "8a44acc8-6e3c-4643-9b69-34f54b53c188" (UID: "8a44acc8-6e3c-4643-9b69-34f54b53c188"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.476191 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a44acc8-6e3c-4643-9b69-34f54b53c188-kube-api-access-c7lv5" (OuterVolumeSpecName: "kube-api-access-c7lv5") pod "8a44acc8-6e3c-4643-9b69-34f54b53c188" (UID: "8a44acc8-6e3c-4643-9b69-34f54b53c188"). InnerVolumeSpecName "kube-api-access-c7lv5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.501088 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "8a44acc8-6e3c-4643-9b69-34f54b53c188" (UID: "8a44acc8-6e3c-4643-9b69-34f54b53c188"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.511665 4781 scope.go:117] "RemoveContainer" containerID="3ccee59f51f4b55106b87ba1b23d1a23f762a2b56edb990a2065c1a6fc1bcfdb" Dec 02 09:46:04 crc kubenswrapper[4781]: E1202 09:46:04.517259 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ccee59f51f4b55106b87ba1b23d1a23f762a2b56edb990a2065c1a6fc1bcfdb\": container with ID starting with 3ccee59f51f4b55106b87ba1b23d1a23f762a2b56edb990a2065c1a6fc1bcfdb not found: ID does not exist" containerID="3ccee59f51f4b55106b87ba1b23d1a23f762a2b56edb990a2065c1a6fc1bcfdb" Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.517316 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ccee59f51f4b55106b87ba1b23d1a23f762a2b56edb990a2065c1a6fc1bcfdb"} err="failed to get container status \"3ccee59f51f4b55106b87ba1b23d1a23f762a2b56edb990a2065c1a6fc1bcfdb\": rpc error: code = NotFound desc = could not find container \"3ccee59f51f4b55106b87ba1b23d1a23f762a2b56edb990a2065c1a6fc1bcfdb\": container with ID starting with 3ccee59f51f4b55106b87ba1b23d1a23f762a2b56edb990a2065c1a6fc1bcfdb not found: ID does not exist" Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.517347 4781 scope.go:117] "RemoveContainer" containerID="2626694bb2556bccbe5290d2d9b12d69989c1d612b985be1232da4748cb26470" Dec 02 09:46:04 crc kubenswrapper[4781]: E1202 09:46:04.524609 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2626694bb2556bccbe5290d2d9b12d69989c1d612b985be1232da4748cb26470\": container with ID starting with 2626694bb2556bccbe5290d2d9b12d69989c1d612b985be1232da4748cb26470 not found: ID does not exist" containerID="2626694bb2556bccbe5290d2d9b12d69989c1d612b985be1232da4748cb26470" Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.524667 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2626694bb2556bccbe5290d2d9b12d69989c1d612b985be1232da4748cb26470"} err="failed to get container status \"2626694bb2556bccbe5290d2d9b12d69989c1d612b985be1232da4748cb26470\": rpc error: code = NotFound desc = could not find container \"2626694bb2556bccbe5290d2d9b12d69989c1d612b985be1232da4748cb26470\": container with ID starting with 2626694bb2556bccbe5290d2d9b12d69989c1d612b985be1232da4748cb26470 not found: ID does not exist" Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.536747 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a44acc8-6e3c-4643-9b69-34f54b53c188-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.536785 4781 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a44acc8-6e3c-4643-9b69-34f54b53c188-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.536796 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a44acc8-6e3c-4643-9b69-34f54b53c188-logs\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.536807 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7lv5\" (UniqueName: \"kubernetes.io/projected/8a44acc8-6e3c-4643-9b69-34f54b53c188-kube-api-access-c7lv5\") on node \"crc\" 
DevicePath \"\"" Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.536820 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlfcs\" (UniqueName: \"kubernetes.io/projected/74200d1e-84a7-4f23-b858-e24dcb955dbb-kube-api-access-wlfcs\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.536846 4781 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.546523 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a44acc8-6e3c-4643-9b69-34f54b53c188-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a44acc8-6e3c-4643-9b69-34f54b53c188" (UID: "8a44acc8-6e3c-4643-9b69-34f54b53c188"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.632030 4781 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.646800 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a44acc8-6e3c-4643-9b69-34f54b53c188-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.646839 4781 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.651113 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74200d1e-84a7-4f23-b858-e24dcb955dbb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "74200d1e-84a7-4f23-b858-e24dcb955dbb" (UID: "74200d1e-84a7-4f23-b858-e24dcb955dbb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.660146 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a44acc8-6e3c-4643-9b69-34f54b53c188-config-data" (OuterVolumeSpecName: "config-data") pod "8a44acc8-6e3c-4643-9b69-34f54b53c188" (UID: "8a44acc8-6e3c-4643-9b69-34f54b53c188"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.681291 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74200d1e-84a7-4f23-b858-e24dcb955dbb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "74200d1e-84a7-4f23-b858-e24dcb955dbb" (UID: "74200d1e-84a7-4f23-b858-e24dcb955dbb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.697270 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a44acc8-6e3c-4643-9b69-34f54b53c188-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8a44acc8-6e3c-4643-9b69-34f54b53c188" (UID: "8a44acc8-6e3c-4643-9b69-34f54b53c188"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.738494 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74200d1e-84a7-4f23-b858-e24dcb955dbb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "74200d1e-84a7-4f23-b858-e24dcb955dbb" (UID: "74200d1e-84a7-4f23-b858-e24dcb955dbb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.753547 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74200d1e-84a7-4f23-b858-e24dcb955dbb-config" (OuterVolumeSpecName: "config") pod "74200d1e-84a7-4f23-b858-e24dcb955dbb" (UID: "74200d1e-84a7-4f23-b858-e24dcb955dbb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.754845 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a44acc8-6e3c-4643-9b69-34f54b53c188-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.754871 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74200d1e-84a7-4f23-b858-e24dcb955dbb-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.754882 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74200d1e-84a7-4f23-b858-e24dcb955dbb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.754891 4781 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a44acc8-6e3c-4643-9b69-34f54b53c188-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.754935 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74200d1e-84a7-4f23-b858-e24dcb955dbb-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.754944 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74200d1e-84a7-4f23-b858-e24dcb955dbb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.766435 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74200d1e-84a7-4f23-b858-e24dcb955dbb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "74200d1e-84a7-4f23-b858-e24dcb955dbb" (UID: "74200d1e-84a7-4f23-b858-e24dcb955dbb"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:46:04 crc kubenswrapper[4781]: I1202 09:46:04.856479 4781 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/74200d1e-84a7-4f23-b858-e24dcb955dbb-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.144133 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-56b57b86b8-kf8n9" podUID="22badd1d-b000-4870-86cd-1186fe2ed67d" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.382057 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-f677cfc78-nzcxt" event={"ID":"17137e34-c042-4c3b-b11b-3e743e2a00b5","Type":"ContainerStarted","Data":"8f38ea6c108074c79faa757b492239f9aa92b03ba5a62e91310f40e12505e3da"} Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.383300 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-f677cfc78-nzcxt" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.383331 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-f677cfc78-nzcxt" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.387731 4781 generic.go:334] "Generic (PLEG): container finished" podID="2927c2fe-9023-4be6-a44a-6701f24c499a" containerID="271eaafcaef3b1e096ae33b029d7ad759e5eca9f1a9a61fd5beeda21269a5651" exitCode=0 Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.388025 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2927c2fe-9023-4be6-a44a-6701f24c499a","Type":"ContainerDied","Data":"271eaafcaef3b1e096ae33b029d7ad759e5eca9f1a9a61fd5beeda21269a5651"} Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.389516 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34","Type":"ContainerStarted","Data":"d0ec45434cae6420f82073631e81f9c5fd20fef4e1f6996b0ac7c3fd8afd7f51"} Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.391529 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.393108 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-2wf99" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.410523 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-f677cfc78-nzcxt" podStartSLOduration=4.410505336 podStartE2EDuration="4.410505336s" podCreationTimestamp="2025-12-02 09:46:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:46:05.410065825 +0000 UTC m=+1528.233939704" watchObservedRunningTime="2025-12-02 09:46:05.410505336 +0000 UTC m=+1528.234379215" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.427478 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-56b57b86b8-kf8n9" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.570480 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-2wf99"] Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.584998 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-2wf99"] Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.593565 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.603231 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.615160 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 09:46:05 crc kubenswrapper[4781]: E1202 09:46:05.615573 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a44acc8-6e3c-4643-9b69-34f54b53c188" containerName="glance-log" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.615588 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a44acc8-6e3c-4643-9b69-34f54b53c188" containerName="glance-log" Dec 02 09:46:05 crc kubenswrapper[4781]: E1202 09:46:05.615604 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74200d1e-84a7-4f23-b858-e24dcb955dbb" containerName="init" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.615614 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="74200d1e-84a7-4f23-b858-e24dcb955dbb" containerName="init" Dec 02 09:46:05 crc kubenswrapper[4781]: E1202 09:46:05.615629 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1bda83d-ff07-4b0c-a978-a22373590d0f" containerName="mariadb-account-create-update" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.615640 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1bda83d-ff07-4b0c-a978-a22373590d0f" containerName="mariadb-account-create-update" Dec 02 09:46:05 crc kubenswrapper[4781]: E1202 09:46:05.615656 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a44acc8-6e3c-4643-9b69-34f54b53c188" containerName="glance-httpd" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.615662 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a44acc8-6e3c-4643-9b69-34f54b53c188" containerName="glance-httpd" Dec 02 09:46:05 crc kubenswrapper[4781]: E1202 09:46:05.615677 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74200d1e-84a7-4f23-b858-e24dcb955dbb" containerName="dnsmasq-dns" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.615682 4781 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="74200d1e-84a7-4f23-b858-e24dcb955dbb" containerName="dnsmasq-dns" Dec 02 09:46:05 crc kubenswrapper[4781]: E1202 09:46:05.615695 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94a18aa9-0687-4427-942c-d343a3f0c5f4" containerName="mariadb-account-create-update" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.615701 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="94a18aa9-0687-4427-942c-d343a3f0c5f4" containerName="mariadb-account-create-update" Dec 02 09:46:05 crc kubenswrapper[4781]: E1202 09:46:05.615710 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9416e5ad-d0d8-404a-8fe8-8a5648030542" containerName="mariadb-account-create-update" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.615716 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="9416e5ad-d0d8-404a-8fe8-8a5648030542" containerName="mariadb-account-create-update" Dec 02 09:46:05 crc kubenswrapper[4781]: E1202 09:46:05.615724 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a89db488-2448-4a65-82ee-08472553ad38" containerName="mariadb-database-create" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.615731 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a89db488-2448-4a65-82ee-08472553ad38" containerName="mariadb-database-create" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.615986 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a44acc8-6e3c-4643-9b69-34f54b53c188" containerName="glance-log" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.616012 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="74200d1e-84a7-4f23-b858-e24dcb955dbb" containerName="dnsmasq-dns" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.616023 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="9416e5ad-d0d8-404a-8fe8-8a5648030542" containerName="mariadb-account-create-update" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.616031 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="94a18aa9-0687-4427-942c-d343a3f0c5f4" containerName="mariadb-account-create-update" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.616049 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1bda83d-ff07-4b0c-a978-a22373590d0f" containerName="mariadb-account-create-update" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.616063 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a44acc8-6e3c-4643-9b69-34f54b53c188" containerName="glance-httpd" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.616078 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="a89db488-2448-4a65-82ee-08472553ad38" containerName="mariadb-database-create" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.617208 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.619179 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.619406 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.633095 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.674547 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e057772-e9bc-4fce-90c9-be91978362fe-logs\") pod \"glance-default-internal-api-0\" (UID: \"5e057772-e9bc-4fce-90c9-be91978362fe\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.674592 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5e057772-e9bc-4fce-90c9-be91978362fe-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5e057772-e9bc-4fce-90c9-be91978362fe\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.674615 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e057772-e9bc-4fce-90c9-be91978362fe-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5e057772-e9bc-4fce-90c9-be91978362fe\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.674681 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e057772-e9bc-4fce-90c9-be91978362fe-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5e057772-e9bc-4fce-90c9-be91978362fe\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.674742 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"5e057772-e9bc-4fce-90c9-be91978362fe\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.675156 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e057772-e9bc-4fce-90c9-be91978362fe-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5e057772-e9bc-4fce-90c9-be91978362fe\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.675240 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e057772-e9bc-4fce-90c9-be91978362fe-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5e057772-e9bc-4fce-90c9-be91978362fe\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.675267 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-7qd7v\" (UniqueName: \"kubernetes.io/projected/5e057772-e9bc-4fce-90c9-be91978362fe-kube-api-access-7qd7v\") pod \"glance-default-internal-api-0\" (UID: \"5e057772-e9bc-4fce-90c9-be91978362fe\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.777017 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e057772-e9bc-4fce-90c9-be91978362fe-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5e057772-e9bc-4fce-90c9-be91978362fe\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.777107 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"5e057772-e9bc-4fce-90c9-be91978362fe\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.777142 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e057772-e9bc-4fce-90c9-be91978362fe-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5e057772-e9bc-4fce-90c9-be91978362fe\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.777193 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e057772-e9bc-4fce-90c9-be91978362fe-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5e057772-e9bc-4fce-90c9-be91978362fe\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.777214 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qd7v\" (UniqueName: \"kubernetes.io/projected/5e057772-e9bc-4fce-90c9-be91978362fe-kube-api-access-7qd7v\") pod \"glance-default-internal-api-0\" (UID: \"5e057772-e9bc-4fce-90c9-be91978362fe\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.777234 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e057772-e9bc-4fce-90c9-be91978362fe-logs\") pod \"glance-default-internal-api-0\" (UID: \"5e057772-e9bc-4fce-90c9-be91978362fe\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.777261 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5e057772-e9bc-4fce-90c9-be91978362fe-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5e057772-e9bc-4fce-90c9-be91978362fe\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.777277 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e057772-e9bc-4fce-90c9-be91978362fe-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5e057772-e9bc-4fce-90c9-be91978362fe\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.778451 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"5e057772-e9bc-4fce-90c9-be91978362fe\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.784019 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5e057772-e9bc-4fce-90c9-be91978362fe-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5e057772-e9bc-4fce-90c9-be91978362fe\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.786328 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e057772-e9bc-4fce-90c9-be91978362fe-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5e057772-e9bc-4fce-90c9-be91978362fe\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.786647 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e057772-e9bc-4fce-90c9-be91978362fe-logs\") pod \"glance-default-internal-api-0\" (UID: \"5e057772-e9bc-4fce-90c9-be91978362fe\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.787393 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e057772-e9bc-4fce-90c9-be91978362fe-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5e057772-e9bc-4fce-90c9-be91978362fe\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.789853 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e057772-e9bc-4fce-90c9-be91978362fe-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5e057772-e9bc-4fce-90c9-be91978362fe\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.803634 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e057772-e9bc-4fce-90c9-be91978362fe-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5e057772-e9bc-4fce-90c9-be91978362fe\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.823595 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qd7v\" (UniqueName: \"kubernetes.io/projected/5e057772-e9bc-4fce-90c9-be91978362fe-kube-api-access-7qd7v\") pod \"glance-default-internal-api-0\" (UID: \"5e057772-e9bc-4fce-90c9-be91978362fe\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.874259 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"5e057772-e9bc-4fce-90c9-be91978362fe\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.965745 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-56b57b86b8-kf8n9" Dec 02 09:46:05 crc kubenswrapper[4781]: I1202 09:46:05.972013 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 09:46:06 crc kubenswrapper[4781]: I1202 09:46:06.420083 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zh57t" event={"ID":"5a492eac-186c-4c73-9f70-59f989ea3169","Type":"ContainerStarted","Data":"e5a83d8263639d2c65728dc121187c09b8e961fbd4c07829ea9aa572c9f49a62"} Dec 02 09:46:06 crc kubenswrapper[4781]: I1202 09:46:06.738985 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 09:46:06 crc kubenswrapper[4781]: W1202 09:46:06.749243 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e057772_e9bc_4fce_90c9_be91978362fe.slice/crio-0f5923f30c23e3c485a7cec3cf04342b371188b903d4fad2cf6a4e9bf276e0e8 WatchSource:0}: Error finding container 0f5923f30c23e3c485a7cec3cf04342b371188b903d4fad2cf6a4e9bf276e0e8: Status 404 returned error can't find the container with id 0f5923f30c23e3c485a7cec3cf04342b371188b903d4fad2cf6a4e9bf276e0e8 Dec 02 09:46:07 crc kubenswrapper[4781]: I1202 09:46:07.429447 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5e057772-e9bc-4fce-90c9-be91978362fe","Type":"ContainerStarted","Data":"0f5923f30c23e3c485a7cec3cf04342b371188b903d4fad2cf6a4e9bf276e0e8"} Dec 02 09:46:07 crc kubenswrapper[4781]: I1202 09:46:07.728514 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74200d1e-84a7-4f23-b858-e24dcb955dbb" path="/var/lib/kubelet/pods/74200d1e-84a7-4f23-b858-e24dcb955dbb/volumes" Dec 02 09:46:07 crc kubenswrapper[4781]: I1202 09:46:07.730375 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a44acc8-6e3c-4643-9b69-34f54b53c188" path="/var/lib/kubelet/pods/8a44acc8-6e3c-4643-9b69-34f54b53c188/volumes" Dec 02 09:46:08 crc kubenswrapper[4781]: I1202 09:46:08.331373 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-972b4"] Dec 02 09:46:08 crc kubenswrapper[4781]: I1202 09:46:08.333037 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-972b4" Dec 02 09:46:08 crc kubenswrapper[4781]: I1202 09:46:08.339362 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-n5wrr" Dec 02 09:46:08 crc kubenswrapper[4781]: I1202 09:46:08.339691 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 02 09:46:08 crc kubenswrapper[4781]: I1202 09:46:08.340019 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 02 09:46:08 crc kubenswrapper[4781]: I1202 09:46:08.374154 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-972b4"] Dec 02 09:46:08 crc kubenswrapper[4781]: I1202 09:46:08.444836 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41de14f6-deed-478b-9d75-ad94ab88ee05-config-data\") pod \"nova-cell0-conductor-db-sync-972b4\" (UID: \"41de14f6-deed-478b-9d75-ad94ab88ee05\") " pod="openstack/nova-cell0-conductor-db-sync-972b4" Dec 02 09:46:08 crc kubenswrapper[4781]: I1202 09:46:08.444898 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41de14f6-deed-478b-9d75-ad94ab88ee05-scripts\") pod \"nova-cell0-conductor-db-sync-972b4\" (UID: \"41de14f6-deed-478b-9d75-ad94ab88ee05\") " pod="openstack/nova-cell0-conductor-db-sync-972b4" Dec 02 09:46:08 crc kubenswrapper[4781]: I1202 09:46:08.444962 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41de14f6-deed-478b-9d75-ad94ab88ee05-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-972b4\" (UID: \"41de14f6-deed-478b-9d75-ad94ab88ee05\") " pod="openstack/nova-cell0-conductor-db-sync-972b4" Dec 02 09:46:08 crc kubenswrapper[4781]: I1202 09:46:08.444998 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76sqz\" (UniqueName: \"kubernetes.io/projected/41de14f6-deed-478b-9d75-ad94ab88ee05-kube-api-access-76sqz\") pod \"nova-cell0-conductor-db-sync-972b4\" (UID: \"41de14f6-deed-478b-9d75-ad94ab88ee05\") " pod="openstack/nova-cell0-conductor-db-sync-972b4" Dec 02 09:46:08 crc kubenswrapper[4781]: I1202 09:46:08.447682 4781 generic.go:334] "Generic (PLEG): container finished" podID="5a492eac-186c-4c73-9f70-59f989ea3169" containerID="e5a83d8263639d2c65728dc121187c09b8e961fbd4c07829ea9aa572c9f49a62" exitCode=0 Dec 02 09:46:08 crc kubenswrapper[4781]: I1202 09:46:08.447738 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zh57t" event={"ID":"5a492eac-186c-4c73-9f70-59f989ea3169","Type":"ContainerDied","Data":"e5a83d8263639d2c65728dc121187c09b8e961fbd4c07829ea9aa572c9f49a62"} Dec 02 09:46:08 crc kubenswrapper[4781]: I1202 09:46:08.455283 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34","Type":"ContainerStarted","Data":"d3c7b00f146be6406a39dfdfdfdb67c9c1d8ceac482b01cf567906dea2524e81"} Dec 02 09:46:08 crc kubenswrapper[4781]: I1202 09:46:08.462613 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"5e057772-e9bc-4fce-90c9-be91978362fe","Type":"ContainerStarted","Data":"c9d4c93584df956d623cd6047d5bdf1c1f4cb90fa689d18289a15f01fca76782"} Dec 02 09:46:08 crc kubenswrapper[4781]: I1202 09:46:08.505656 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.505635262 podStartE2EDuration="6.505635262s" podCreationTimestamp="2025-12-02 09:46:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:46:08.491554957 +0000 UTC m=+1531.315428836" watchObservedRunningTime="2025-12-02 09:46:08.505635262 +0000 UTC m=+1531.329509151" Dec 02 09:46:08 crc kubenswrapper[4781]: I1202 09:46:08.546684 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41de14f6-deed-478b-9d75-ad94ab88ee05-config-data\") pod \"nova-cell0-conductor-db-sync-972b4\" (UID: \"41de14f6-deed-478b-9d75-ad94ab88ee05\") " pod="openstack/nova-cell0-conductor-db-sync-972b4" Dec 02 09:46:08 crc kubenswrapper[4781]: I1202 09:46:08.547821 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41de14f6-deed-478b-9d75-ad94ab88ee05-scripts\") pod \"nova-cell0-conductor-db-sync-972b4\" (UID: \"41de14f6-deed-478b-9d75-ad94ab88ee05\") " pod="openstack/nova-cell0-conductor-db-sync-972b4" Dec 02 09:46:08 crc kubenswrapper[4781]: I1202 09:46:08.548344 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41de14f6-deed-478b-9d75-ad94ab88ee05-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-972b4\" (UID: \"41de14f6-deed-478b-9d75-ad94ab88ee05\") " pod="openstack/nova-cell0-conductor-db-sync-972b4" Dec 02 09:46:08 crc kubenswrapper[4781]: I1202 09:46:08.548401 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76sqz\" (UniqueName: \"kubernetes.io/projected/41de14f6-deed-478b-9d75-ad94ab88ee05-kube-api-access-76sqz\") pod \"nova-cell0-conductor-db-sync-972b4\" (UID: \"41de14f6-deed-478b-9d75-ad94ab88ee05\") " pod="openstack/nova-cell0-conductor-db-sync-972b4" Dec 02 09:46:08 crc kubenswrapper[4781]: I1202 09:46:08.553195 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41de14f6-deed-478b-9d75-ad94ab88ee05-scripts\") pod \"nova-cell0-conductor-db-sync-972b4\" (UID: \"41de14f6-deed-478b-9d75-ad94ab88ee05\") " pod="openstack/nova-cell0-conductor-db-sync-972b4" Dec 02 09:46:08 crc kubenswrapper[4781]: I1202 09:46:08.554736 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41de14f6-deed-478b-9d75-ad94ab88ee05-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-972b4\" (UID: \"41de14f6-deed-478b-9d75-ad94ab88ee05\") " pod="openstack/nova-cell0-conductor-db-sync-972b4" Dec 02 09:46:08 crc kubenswrapper[4781]: I1202 09:46:08.567360 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76sqz\" (UniqueName: \"kubernetes.io/projected/41de14f6-deed-478b-9d75-ad94ab88ee05-kube-api-access-76sqz\") pod \"nova-cell0-conductor-db-sync-972b4\" (UID: \"41de14f6-deed-478b-9d75-ad94ab88ee05\") " pod="openstack/nova-cell0-conductor-db-sync-972b4" Dec 02 09:46:08 crc kubenswrapper[4781]: I1202 
09:46:08.575532 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41de14f6-deed-478b-9d75-ad94ab88ee05-config-data\") pod \"nova-cell0-conductor-db-sync-972b4\" (UID: \"41de14f6-deed-478b-9d75-ad94ab88ee05\") " pod="openstack/nova-cell0-conductor-db-sync-972b4" Dec 02 09:46:08 crc kubenswrapper[4781]: I1202 09:46:08.662709 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-972b4" Dec 02 09:46:09 crc kubenswrapper[4781]: W1202 09:46:09.219189 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41de14f6_deed_478b_9d75_ad94ab88ee05.slice/crio-d9c49c5f80b52e6242da4338ac28cd670a85d5ddbf4b66ec818f0dd84eb46fd9 WatchSource:0}: Error finding container d9c49c5f80b52e6242da4338ac28cd670a85d5ddbf4b66ec818f0dd84eb46fd9: Status 404 returned error can't find the container with id d9c49c5f80b52e6242da4338ac28cd670a85d5ddbf4b66ec818f0dd84eb46fd9 Dec 02 09:46:09 crc kubenswrapper[4781]: I1202 09:46:09.272548 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-972b4"] Dec 02 09:46:09 crc kubenswrapper[4781]: I1202 09:46:09.473045 4781 generic.go:334] "Generic (PLEG): container finished" podID="2927c2fe-9023-4be6-a44a-6701f24c499a" containerID="b96f5c817e9dd75d8e2b5ac9940d95cebe28fb7655e48cf280ad05627084a444" exitCode=0 Dec 02 09:46:09 crc kubenswrapper[4781]: I1202 09:46:09.473086 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2927c2fe-9023-4be6-a44a-6701f24c499a","Type":"ContainerDied","Data":"b96f5c817e9dd75d8e2b5ac9940d95cebe28fb7655e48cf280ad05627084a444"} Dec 02 09:46:09 crc kubenswrapper[4781]: I1202 09:46:09.474362 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-972b4" event={"ID":"41de14f6-deed-478b-9d75-ad94ab88ee05","Type":"ContainerStarted","Data":"d9c49c5f80b52e6242da4338ac28cd670a85d5ddbf4b66ec818f0dd84eb46fd9"} Dec 02 09:46:10 crc kubenswrapper[4781]: I1202 09:46:10.479666 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 09:46:10 crc kubenswrapper[4781]: I1202 09:46:10.487720 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zh57t" event={"ID":"5a492eac-186c-4c73-9f70-59f989ea3169","Type":"ContainerStarted","Data":"f0f82bf597683b60ccb48e90f5d5ce15b005ad488a72ad27ec9dd0545da1c60c"} Dec 02 09:46:10 crc kubenswrapper[4781]: I1202 09:46:10.490879 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5e057772-e9bc-4fce-90c9-be91978362fe","Type":"ContainerStarted","Data":"cdfeb9803421d42153f82558053ee0699242a35f2fcc8cd96ed1d310ff8edd96"} Dec 02 09:46:10 crc kubenswrapper[4781]: I1202 09:46:10.493088 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2927c2fe-9023-4be6-a44a-6701f24c499a","Type":"ContainerDied","Data":"6c44baf29ed84225970b39a0e80b27f597e83f878d44440311ddad541b405c0d"} Dec 02 09:46:10 crc kubenswrapper[4781]: I1202 09:46:10.493126 4781 scope.go:117] "RemoveContainer" containerID="271eaafcaef3b1e096ae33b029d7ad759e5eca9f1a9a61fd5beeda21269a5651" Dec 02 09:46:10 crc kubenswrapper[4781]: I1202 09:46:10.493241 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 09:46:10 crc kubenswrapper[4781]: I1202 09:46:10.532547 4781 scope.go:117] "RemoveContainer" containerID="b96f5c817e9dd75d8e2b5ac9940d95cebe28fb7655e48cf280ad05627084a444" Dec 02 09:46:10 crc kubenswrapper[4781]: I1202 09:46:10.534446 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.534434125 podStartE2EDuration="5.534434125s" podCreationTimestamp="2025-12-02 09:46:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:46:10.533040148 +0000 UTC m=+1533.356914037" watchObservedRunningTime="2025-12-02 09:46:10.534434125 +0000 UTC m=+1533.358308014" Dec 02 09:46:10 crc kubenswrapper[4781]: I1202 09:46:10.572897 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zh57t" podStartSLOduration=4.819305972 podStartE2EDuration="10.572881238s" podCreationTimestamp="2025-12-02 09:46:00 +0000 UTC" firstStartedPulling="2025-12-02 09:46:04.327665305 +0000 UTC m=+1527.151539184" lastFinishedPulling="2025-12-02 09:46:10.081240571 +0000 UTC m=+1532.905114450" observedRunningTime="2025-12-02 09:46:10.572617951 +0000 UTC m=+1533.396491830" watchObservedRunningTime="2025-12-02 09:46:10.572881238 +0000 UTC m=+1533.396755107" Dec 02 09:46:10 crc kubenswrapper[4781]: I1202 09:46:10.612558 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2927c2fe-9023-4be6-a44a-6701f24c499a-combined-ca-bundle\") pod \"2927c2fe-9023-4be6-a44a-6701f24c499a\" (UID: \"2927c2fe-9023-4be6-a44a-6701f24c499a\") " Dec 02 09:46:10 crc kubenswrapper[4781]: I1202 09:46:10.612831 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2927c2fe-9023-4be6-a44a-6701f24c499a-etc-machine-id\") pod \"2927c2fe-9023-4be6-a44a-6701f24c499a\" (UID: \"2927c2fe-9023-4be6-a44a-6701f24c499a\") " Dec 02 09:46:10 crc kubenswrapper[4781]: I1202 09:46:10.612865 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2927c2fe-9023-4be6-a44a-6701f24c499a-config-data-custom\") pod \"2927c2fe-9023-4be6-a44a-6701f24c499a\" (UID: \"2927c2fe-9023-4be6-a44a-6701f24c499a\") " Dec 02 09:46:10 crc kubenswrapper[4781]: I1202 09:46:10.612914 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrsc7\" (UniqueName: \"kubernetes.io/projected/2927c2fe-9023-4be6-a44a-6701f24c499a-kube-api-access-zrsc7\") pod \"2927c2fe-9023-4be6-a44a-6701f24c499a\" (UID: \"2927c2fe-9023-4be6-a44a-6701f24c499a\") " Dec 02 09:46:10 crc kubenswrapper[4781]: I1202 09:46:10.613046 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2927c2fe-9023-4be6-a44a-6701f24c499a-config-data\") pod \"2927c2fe-9023-4be6-a44a-6701f24c499a\" (UID: \"2927c2fe-9023-4be6-a44a-6701f24c499a\") " Dec 02 09:46:10 crc kubenswrapper[4781]: I1202 09:46:10.613083 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2927c2fe-9023-4be6-a44a-6701f24c499a-scripts\") pod \"2927c2fe-9023-4be6-a44a-6701f24c499a\" (UID: 
\"2927c2fe-9023-4be6-a44a-6701f24c499a\") " Dec 02 09:46:10 crc kubenswrapper[4781]: I1202 09:46:10.615528 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2927c2fe-9023-4be6-a44a-6701f24c499a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2927c2fe-9023-4be6-a44a-6701f24c499a" (UID: "2927c2fe-9023-4be6-a44a-6701f24c499a"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 09:46:10 crc kubenswrapper[4781]: I1202 09:46:10.621621 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2927c2fe-9023-4be6-a44a-6701f24c499a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2927c2fe-9023-4be6-a44a-6701f24c499a" (UID: "2927c2fe-9023-4be6-a44a-6701f24c499a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:46:10 crc kubenswrapper[4781]: I1202 09:46:10.623025 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2927c2fe-9023-4be6-a44a-6701f24c499a-scripts" (OuterVolumeSpecName: "scripts") pod "2927c2fe-9023-4be6-a44a-6701f24c499a" (UID: "2927c2fe-9023-4be6-a44a-6701f24c499a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:46:10 crc kubenswrapper[4781]: I1202 09:46:10.624560 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2927c2fe-9023-4be6-a44a-6701f24c499a-kube-api-access-zrsc7" (OuterVolumeSpecName: "kube-api-access-zrsc7") pod "2927c2fe-9023-4be6-a44a-6701f24c499a" (UID: "2927c2fe-9023-4be6-a44a-6701f24c499a"). InnerVolumeSpecName "kube-api-access-zrsc7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:46:10 crc kubenswrapper[4781]: I1202 09:46:10.715083 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2927c2fe-9023-4be6-a44a-6701f24c499a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2927c2fe-9023-4be6-a44a-6701f24c499a" (UID: "2927c2fe-9023-4be6-a44a-6701f24c499a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:46:10 crc kubenswrapper[4781]: I1202 09:46:10.715564 4781 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2927c2fe-9023-4be6-a44a-6701f24c499a-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:10 crc kubenswrapper[4781]: I1202 09:46:10.715601 4781 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2927c2fe-9023-4be6-a44a-6701f24c499a-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:10 crc kubenswrapper[4781]: I1202 09:46:10.715614 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrsc7\" (UniqueName: \"kubernetes.io/projected/2927c2fe-9023-4be6-a44a-6701f24c499a-kube-api-access-zrsc7\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:10 crc kubenswrapper[4781]: I1202 09:46:10.715631 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2927c2fe-9023-4be6-a44a-6701f24c499a-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:10 crc kubenswrapper[4781]: I1202 09:46:10.715642 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2927c2fe-9023-4be6-a44a-6701f24c499a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:10 crc kubenswrapper[4781]: I1202 09:46:10.799053 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2927c2fe-9023-4be6-a44a-6701f24c499a-config-data" (OuterVolumeSpecName: "config-data") pod "2927c2fe-9023-4be6-a44a-6701f24c499a" (UID: "2927c2fe-9023-4be6-a44a-6701f24c499a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:46:10 crc kubenswrapper[4781]: I1202 09:46:10.817397 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2927c2fe-9023-4be6-a44a-6701f24c499a-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:11 crc kubenswrapper[4781]: I1202 09:46:11.143188 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 09:46:11 crc kubenswrapper[4781]: I1202 09:46:11.163352 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 09:46:11 crc kubenswrapper[4781]: I1202 09:46:11.169784 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 09:46:11 crc kubenswrapper[4781]: E1202 09:46:11.170222 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2927c2fe-9023-4be6-a44a-6701f24c499a" containerName="cinder-scheduler" Dec 02 09:46:11 crc kubenswrapper[4781]: I1202 09:46:11.170241 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2927c2fe-9023-4be6-a44a-6701f24c499a" containerName="cinder-scheduler" Dec 02 09:46:11 crc kubenswrapper[4781]: E1202 09:46:11.170250 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2927c2fe-9023-4be6-a44a-6701f24c499a" containerName="probe" Dec 02 09:46:11 crc kubenswrapper[4781]: I1202 09:46:11.170256 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2927c2fe-9023-4be6-a44a-6701f24c499a" containerName="probe" Dec 02 09:46:11 crc kubenswrapper[4781]: I1202 09:46:11.170534 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="2927c2fe-9023-4be6-a44a-6701f24c499a" containerName="probe" Dec 02 09:46:11 crc kubenswrapper[4781]: 
I1202 09:46:11.170552 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="2927c2fe-9023-4be6-a44a-6701f24c499a" containerName="cinder-scheduler" Dec 02 09:46:11 crc kubenswrapper[4781]: I1202 09:46:11.171647 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 09:46:11 crc kubenswrapper[4781]: I1202 09:46:11.177004 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 02 09:46:11 crc kubenswrapper[4781]: I1202 09:46:11.192453 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 09:46:11 crc kubenswrapper[4781]: I1202 09:46:11.226314 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8457fc90-04ad-45ed-b898-ddf4d7b645b4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8457fc90-04ad-45ed-b898-ddf4d7b645b4\") " pod="openstack/cinder-scheduler-0" Dec 02 09:46:11 crc kubenswrapper[4781]: I1202 09:46:11.226348 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8457fc90-04ad-45ed-b898-ddf4d7b645b4-scripts\") pod \"cinder-scheduler-0\" (UID: \"8457fc90-04ad-45ed-b898-ddf4d7b645b4\") " pod="openstack/cinder-scheduler-0" Dec 02 09:46:11 crc kubenswrapper[4781]: I1202 09:46:11.226420 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8457fc90-04ad-45ed-b898-ddf4d7b645b4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8457fc90-04ad-45ed-b898-ddf4d7b645b4\") " pod="openstack/cinder-scheduler-0" Dec 02 09:46:11 crc kubenswrapper[4781]: I1202 09:46:11.226440 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2ctm\" (UniqueName: \"kubernetes.io/projected/8457fc90-04ad-45ed-b898-ddf4d7b645b4-kube-api-access-w2ctm\") pod \"cinder-scheduler-0\" (UID: \"8457fc90-04ad-45ed-b898-ddf4d7b645b4\") " pod="openstack/cinder-scheduler-0" Dec 02 09:46:11 crc kubenswrapper[4781]: I1202 09:46:11.226660 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8457fc90-04ad-45ed-b898-ddf4d7b645b4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8457fc90-04ad-45ed-b898-ddf4d7b645b4\") " pod="openstack/cinder-scheduler-0" Dec 02 09:46:11 crc kubenswrapper[4781]: I1202 09:46:11.226828 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8457fc90-04ad-45ed-b898-ddf4d7b645b4-config-data\") pod \"cinder-scheduler-0\" (UID: \"8457fc90-04ad-45ed-b898-ddf4d7b645b4\") " pod="openstack/cinder-scheduler-0" Dec 02 09:46:11 crc kubenswrapper[4781]: I1202 09:46:11.248608 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zh57t" Dec 02 09:46:11 crc kubenswrapper[4781]: I1202 09:46:11.250987 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zh57t" Dec 02 09:46:11 crc kubenswrapper[4781]: I1202 09:46:11.328767 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/8457fc90-04ad-45ed-b898-ddf4d7b645b4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8457fc90-04ad-45ed-b898-ddf4d7b645b4\") " pod="openstack/cinder-scheduler-0" Dec 02 09:46:11 crc kubenswrapper[4781]: I1202 09:46:11.328848 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8457fc90-04ad-45ed-b898-ddf4d7b645b4-config-data\") pod \"cinder-scheduler-0\" (UID: \"8457fc90-04ad-45ed-b898-ddf4d7b645b4\") " pod="openstack/cinder-scheduler-0" Dec 02 09:46:11 crc kubenswrapper[4781]: I1202 09:46:11.329006 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8457fc90-04ad-45ed-b898-ddf4d7b645b4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8457fc90-04ad-45ed-b898-ddf4d7b645b4\") " pod="openstack/cinder-scheduler-0" Dec 02 09:46:11 crc kubenswrapper[4781]: I1202 09:46:11.329030 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8457fc90-04ad-45ed-b898-ddf4d7b645b4-scripts\") pod \"cinder-scheduler-0\" (UID: \"8457fc90-04ad-45ed-b898-ddf4d7b645b4\") " pod="openstack/cinder-scheduler-0" Dec 02 09:46:11 crc kubenswrapper[4781]: I1202 09:46:11.329074 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8457fc90-04ad-45ed-b898-ddf4d7b645b4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8457fc90-04ad-45ed-b898-ddf4d7b645b4\") " pod="openstack/cinder-scheduler-0" Dec 02 09:46:11 crc kubenswrapper[4781]: I1202 09:46:11.329104 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2ctm\" (UniqueName: \"kubernetes.io/projected/8457fc90-04ad-45ed-b898-ddf4d7b645b4-kube-api-access-w2ctm\") pod \"cinder-scheduler-0\" (UID: \"8457fc90-04ad-45ed-b898-ddf4d7b645b4\") " pod="openstack/cinder-scheduler-0" Dec 02 09:46:11 crc kubenswrapper[4781]: I1202 09:46:11.329430 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8457fc90-04ad-45ed-b898-ddf4d7b645b4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8457fc90-04ad-45ed-b898-ddf4d7b645b4\") " pod="openstack/cinder-scheduler-0" Dec 02 09:46:11 crc kubenswrapper[4781]: I1202 09:46:11.335978 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8457fc90-04ad-45ed-b898-ddf4d7b645b4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8457fc90-04ad-45ed-b898-ddf4d7b645b4\") " pod="openstack/cinder-scheduler-0" Dec 02 09:46:11 crc kubenswrapper[4781]: I1202 09:46:11.336731 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8457fc90-04ad-45ed-b898-ddf4d7b645b4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8457fc90-04ad-45ed-b898-ddf4d7b645b4\") " pod="openstack/cinder-scheduler-0" Dec 02 09:46:11 crc kubenswrapper[4781]: I1202 09:46:11.338273 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8457fc90-04ad-45ed-b898-ddf4d7b645b4-scripts\") pod \"cinder-scheduler-0\" (UID: \"8457fc90-04ad-45ed-b898-ddf4d7b645b4\") " pod="openstack/cinder-scheduler-0" Dec 02 09:46:11 crc kubenswrapper[4781]: I1202 09:46:11.346864 4781 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2ctm\" (UniqueName: \"kubernetes.io/projected/8457fc90-04ad-45ed-b898-ddf4d7b645b4-kube-api-access-w2ctm\") pod \"cinder-scheduler-0\" (UID: \"8457fc90-04ad-45ed-b898-ddf4d7b645b4\") " pod="openstack/cinder-scheduler-0" Dec 02 09:46:11 crc kubenswrapper[4781]: I1202 09:46:11.352460 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8457fc90-04ad-45ed-b898-ddf4d7b645b4-config-data\") pod \"cinder-scheduler-0\" (UID: \"8457fc90-04ad-45ed-b898-ddf4d7b645b4\") " pod="openstack/cinder-scheduler-0" Dec 02 09:46:11 crc kubenswrapper[4781]: I1202 09:46:11.518339 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2927c2fe-9023-4be6-a44a-6701f24c499a" path="/var/lib/kubelet/pods/2927c2fe-9023-4be6-a44a-6701f24c499a/volumes" Dec 02 09:46:11 crc kubenswrapper[4781]: I1202 09:46:11.550646 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 09:46:11 crc kubenswrapper[4781]: I1202 09:46:11.596424 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 02 09:46:12 crc kubenswrapper[4781]: I1202 09:46:12.331848 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zh57t" podUID="5a492eac-186c-4c73-9f70-59f989ea3169" containerName="registry-server" probeResult="failure" output=< Dec 02 09:46:12 crc kubenswrapper[4781]: timeout: failed to connect service ":50051" within 1s Dec 02 09:46:12 crc kubenswrapper[4781]: > Dec 02 09:46:12 crc kubenswrapper[4781]: I1202 09:46:12.420672 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 09:46:12 crc kubenswrapper[4781]: I1202 09:46:12.561497 4781 generic.go:334] "Generic (PLEG): container finished" podID="e1708d43-f8b6-491b-a365-a8542137dd44" containerID="431de54c5735d6c4333d70232eb87c7cbfc2630347bd6580c881a49ad249c0ba" exitCode=137 Dec 02 09:46:12 crc kubenswrapper[4781]: I1202 09:46:12.561596 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1708d43-f8b6-491b-a365-a8542137dd44","Type":"ContainerDied","Data":"431de54c5735d6c4333d70232eb87c7cbfc2630347bd6580c881a49ad249c0ba"} Dec 02 09:46:12 crc kubenswrapper[4781]: I1202 09:46:12.575619 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8457fc90-04ad-45ed-b898-ddf4d7b645b4","Type":"ContainerStarted","Data":"5ad7fd5b9cebcd93f92245102af38f648e0c352f441fe5e6af3d430a802bbae3"} Dec 02 09:46:12 crc kubenswrapper[4781]: I1202 09:46:12.658497 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 09:46:12 crc kubenswrapper[4781]: I1202 09:46:12.769969 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e1708d43-f8b6-491b-a365-a8542137dd44-sg-core-conf-yaml\") pod \"e1708d43-f8b6-491b-a365-a8542137dd44\" (UID: \"e1708d43-f8b6-491b-a365-a8542137dd44\") " Dec 02 09:46:12 crc kubenswrapper[4781]: I1202 09:46:12.770068 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfwzk\" (UniqueName: \"kubernetes.io/projected/e1708d43-f8b6-491b-a365-a8542137dd44-kube-api-access-bfwzk\") pod \"e1708d43-f8b6-491b-a365-a8542137dd44\" (UID: \"e1708d43-f8b6-491b-a365-a8542137dd44\") " Dec 02 09:46:12 crc kubenswrapper[4781]: I1202 09:46:12.770247 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1708d43-f8b6-491b-a365-a8542137dd44-log-httpd\") pod \"e1708d43-f8b6-491b-a365-a8542137dd44\" (UID: \"e1708d43-f8b6-491b-a365-a8542137dd44\") " Dec 02 09:46:12 crc kubenswrapper[4781]: I1202 09:46:12.770277 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1708d43-f8b6-491b-a365-a8542137dd44-config-data\") pod \"e1708d43-f8b6-491b-a365-a8542137dd44\" (UID: \"e1708d43-f8b6-491b-a365-a8542137dd44\") " Dec 02 09:46:12 crc kubenswrapper[4781]: I1202 09:46:12.770338 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1708d43-f8b6-491b-a365-a8542137dd44-combined-ca-bundle\") pod \"e1708d43-f8b6-491b-a365-a8542137dd44\" (UID: \"e1708d43-f8b6-491b-a365-a8542137dd44\") " Dec 02 09:46:12 crc kubenswrapper[4781]: I1202 09:46:12.770396 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1708d43-f8b6-491b-a365-a8542137dd44-run-httpd\") pod \"e1708d43-f8b6-491b-a365-a8542137dd44\" (UID: \"e1708d43-f8b6-491b-a365-a8542137dd44\") " Dec 02 09:46:12 crc kubenswrapper[4781]: I1202 09:46:12.770467 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1708d43-f8b6-491b-a365-a8542137dd44-scripts\") pod \"e1708d43-f8b6-491b-a365-a8542137dd44\" (UID: \"e1708d43-f8b6-491b-a365-a8542137dd44\") " Dec 02 09:46:12 crc kubenswrapper[4781]: I1202 09:46:12.772662 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1708d43-f8b6-491b-a365-a8542137dd44-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e1708d43-f8b6-491b-a365-a8542137dd44" (UID: "e1708d43-f8b6-491b-a365-a8542137dd44"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:46:12 crc kubenswrapper[4781]: I1202 09:46:12.773483 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1708d43-f8b6-491b-a365-a8542137dd44-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e1708d43-f8b6-491b-a365-a8542137dd44" (UID: "e1708d43-f8b6-491b-a365-a8542137dd44"). InnerVolumeSpecName "run-httpd". 
Dec 02 09:46:12 crc kubenswrapper[4781]: I1202 09:46:12.773483 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1708d43-f8b6-491b-a365-a8542137dd44-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e1708d43-f8b6-491b-a365-a8542137dd44" (UID: "e1708d43-f8b6-491b-a365-a8542137dd44"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 09:46:12 crc kubenswrapper[4781]: I1202 09:46:12.779962 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1708d43-f8b6-491b-a365-a8542137dd44-kube-api-access-bfwzk" (OuterVolumeSpecName: "kube-api-access-bfwzk") pod "e1708d43-f8b6-491b-a365-a8542137dd44" (UID: "e1708d43-f8b6-491b-a365-a8542137dd44"). InnerVolumeSpecName "kube-api-access-bfwzk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 09:46:12 crc kubenswrapper[4781]: I1202 09:46:12.790096 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1708d43-f8b6-491b-a365-a8542137dd44-scripts" (OuterVolumeSpecName: "scripts") pod "e1708d43-f8b6-491b-a365-a8542137dd44" (UID: "e1708d43-f8b6-491b-a365-a8542137dd44"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 09:46:12 crc kubenswrapper[4781]: I1202 09:46:12.845675 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1708d43-f8b6-491b-a365-a8542137dd44-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e1708d43-f8b6-491b-a365-a8542137dd44" (UID: "e1708d43-f8b6-491b-a365-a8542137dd44"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 09:46:12 crc kubenswrapper[4781]: I1202 09:46:12.874308 4781 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1708d43-f8b6-491b-a365-a8542137dd44-log-httpd\") on node \"crc\" DevicePath \"\""
Dec 02 09:46:12 crc kubenswrapper[4781]: I1202 09:46:12.874340 4781 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1708d43-f8b6-491b-a365-a8542137dd44-run-httpd\") on node \"crc\" DevicePath \"\""
Dec 02 09:46:12 crc kubenswrapper[4781]: I1202 09:46:12.874348 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1708d43-f8b6-491b-a365-a8542137dd44-scripts\") on node \"crc\" DevicePath \"\""
Dec 02 09:46:12 crc kubenswrapper[4781]: I1202 09:46:12.874358 4781 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e1708d43-f8b6-491b-a365-a8542137dd44-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Dec 02 09:46:12 crc kubenswrapper[4781]: I1202 09:46:12.874368 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfwzk\" (UniqueName: \"kubernetes.io/projected/e1708d43-f8b6-491b-a365-a8542137dd44-kube-api-access-bfwzk\") on node \"crc\" DevicePath \"\""
Dec 02 09:46:12 crc kubenswrapper[4781]: I1202 09:46:12.896650 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1708d43-f8b6-491b-a365-a8542137dd44-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1708d43-f8b6-491b-a365-a8542137dd44" (UID: "e1708d43-f8b6-491b-a365-a8542137dd44"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 09:46:12 crc kubenswrapper[4781]: I1202 09:46:12.909676 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Dec 02 09:46:12 crc kubenswrapper[4781]: I1202 09:46:12.909736 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Dec 02 09:46:12 crc kubenswrapper[4781]: I1202 09:46:12.927100 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1708d43-f8b6-491b-a365-a8542137dd44-config-data" (OuterVolumeSpecName: "config-data") pod "e1708d43-f8b6-491b-a365-a8542137dd44" (UID: "e1708d43-f8b6-491b-a365-a8542137dd44"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 09:46:12 crc kubenswrapper[4781]: I1202 09:46:12.943258 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Dec 02 09:46:12 crc kubenswrapper[4781]: I1202 09:46:12.976166 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1708d43-f8b6-491b-a365-a8542137dd44-config-data\") on node \"crc\" DevicePath \"\""
Dec 02 09:46:12 crc kubenswrapper[4781]: I1202 09:46:12.976197 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1708d43-f8b6-491b-a365-a8542137dd44-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 09:46:13 crc kubenswrapper[4781]: I1202 09:46:13.013687 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Dec 02 09:46:13 crc kubenswrapper[4781]: I1202 09:46:13.605948 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8457fc90-04ad-45ed-b898-ddf4d7b645b4","Type":"ContainerStarted","Data":"75bf3466df2a9246b0653ef1a0717d37aa667ae3a440e9c227dc7cbbf2187b70"}
Dec 02 09:46:13 crc kubenswrapper[4781]: I1202 09:46:13.624058 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 02 09:46:13 crc kubenswrapper[4781]: I1202 09:46:13.625348 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1708d43-f8b6-491b-a365-a8542137dd44","Type":"ContainerDied","Data":"64e45df7ff96e97983a336361c74ddb4fb41878cecf8133caf8370036da8e589"}
Dec 02 09:46:13 crc kubenswrapper[4781]: I1202 09:46:13.625413 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Dec 02 09:46:13 crc kubenswrapper[4781]: I1202 09:46:13.625437 4781 scope.go:117] "RemoveContainer" containerID="431de54c5735d6c4333d70232eb87c7cbfc2630347bd6580c881a49ad249c0ba"
Dec 02 09:46:13 crc kubenswrapper[4781]: I1202 09:46:13.625816 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Dec 02 09:46:13 crc kubenswrapper[4781]: I1202 09:46:13.682034 4781 scope.go:117] "RemoveContainer" containerID="6983d16fa5c64f540318073c4f04f369f98ac3334fdf57fae6821d28b5547935"
Dec 02 09:46:13 crc kubenswrapper[4781]: I1202 09:46:13.696076 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 02 09:46:13 crc kubenswrapper[4781]: I1202 09:46:13.740995 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Dec 02 09:46:13 crc kubenswrapper[4781]: I1202 09:46:13.787525 4781 scope.go:117] "RemoveContainer" containerID="e83e542d40461a80472e7eaed39b3591f79be6d61ce515a902e3ca725927583f"
Dec 02 09:46:13 crc kubenswrapper[4781]: I1202 09:46:13.787683 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 02 09:46:13 crc kubenswrapper[4781]: E1202 09:46:13.788163 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1708d43-f8b6-491b-a365-a8542137dd44" containerName="ceilometer-central-agent"
Dec 02 09:46:13 crc kubenswrapper[4781]: I1202 09:46:13.788179 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1708d43-f8b6-491b-a365-a8542137dd44" containerName="ceilometer-central-agent"
Dec 02 09:46:13 crc kubenswrapper[4781]: E1202 09:46:13.788198 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1708d43-f8b6-491b-a365-a8542137dd44" containerName="ceilometer-notification-agent"
Dec 02 09:46:13 crc kubenswrapper[4781]: I1202 09:46:13.788209 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1708d43-f8b6-491b-a365-a8542137dd44" containerName="ceilometer-notification-agent"
Dec 02 09:46:13 crc kubenswrapper[4781]: E1202 09:46:13.788248 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1708d43-f8b6-491b-a365-a8542137dd44" containerName="proxy-httpd"
Dec 02 09:46:13 crc kubenswrapper[4781]: I1202 09:46:13.788256 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1708d43-f8b6-491b-a365-a8542137dd44" containerName="proxy-httpd"
Dec 02 09:46:13 crc kubenswrapper[4781]: E1202 09:46:13.788277 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1708d43-f8b6-491b-a365-a8542137dd44" containerName="sg-core"
Dec 02 09:46:13 crc kubenswrapper[4781]: I1202 09:46:13.788285 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1708d43-f8b6-491b-a365-a8542137dd44" containerName="sg-core"
Dec 02 09:46:13 crc kubenswrapper[4781]: I1202 09:46:13.788651 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1708d43-f8b6-491b-a365-a8542137dd44" containerName="ceilometer-notification-agent"
Dec 02 09:46:13 crc kubenswrapper[4781]: I1202 09:46:13.788669 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1708d43-f8b6-491b-a365-a8542137dd44" containerName="ceilometer-central-agent"
Dec 02 09:46:13 crc kubenswrapper[4781]: I1202 09:46:13.788687 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1708d43-f8b6-491b-a365-a8542137dd44" containerName="sg-core"
Dec 02 09:46:13 crc kubenswrapper[4781]: I1202 09:46:13.788706 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1708d43-f8b6-491b-a365-a8542137dd44" containerName="proxy-httpd"
Dec 02 09:46:13 crc kubenswrapper[4781]: I1202 09:46:13.791283 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 02 09:46:13 crc kubenswrapper[4781]: I1202 09:46:13.794126 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 02 09:46:13 crc kubenswrapper[4781]: I1202 09:46:13.794494 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 02 09:46:13 crc kubenswrapper[4781]: I1202 09:46:13.822269 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 02 09:46:13 crc kubenswrapper[4781]: I1202 09:46:13.892687 4781 scope.go:117] "RemoveContainer" containerID="5a8ee8bbbb0a8e89c8e39ea90569ff4a1fe26323f322f49fea79635b6c5afb47"
Dec 02 09:46:13 crc kubenswrapper[4781]: I1202 09:46:13.901720 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cde75d3-6c25-47b6-85fb-d7fac5c2c733-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9cde75d3-6c25-47b6-85fb-d7fac5c2c733\") " pod="openstack/ceilometer-0"
Dec 02 09:46:13 crc kubenswrapper[4781]: I1202 09:46:13.901768 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cde75d3-6c25-47b6-85fb-d7fac5c2c733-scripts\") pod \"ceilometer-0\" (UID: \"9cde75d3-6c25-47b6-85fb-d7fac5c2c733\") " pod="openstack/ceilometer-0"
Dec 02 09:46:13 crc kubenswrapper[4781]: I1202 09:46:13.901811 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9cde75d3-6c25-47b6-85fb-d7fac5c2c733-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9cde75d3-6c25-47b6-85fb-d7fac5c2c733\") " pod="openstack/ceilometer-0"
Dec 02 09:46:13 crc kubenswrapper[4781]: I1202 09:46:13.901834 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cde75d3-6c25-47b6-85fb-d7fac5c2c733-config-data\") pod \"ceilometer-0\" (UID: \"9cde75d3-6c25-47b6-85fb-d7fac5c2c733\") " pod="openstack/ceilometer-0"
Dec 02 09:46:13 crc kubenswrapper[4781]: I1202 09:46:13.901857 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cde75d3-6c25-47b6-85fb-d7fac5c2c733-run-httpd\") pod \"ceilometer-0\" (UID: \"9cde75d3-6c25-47b6-85fb-d7fac5c2c733\") " pod="openstack/ceilometer-0"
(UID: \"9cde75d3-6c25-47b6-85fb-d7fac5c2c733\") " pod="openstack/ceilometer-0" Dec 02 09:46:13 crc kubenswrapper[4781]: I1202 09:46:13.901903 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cde75d3-6c25-47b6-85fb-d7fac5c2c733-log-httpd\") pod \"ceilometer-0\" (UID: \"9cde75d3-6c25-47b6-85fb-d7fac5c2c733\") " pod="openstack/ceilometer-0" Dec 02 09:46:14 crc kubenswrapper[4781]: I1202 09:46:14.005159 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cde75d3-6c25-47b6-85fb-d7fac5c2c733-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9cde75d3-6c25-47b6-85fb-d7fac5c2c733\") " pod="openstack/ceilometer-0" Dec 02 09:46:14 crc kubenswrapper[4781]: I1202 09:46:14.005209 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cde75d3-6c25-47b6-85fb-d7fac5c2c733-scripts\") pod \"ceilometer-0\" (UID: \"9cde75d3-6c25-47b6-85fb-d7fac5c2c733\") " pod="openstack/ceilometer-0" Dec 02 09:46:14 crc kubenswrapper[4781]: I1202 09:46:14.005263 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9cde75d3-6c25-47b6-85fb-d7fac5c2c733-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9cde75d3-6c25-47b6-85fb-d7fac5c2c733\") " pod="openstack/ceilometer-0" Dec 02 09:46:14 crc kubenswrapper[4781]: I1202 09:46:14.005295 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cde75d3-6c25-47b6-85fb-d7fac5c2c733-config-data\") pod \"ceilometer-0\" (UID: \"9cde75d3-6c25-47b6-85fb-d7fac5c2c733\") " pod="openstack/ceilometer-0" Dec 02 09:46:14 crc kubenswrapper[4781]: I1202 09:46:14.005325 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cde75d3-6c25-47b6-85fb-d7fac5c2c733-run-httpd\") pod \"ceilometer-0\" (UID: \"9cde75d3-6c25-47b6-85fb-d7fac5c2c733\") " pod="openstack/ceilometer-0" Dec 02 09:46:14 crc kubenswrapper[4781]: I1202 09:46:14.005362 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdb2s\" (UniqueName: \"kubernetes.io/projected/9cde75d3-6c25-47b6-85fb-d7fac5c2c733-kube-api-access-xdb2s\") pod \"ceilometer-0\" (UID: \"9cde75d3-6c25-47b6-85fb-d7fac5c2c733\") " pod="openstack/ceilometer-0" Dec 02 09:46:14 crc kubenswrapper[4781]: I1202 09:46:14.005387 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cde75d3-6c25-47b6-85fb-d7fac5c2c733-log-httpd\") pod \"ceilometer-0\" (UID: \"9cde75d3-6c25-47b6-85fb-d7fac5c2c733\") " pod="openstack/ceilometer-0" Dec 02 09:46:14 crc kubenswrapper[4781]: I1202 09:46:14.006129 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cde75d3-6c25-47b6-85fb-d7fac5c2c733-log-httpd\") pod \"ceilometer-0\" (UID: \"9cde75d3-6c25-47b6-85fb-d7fac5c2c733\") " pod="openstack/ceilometer-0" Dec 02 09:46:14 crc kubenswrapper[4781]: I1202 09:46:14.007396 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cde75d3-6c25-47b6-85fb-d7fac5c2c733-run-httpd\") pod \"ceilometer-0\" (UID: 
\"9cde75d3-6c25-47b6-85fb-d7fac5c2c733\") " pod="openstack/ceilometer-0" Dec 02 09:46:14 crc kubenswrapper[4781]: I1202 09:46:14.015770 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9cde75d3-6c25-47b6-85fb-d7fac5c2c733-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9cde75d3-6c25-47b6-85fb-d7fac5c2c733\") " pod="openstack/ceilometer-0" Dec 02 09:46:14 crc kubenswrapper[4781]: I1202 09:46:14.016964 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cde75d3-6c25-47b6-85fb-d7fac5c2c733-scripts\") pod \"ceilometer-0\" (UID: \"9cde75d3-6c25-47b6-85fb-d7fac5c2c733\") " pod="openstack/ceilometer-0" Dec 02 09:46:14 crc kubenswrapper[4781]: I1202 09:46:14.017382 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cde75d3-6c25-47b6-85fb-d7fac5c2c733-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9cde75d3-6c25-47b6-85fb-d7fac5c2c733\") " pod="openstack/ceilometer-0" Dec 02 09:46:14 crc kubenswrapper[4781]: I1202 09:46:14.023802 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cde75d3-6c25-47b6-85fb-d7fac5c2c733-config-data\") pod \"ceilometer-0\" (UID: \"9cde75d3-6c25-47b6-85fb-d7fac5c2c733\") " pod="openstack/ceilometer-0" Dec 02 09:46:14 crc kubenswrapper[4781]: I1202 09:46:14.033693 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdb2s\" (UniqueName: \"kubernetes.io/projected/9cde75d3-6c25-47b6-85fb-d7fac5c2c733-kube-api-access-xdb2s\") pod \"ceilometer-0\" (UID: \"9cde75d3-6c25-47b6-85fb-d7fac5c2c733\") " pod="openstack/ceilometer-0" Dec 02 09:46:14 crc kubenswrapper[4781]: I1202 09:46:14.127409 4781 util.go:30] "No sandbox for pod can be found. 
Dec 02 09:46:14 crc kubenswrapper[4781]: I1202 09:46:14.127409 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 02 09:46:14 crc kubenswrapper[4781]: I1202 09:46:14.906810 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 02 09:46:15 crc kubenswrapper[4781]: I1202 09:46:15.100063 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 02 09:46:15 crc kubenswrapper[4781]: I1202 09:46:15.514085 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1708d43-f8b6-491b-a365-a8542137dd44" path="/var/lib/kubelet/pods/e1708d43-f8b6-491b-a365-a8542137dd44/volumes"
Dec 02 09:46:15 crc kubenswrapper[4781]: I1202 09:46:15.514847 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-f677cfc78-nzcxt"
Dec 02 09:46:15 crc kubenswrapper[4781]: I1202 09:46:15.693828 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cde75d3-6c25-47b6-85fb-d7fac5c2c733","Type":"ContainerStarted","Data":"b08d0644f1b87e5fa659432ff8897cdc12ccdaae23368ae4a24b68e5e3476bd2"}
Dec 02 09:46:15 crc kubenswrapper[4781]: I1202 09:46:15.702407 4781 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 02 09:46:15 crc kubenswrapper[4781]: I1202 09:46:15.702425 4781 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 02 09:46:15 crc kubenswrapper[4781]: I1202 09:46:15.704151 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8457fc90-04ad-45ed-b898-ddf4d7b645b4","Type":"ContainerStarted","Data":"aa9b9cacb3a2cef79777c726de7a2febe4146d5a26e55c5b9bab29d0b236e898"}
Dec 02 09:46:15 crc kubenswrapper[4781]: I1202 09:46:15.726669 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.726650711 podStartE2EDuration="4.726650711s" podCreationTimestamp="2025-12-02 09:46:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:46:15.722219564 +0000 UTC m=+1538.546093443" watchObservedRunningTime="2025-12-02 09:46:15.726650711 +0000 UTC m=+1538.550524590"
Dec 02 09:46:15 crc kubenswrapper[4781]: I1202 09:46:15.803859 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-f677cfc78-nzcxt"
Dec 02 09:46:15 crc kubenswrapper[4781]: I1202 09:46:15.897904 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-56b57b86b8-kf8n9"]
Dec 02 09:46:15 crc kubenswrapper[4781]: I1202 09:46:15.898256 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-56b57b86b8-kf8n9" podUID="22badd1d-b000-4870-86cd-1186fe2ed67d" containerName="barbican-api-log" containerID="cri-o://a30683938bcc417eaf9b442f5163f4eddd5b0105f1c21bde059fbe53fc627c5e" gracePeriod=30
Dec 02 09:46:15 crc kubenswrapper[4781]: I1202 09:46:15.898396 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-56b57b86b8-kf8n9" podUID="22badd1d-b000-4870-86cd-1186fe2ed67d" containerName="barbican-api" containerID="cri-o://ffba489bcf4c5dc5f768fc71cdf74d43834db93c6d3619c4e7b163a911b91cbe" gracePeriod=30
Dec 02 09:46:15 crc kubenswrapper[4781]: I1202 09:46:15.972675 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Dec 02 09:46:15 crc kubenswrapper[4781]: I1202 09:46:15.972731 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Dec 02 09:46:16 crc kubenswrapper[4781]: I1202 09:46:16.102375 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Dec 02 09:46:16 crc kubenswrapper[4781]: I1202 09:46:16.102770 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Dec 02 09:46:16 crc kubenswrapper[4781]: I1202 09:46:16.551620 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Dec 02 09:46:16 crc kubenswrapper[4781]: I1202 09:46:16.735693 4781 generic.go:334] "Generic (PLEG): container finished" podID="22badd1d-b000-4870-86cd-1186fe2ed67d" containerID="a30683938bcc417eaf9b442f5163f4eddd5b0105f1c21bde059fbe53fc627c5e" exitCode=143
Dec 02 09:46:16 crc kubenswrapper[4781]: I1202 09:46:16.735789 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56b57b86b8-kf8n9" event={"ID":"22badd1d-b000-4870-86cd-1186fe2ed67d","Type":"ContainerDied","Data":"a30683938bcc417eaf9b442f5163f4eddd5b0105f1c21bde059fbe53fc627c5e"}
Dec 02 09:46:16 crc kubenswrapper[4781]: I1202 09:46:16.743941 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cde75d3-6c25-47b6-85fb-d7fac5c2c733","Type":"ContainerStarted","Data":"ccdb0ef00d1365df7aa818ee92651d0d59e974c7ddba4cfeac175719982d401c"}
Dec 02 09:46:16 crc kubenswrapper[4781]: I1202 09:46:16.744239 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Dec 02 09:46:16 crc kubenswrapper[4781]: I1202 09:46:16.744313 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Dec 02 09:46:17 crc kubenswrapper[4781]: I1202 09:46:17.418597 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Dec 02 09:46:17 crc kubenswrapper[4781]: I1202 09:46:17.418687 4781 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 02 09:46:17 crc kubenswrapper[4781]: I1202 09:46:17.428807 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Dec 02 09:46:18 crc kubenswrapper[4781]: I1202 09:46:18.798080 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cde75d3-6c25-47b6-85fb-d7fac5c2c733","Type":"ContainerStarted","Data":"ed94d4b6eed37ef38b1e28977da53ed4a9c5cb0d7c8cebce08e4348832056e87"}
Dec 02 09:46:19 crc kubenswrapper[4781]: I1202 09:46:19.424566 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-56b57b86b8-kf8n9" podUID="22badd1d-b000-4870-86cd-1186fe2ed67d" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": read tcp 10.217.0.2:37558->10.217.0.161:9311: read: connection reset by peer"
Dec 02 09:46:19 crc kubenswrapper[4781]: I1202 09:46:19.424616 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-56b57b86b8-kf8n9" podUID="22badd1d-b000-4870-86cd-1186fe2ed67d" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": read tcp 10.217.0.2:37548->10.217.0.161:9311: read: connection reset by peer"
Dec 02 09:46:19 crc kubenswrapper[4781]: I1202 09:46:19.813124 4781 generic.go:334] "Generic (PLEG): container finished" podID="22badd1d-b000-4870-86cd-1186fe2ed67d" containerID="ffba489bcf4c5dc5f768fc71cdf74d43834db93c6d3619c4e7b163a911b91cbe" exitCode=0
Dec 02 09:46:19 crc kubenswrapper[4781]: I1202 09:46:19.813188 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56b57b86b8-kf8n9" event={"ID":"22badd1d-b000-4870-86cd-1186fe2ed67d","Type":"ContainerDied","Data":"ffba489bcf4c5dc5f768fc71cdf74d43834db93c6d3619c4e7b163a911b91cbe"}
Dec 02 09:46:19 crc kubenswrapper[4781]: I1202 09:46:19.945987 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-56b57b86b8-kf8n9"
Dec 02 09:46:20 crc kubenswrapper[4781]: I1202 09:46:20.043820 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Dec 02 09:46:20 crc kubenswrapper[4781]: I1202 09:46:20.043933 4781 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 02 09:46:20 crc kubenswrapper[4781]: I1202 09:46:20.047841 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Dec 02 09:46:20 crc kubenswrapper[4781]: I1202 09:46:20.104994 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gw5b2\" (UniqueName: \"kubernetes.io/projected/22badd1d-b000-4870-86cd-1186fe2ed67d-kube-api-access-gw5b2\") pod \"22badd1d-b000-4870-86cd-1186fe2ed67d\" (UID: \"22badd1d-b000-4870-86cd-1186fe2ed67d\") "
Dec 02 09:46:20 crc kubenswrapper[4781]: I1202 09:46:20.105084 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22badd1d-b000-4870-86cd-1186fe2ed67d-logs\") pod \"22badd1d-b000-4870-86cd-1186fe2ed67d\" (UID: \"22badd1d-b000-4870-86cd-1186fe2ed67d\") "
Dec 02 09:46:20 crc kubenswrapper[4781]: I1202 09:46:20.105170 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22badd1d-b000-4870-86cd-1186fe2ed67d-config-data\") pod \"22badd1d-b000-4870-86cd-1186fe2ed67d\" (UID: \"22badd1d-b000-4870-86cd-1186fe2ed67d\") "
Dec 02 09:46:20 crc kubenswrapper[4781]: I1202 09:46:20.105333 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22badd1d-b000-4870-86cd-1186fe2ed67d-combined-ca-bundle\") pod \"22badd1d-b000-4870-86cd-1186fe2ed67d\" (UID: \"22badd1d-b000-4870-86cd-1186fe2ed67d\") "
Dec 02 09:46:20 crc kubenswrapper[4781]: I1202 09:46:20.105522 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/22badd1d-b000-4870-86cd-1186fe2ed67d-config-data-custom\") pod \"22badd1d-b000-4870-86cd-1186fe2ed67d\" (UID: \"22badd1d-b000-4870-86cd-1186fe2ed67d\") "
Dec 02 09:46:20 crc kubenswrapper[4781]: I1202 09:46:20.113988 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22badd1d-b000-4870-86cd-1186fe2ed67d-logs" (OuterVolumeSpecName: "logs") pod "22badd1d-b000-4870-86cd-1186fe2ed67d" (UID: "22badd1d-b000-4870-86cd-1186fe2ed67d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 09:46:20 crc kubenswrapper[4781]: I1202 09:46:20.123424 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22badd1d-b000-4870-86cd-1186fe2ed67d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "22badd1d-b000-4870-86cd-1186fe2ed67d" (UID: "22badd1d-b000-4870-86cd-1186fe2ed67d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 09:46:20 crc kubenswrapper[4781]: I1202 09:46:20.123863 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22badd1d-b000-4870-86cd-1186fe2ed67d-kube-api-access-gw5b2" (OuterVolumeSpecName: "kube-api-access-gw5b2") pod "22badd1d-b000-4870-86cd-1186fe2ed67d" (UID: "22badd1d-b000-4870-86cd-1186fe2ed67d"). InnerVolumeSpecName "kube-api-access-gw5b2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 09:46:20 crc kubenswrapper[4781]: I1202 09:46:20.190481 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22badd1d-b000-4870-86cd-1186fe2ed67d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "22badd1d-b000-4870-86cd-1186fe2ed67d" (UID: "22badd1d-b000-4870-86cd-1186fe2ed67d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 09:46:20 crc kubenswrapper[4781]: I1202 09:46:20.219513 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22badd1d-b000-4870-86cd-1186fe2ed67d-config-data" (OuterVolumeSpecName: "config-data") pod "22badd1d-b000-4870-86cd-1186fe2ed67d" (UID: "22badd1d-b000-4870-86cd-1186fe2ed67d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 09:46:20 crc kubenswrapper[4781]: I1202 09:46:20.220300 4781 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/22badd1d-b000-4870-86cd-1186fe2ed67d-config-data-custom\") on node \"crc\" DevicePath \"\""
Dec 02 09:46:20 crc kubenswrapper[4781]: I1202 09:46:20.220328 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gw5b2\" (UniqueName: \"kubernetes.io/projected/22badd1d-b000-4870-86cd-1186fe2ed67d-kube-api-access-gw5b2\") on node \"crc\" DevicePath \"\""
Dec 02 09:46:20 crc kubenswrapper[4781]: I1202 09:46:20.220340 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22badd1d-b000-4870-86cd-1186fe2ed67d-logs\") on node \"crc\" DevicePath \"\""
Dec 02 09:46:20 crc kubenswrapper[4781]: I1202 09:46:20.220349 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22badd1d-b000-4870-86cd-1186fe2ed67d-config-data\") on node \"crc\" DevicePath \"\""
Dec 02 09:46:20 crc kubenswrapper[4781]: I1202 09:46:20.220357 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22badd1d-b000-4870-86cd-1186fe2ed67d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 09:46:20 crc kubenswrapper[4781]: I1202 09:46:20.848562 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-56b57b86b8-kf8n9"
Dec 02 09:46:20 crc kubenswrapper[4781]: I1202 09:46:20.848691 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56b57b86b8-kf8n9" event={"ID":"22badd1d-b000-4870-86cd-1186fe2ed67d","Type":"ContainerDied","Data":"69ee68c78ba6cbebf19b9143ef657384c5681395dafd1bd3dd6036a866728f06"}
Dec 02 09:46:20 crc kubenswrapper[4781]: I1202 09:46:20.849427 4781 scope.go:117] "RemoveContainer" containerID="ffba489bcf4c5dc5f768fc71cdf74d43834db93c6d3619c4e7b163a911b91cbe"
Dec 02 09:46:20 crc kubenswrapper[4781]: I1202 09:46:20.881897 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-56b57b86b8-kf8n9"]
Dec 02 09:46:20 crc kubenswrapper[4781]: I1202 09:46:20.900527 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-56b57b86b8-kf8n9"]
Dec 02 09:46:21 crc kubenswrapper[4781]: I1202 09:46:21.536099 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22badd1d-b000-4870-86cd-1186fe2ed67d" path="/var/lib/kubelet/pods/22badd1d-b000-4870-86cd-1186fe2ed67d/volumes"
Dec 02 09:46:21 crc kubenswrapper[4781]: I1202 09:46:21.843374 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Dec 02 09:46:22 crc kubenswrapper[4781]: I1202 09:46:22.320453 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zh57t" podUID="5a492eac-186c-4c73-9f70-59f989ea3169" containerName="registry-server" probeResult="failure" output=<
Dec 02 09:46:22 crc kubenswrapper[4781]: timeout: failed to connect service ":50051" within 1s
Dec 02 09:46:22 crc kubenswrapper[4781]: >
Dec 02 09:46:23 crc kubenswrapper[4781]: I1202 09:46:23.047495 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-77d7d4d6c6-nc7g8"
Dec 02 09:46:27 crc kubenswrapper[4781]: I1202 09:46:27.356967 4781 scope.go:117] "RemoveContainer" containerID="a30683938bcc417eaf9b442f5163f4eddd5b0105f1c21bde059fbe53fc627c5e"
Dec 02 09:46:28 crc kubenswrapper[4781]: I1202 09:46:28.932981 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cde75d3-6c25-47b6-85fb-d7fac5c2c733","Type":"ContainerStarted","Data":"e60cebbe7bffe5c7915134857beeb5fb900a6aa851b6bfe45b50e87ba223c1e1"}
Dec 02 09:46:28 crc kubenswrapper[4781]: I1202 09:46:28.934942 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-972b4" event={"ID":"41de14f6-deed-478b-9d75-ad94ab88ee05","Type":"ContainerStarted","Data":"65ce496472d4caa27e5c6be55c9ff97db07abac06c00e5140f86690226b00b55"}
Dec 02 09:46:28 crc kubenswrapper[4781]: I1202 09:46:28.950051 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-972b4" podStartSLOduration=2.479284589 podStartE2EDuration="20.950036104s" podCreationTimestamp="2025-12-02 09:46:08 +0000 UTC" firstStartedPulling="2025-12-02 09:46:09.254196903 +0000 UTC m=+1532.078070792" lastFinishedPulling="2025-12-02 09:46:27.724948428 +0000 UTC m=+1550.548822307" observedRunningTime="2025-12-02 09:46:28.947581948 +0000 UTC m=+1551.771455827" watchObservedRunningTime="2025-12-02 09:46:28.950036104 +0000 UTC m=+1551.773909983"
Dec 02 09:46:29 crc kubenswrapper[4781]: I1202 09:46:29.027650 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-9cbdc4d89-pkh64"
Dec 02 09:46:29 crc kubenswrapper[4781]: I1202 09:46:29.098435 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-77d7d4d6c6-nc7g8"]
Dec 02 09:46:29 crc kubenswrapper[4781]: I1202 09:46:29.102221 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-77d7d4d6c6-nc7g8" podUID="1b2440b7-d6a5-426e-ac38-2f5fd531935b" containerName="neutron-api" containerID="cri-o://82211f6afee1c1af66fd6f8198a3ea506b21910b79cc1625c9807ff23dac2d98" gracePeriod=30
Dec 02 09:46:29 crc kubenswrapper[4781]: I1202 09:46:29.102348 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-77d7d4d6c6-nc7g8" podUID="1b2440b7-d6a5-426e-ac38-2f5fd531935b" containerName="neutron-httpd" containerID="cri-o://9053393ea9f95fe2fac898b4da31f97b12804d5409b12558ee2677ee81437aec" gracePeriod=30
Dec 02 09:46:29 crc kubenswrapper[4781]: I1202 09:46:29.946107 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cde75d3-6c25-47b6-85fb-d7fac5c2c733","Type":"ContainerStarted","Data":"4fe9aa986504cde63c3dc8abba2424f271483a77966a0cb06b99068002b3740a"}
Dec 02 09:46:29 crc kubenswrapper[4781]: I1202 09:46:29.946431 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9cde75d3-6c25-47b6-85fb-d7fac5c2c733" containerName="ceilometer-central-agent" containerID="cri-o://ccdb0ef00d1365df7aa818ee92651d0d59e974c7ddba4cfeac175719982d401c" gracePeriod=30
Dec 02 09:46:29 crc kubenswrapper[4781]: I1202 09:46:29.946510 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 02 09:46:29 crc kubenswrapper[4781]: I1202 09:46:29.946523 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9cde75d3-6c25-47b6-85fb-d7fac5c2c733" containerName="proxy-httpd" containerID="cri-o://4fe9aa986504cde63c3dc8abba2424f271483a77966a0cb06b99068002b3740a" gracePeriod=30
Dec 02 09:46:29 crc kubenswrapper[4781]: I1202 09:46:29.946567 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9cde75d3-6c25-47b6-85fb-d7fac5c2c733" containerName="sg-core" containerID="cri-o://e60cebbe7bffe5c7915134857beeb5fb900a6aa851b6bfe45b50e87ba223c1e1" gracePeriod=30
Dec 02 09:46:29 crc kubenswrapper[4781]: I1202 09:46:29.946610 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9cde75d3-6c25-47b6-85fb-d7fac5c2c733" containerName="ceilometer-notification-agent" containerID="cri-o://ed94d4b6eed37ef38b1e28977da53ed4a9c5cb0d7c8cebce08e4348832056e87" gracePeriod=30
Dec 02 09:46:29 crc kubenswrapper[4781]: I1202 09:46:29.949693 4781 generic.go:334] "Generic (PLEG): container finished" podID="1b2440b7-d6a5-426e-ac38-2f5fd531935b" containerID="9053393ea9f95fe2fac898b4da31f97b12804d5409b12558ee2677ee81437aec" exitCode=0
Dec 02 09:46:29 crc kubenswrapper[4781]: I1202 09:46:29.950573 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77d7d4d6c6-nc7g8" event={"ID":"1b2440b7-d6a5-426e-ac38-2f5fd531935b","Type":"ContainerDied","Data":"9053393ea9f95fe2fac898b4da31f97b12804d5409b12558ee2677ee81437aec"}
09:46:13 +0000 UTC" firstStartedPulling="2025-12-02 09:46:14.934188523 +0000 UTC m=+1537.758062402" lastFinishedPulling="2025-12-02 09:46:29.333149025 +0000 UTC m=+1552.157022904" observedRunningTime="2025-12-02 09:46:29.971490164 +0000 UTC m=+1552.795364073" watchObservedRunningTime="2025-12-02 09:46:29.973464106 +0000 UTC m=+1552.797337985" Dec 02 09:46:30 crc kubenswrapper[4781]: I1202 09:46:30.963522 4781 generic.go:334] "Generic (PLEG): container finished" podID="9cde75d3-6c25-47b6-85fb-d7fac5c2c733" containerID="4fe9aa986504cde63c3dc8abba2424f271483a77966a0cb06b99068002b3740a" exitCode=0 Dec 02 09:46:30 crc kubenswrapper[4781]: I1202 09:46:30.963569 4781 generic.go:334] "Generic (PLEG): container finished" podID="9cde75d3-6c25-47b6-85fb-d7fac5c2c733" containerID="e60cebbe7bffe5c7915134857beeb5fb900a6aa851b6bfe45b50e87ba223c1e1" exitCode=2 Dec 02 09:46:30 crc kubenswrapper[4781]: I1202 09:46:30.963579 4781 generic.go:334] "Generic (PLEG): container finished" podID="9cde75d3-6c25-47b6-85fb-d7fac5c2c733" containerID="ed94d4b6eed37ef38b1e28977da53ed4a9c5cb0d7c8cebce08e4348832056e87" exitCode=0 Dec 02 09:46:30 crc kubenswrapper[4781]: I1202 09:46:30.963587 4781 generic.go:334] "Generic (PLEG): container finished" podID="9cde75d3-6c25-47b6-85fb-d7fac5c2c733" containerID="ccdb0ef00d1365df7aa818ee92651d0d59e974c7ddba4cfeac175719982d401c" exitCode=0 Dec 02 09:46:30 crc kubenswrapper[4781]: I1202 09:46:30.963599 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cde75d3-6c25-47b6-85fb-d7fac5c2c733","Type":"ContainerDied","Data":"4fe9aa986504cde63c3dc8abba2424f271483a77966a0cb06b99068002b3740a"} Dec 02 09:46:30 crc kubenswrapper[4781]: I1202 09:46:30.963647 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cde75d3-6c25-47b6-85fb-d7fac5c2c733","Type":"ContainerDied","Data":"e60cebbe7bffe5c7915134857beeb5fb900a6aa851b6bfe45b50e87ba223c1e1"} Dec 02 09:46:30 crc kubenswrapper[4781]: I1202 09:46:30.963661 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cde75d3-6c25-47b6-85fb-d7fac5c2c733","Type":"ContainerDied","Data":"ed94d4b6eed37ef38b1e28977da53ed4a9c5cb0d7c8cebce08e4348832056e87"} Dec 02 09:46:30 crc kubenswrapper[4781]: I1202 09:46:30.963672 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cde75d3-6c25-47b6-85fb-d7fac5c2c733","Type":"ContainerDied","Data":"ccdb0ef00d1365df7aa818ee92651d0d59e974c7ddba4cfeac175719982d401c"} Dec 02 09:46:31 crc kubenswrapper[4781]: I1202 09:46:31.296344 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zh57t" Dec 02 09:46:31 crc kubenswrapper[4781]: I1202 09:46:31.348804 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zh57t" Dec 02 09:46:31 crc kubenswrapper[4781]: I1202 09:46:31.748714 4781 util.go:48] "No ready sandbox for pod can be found. 
Dec 02 09:46:31 crc kubenswrapper[4781]: I1202 09:46:31.748714 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 02 09:46:31 crc kubenswrapper[4781]: I1202 09:46:31.869535 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cde75d3-6c25-47b6-85fb-d7fac5c2c733-config-data\") pod \"9cde75d3-6c25-47b6-85fb-d7fac5c2c733\" (UID: \"9cde75d3-6c25-47b6-85fb-d7fac5c2c733\") "
Dec 02 09:46:31 crc kubenswrapper[4781]: I1202 09:46:31.869680 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9cde75d3-6c25-47b6-85fb-d7fac5c2c733-sg-core-conf-yaml\") pod \"9cde75d3-6c25-47b6-85fb-d7fac5c2c733\" (UID: \"9cde75d3-6c25-47b6-85fb-d7fac5c2c733\") "
Dec 02 09:46:31 crc kubenswrapper[4781]: I1202 09:46:31.869734 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cde75d3-6c25-47b6-85fb-d7fac5c2c733-combined-ca-bundle\") pod \"9cde75d3-6c25-47b6-85fb-d7fac5c2c733\" (UID: \"9cde75d3-6c25-47b6-85fb-d7fac5c2c733\") "
Dec 02 09:46:31 crc kubenswrapper[4781]: I1202 09:46:31.869791 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cde75d3-6c25-47b6-85fb-d7fac5c2c733-log-httpd\") pod \"9cde75d3-6c25-47b6-85fb-d7fac5c2c733\" (UID: \"9cde75d3-6c25-47b6-85fb-d7fac5c2c733\") "
Dec 02 09:46:31 crc kubenswrapper[4781]: I1202 09:46:31.869866 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdb2s\" (UniqueName: \"kubernetes.io/projected/9cde75d3-6c25-47b6-85fb-d7fac5c2c733-kube-api-access-xdb2s\") pod \"9cde75d3-6c25-47b6-85fb-d7fac5c2c733\" (UID: \"9cde75d3-6c25-47b6-85fb-d7fac5c2c733\") "
Dec 02 09:46:31 crc kubenswrapper[4781]: I1202 09:46:31.869895 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cde75d3-6c25-47b6-85fb-d7fac5c2c733-scripts\") pod \"9cde75d3-6c25-47b6-85fb-d7fac5c2c733\" (UID: \"9cde75d3-6c25-47b6-85fb-d7fac5c2c733\") "
Dec 02 09:46:31 crc kubenswrapper[4781]: I1202 09:46:31.869947 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cde75d3-6c25-47b6-85fb-d7fac5c2c733-run-httpd\") pod \"9cde75d3-6c25-47b6-85fb-d7fac5c2c733\" (UID: \"9cde75d3-6c25-47b6-85fb-d7fac5c2c733\") "
Dec 02 09:46:31 crc kubenswrapper[4781]: I1202 09:46:31.870653 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cde75d3-6c25-47b6-85fb-d7fac5c2c733-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9cde75d3-6c25-47b6-85fb-d7fac5c2c733" (UID: "9cde75d3-6c25-47b6-85fb-d7fac5c2c733"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 09:46:31 crc kubenswrapper[4781]: I1202 09:46:31.870902 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cde75d3-6c25-47b6-85fb-d7fac5c2c733-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9cde75d3-6c25-47b6-85fb-d7fac5c2c733" (UID: "9cde75d3-6c25-47b6-85fb-d7fac5c2c733"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 09:46:31 crc kubenswrapper[4781]: I1202 09:46:31.876438 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cde75d3-6c25-47b6-85fb-d7fac5c2c733-kube-api-access-xdb2s" (OuterVolumeSpecName: "kube-api-access-xdb2s") pod "9cde75d3-6c25-47b6-85fb-d7fac5c2c733" (UID: "9cde75d3-6c25-47b6-85fb-d7fac5c2c733"). InnerVolumeSpecName "kube-api-access-xdb2s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 09:46:31 crc kubenswrapper[4781]: I1202 09:46:31.876550 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cde75d3-6c25-47b6-85fb-d7fac5c2c733-scripts" (OuterVolumeSpecName: "scripts") pod "9cde75d3-6c25-47b6-85fb-d7fac5c2c733" (UID: "9cde75d3-6c25-47b6-85fb-d7fac5c2c733"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 09:46:31 crc kubenswrapper[4781]: I1202 09:46:31.905094 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cde75d3-6c25-47b6-85fb-d7fac5c2c733-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9cde75d3-6c25-47b6-85fb-d7fac5c2c733" (UID: "9cde75d3-6c25-47b6-85fb-d7fac5c2c733"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 09:46:31 crc kubenswrapper[4781]: I1202 09:46:31.908377 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zh57t"]
Dec 02 09:46:31 crc kubenswrapper[4781]: I1202 09:46:31.974058 4781 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9cde75d3-6c25-47b6-85fb-d7fac5c2c733-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Dec 02 09:46:31 crc kubenswrapper[4781]: I1202 09:46:31.974097 4781 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cde75d3-6c25-47b6-85fb-d7fac5c2c733-log-httpd\") on node \"crc\" DevicePath \"\""
Dec 02 09:46:31 crc kubenswrapper[4781]: I1202 09:46:31.974109 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdb2s\" (UniqueName: \"kubernetes.io/projected/9cde75d3-6c25-47b6-85fb-d7fac5c2c733-kube-api-access-xdb2s\") on node \"crc\" DevicePath \"\""
Dec 02 09:46:31 crc kubenswrapper[4781]: I1202 09:46:31.974121 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cde75d3-6c25-47b6-85fb-d7fac5c2c733-scripts\") on node \"crc\" DevicePath \"\""
Dec 02 09:46:31 crc kubenswrapper[4781]: I1202 09:46:31.974133 4781 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cde75d3-6c25-47b6-85fb-d7fac5c2c733-run-httpd\") on node \"crc\" DevicePath \"\""
Dec 02 09:46:31 crc kubenswrapper[4781]: I1202 09:46:31.974467 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cde75d3-6c25-47b6-85fb-d7fac5c2c733-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9cde75d3-6c25-47b6-85fb-d7fac5c2c733" (UID: "9cde75d3-6c25-47b6-85fb-d7fac5c2c733"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 09:46:31 crc kubenswrapper[4781]: I1202 09:46:31.979201 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 02 09:46:31 crc kubenswrapper[4781]: I1202 09:46:31.979201 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cde75d3-6c25-47b6-85fb-d7fac5c2c733","Type":"ContainerDied","Data":"b08d0644f1b87e5fa659432ff8897cdc12ccdaae23368ae4a24b68e5e3476bd2"}
Dec 02 09:46:31 crc kubenswrapper[4781]: I1202 09:46:31.979465 4781 scope.go:117] "RemoveContainer" containerID="4fe9aa986504cde63c3dc8abba2424f271483a77966a0cb06b99068002b3740a"
Dec 02 09:46:31 crc kubenswrapper[4781]: I1202 09:46:31.994133 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cde75d3-6c25-47b6-85fb-d7fac5c2c733-config-data" (OuterVolumeSpecName: "config-data") pod "9cde75d3-6c25-47b6-85fb-d7fac5c2c733" (UID: "9cde75d3-6c25-47b6-85fb-d7fac5c2c733"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 09:46:32 crc kubenswrapper[4781]: I1202 09:46:32.036791 4781 scope.go:117] "RemoveContainer" containerID="e60cebbe7bffe5c7915134857beeb5fb900a6aa851b6bfe45b50e87ba223c1e1"
Dec 02 09:46:32 crc kubenswrapper[4781]: I1202 09:46:32.058330 4781 scope.go:117] "RemoveContainer" containerID="ed94d4b6eed37ef38b1e28977da53ed4a9c5cb0d7c8cebce08e4348832056e87"
Dec 02 09:46:32 crc kubenswrapper[4781]: I1202 09:46:32.076421 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cde75d3-6c25-47b6-85fb-d7fac5c2c733-config-data\") on node \"crc\" DevicePath \"\""
Dec 02 09:46:32 crc kubenswrapper[4781]: I1202 09:46:32.076467 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cde75d3-6c25-47b6-85fb-d7fac5c2c733-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 09:46:32 crc kubenswrapper[4781]: I1202 09:46:32.088066 4781 scope.go:117] "RemoveContainer" containerID="ccdb0ef00d1365df7aa818ee92651d0d59e974c7ddba4cfeac175719982d401c"
Dec 02 09:46:32 crc kubenswrapper[4781]: I1202 09:46:32.316864 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 02 09:46:32 crc kubenswrapper[4781]: I1202 09:46:32.327452 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Dec 02 09:46:32 crc kubenswrapper[4781]: I1202 09:46:32.339573 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 02 09:46:32 crc kubenswrapper[4781]: E1202 09:46:32.340034 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cde75d3-6c25-47b6-85fb-d7fac5c2c733" containerName="proxy-httpd"
Dec 02 09:46:32 crc kubenswrapper[4781]: I1202 09:46:32.340059 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cde75d3-6c25-47b6-85fb-d7fac5c2c733" containerName="proxy-httpd"
Dec 02 09:46:32 crc kubenswrapper[4781]: E1202 09:46:32.340069 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cde75d3-6c25-47b6-85fb-d7fac5c2c733" containerName="ceilometer-central-agent"
Dec 02 09:46:32 crc kubenswrapper[4781]: I1202 09:46:32.340077 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cde75d3-6c25-47b6-85fb-d7fac5c2c733" containerName="ceilometer-central-agent"
Dec 02 09:46:32 crc kubenswrapper[4781]: E1202 09:46:32.340102 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cde75d3-6c25-47b6-85fb-d7fac5c2c733" containerName="ceilometer-notification-agent"
Dec 02 09:46:32 crc kubenswrapper[4781]: I1202 09:46:32.340111 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cde75d3-6c25-47b6-85fb-d7fac5c2c733" containerName="ceilometer-notification-agent"
Dec 02 09:46:32 crc kubenswrapper[4781]: E1202 09:46:32.340132 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22badd1d-b000-4870-86cd-1186fe2ed67d" containerName="barbican-api-log"
Dec 02 09:46:32 crc kubenswrapper[4781]: I1202 09:46:32.340140 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="22badd1d-b000-4870-86cd-1186fe2ed67d" containerName="barbican-api-log"
Dec 02 09:46:32 crc kubenswrapper[4781]: E1202 09:46:32.340165 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22badd1d-b000-4870-86cd-1186fe2ed67d" containerName="barbican-api"
Dec 02 09:46:32 crc kubenswrapper[4781]: I1202 09:46:32.340173 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="22badd1d-b000-4870-86cd-1186fe2ed67d" containerName="barbican-api"
Dec 02 09:46:32 crc kubenswrapper[4781]: E1202 09:46:32.340182 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cde75d3-6c25-47b6-85fb-d7fac5c2c733" containerName="sg-core"
Dec 02 09:46:32 crc kubenswrapper[4781]: I1202 09:46:32.340189 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cde75d3-6c25-47b6-85fb-d7fac5c2c733" containerName="sg-core"
Dec 02 09:46:32 crc kubenswrapper[4781]: I1202 09:46:32.340412 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cde75d3-6c25-47b6-85fb-d7fac5c2c733" containerName="sg-core"
Dec 02 09:46:32 crc kubenswrapper[4781]: I1202 09:46:32.340430 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="22badd1d-b000-4870-86cd-1186fe2ed67d" containerName="barbican-api-log"
Dec 02 09:46:32 crc kubenswrapper[4781]: I1202 09:46:32.340442 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="22badd1d-b000-4870-86cd-1186fe2ed67d" containerName="barbican-api"
Dec 02 09:46:32 crc kubenswrapper[4781]: I1202 09:46:32.340464 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cde75d3-6c25-47b6-85fb-d7fac5c2c733" containerName="ceilometer-central-agent"
Dec 02 09:46:32 crc kubenswrapper[4781]: I1202 09:46:32.340478 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cde75d3-6c25-47b6-85fb-d7fac5c2c733" containerName="ceilometer-notification-agent"
Dec 02 09:46:32 crc kubenswrapper[4781]: I1202 09:46:32.340492 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cde75d3-6c25-47b6-85fb-d7fac5c2c733" containerName="proxy-httpd"
Dec 02 09:46:32 crc kubenswrapper[4781]: I1202 09:46:32.342553 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 02 09:46:32 crc kubenswrapper[4781]: I1202 09:46:32.344658 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 02 09:46:32 crc kubenswrapper[4781]: I1202 09:46:32.348739 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 02 09:46:32 crc kubenswrapper[4781]: I1202 09:46:32.352283 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 02 09:46:32 crc kubenswrapper[4781]: I1202 09:46:32.484337 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95e3389a-92cc-4831-8858-5da7772f2959-config-data\") pod \"ceilometer-0\" (UID: \"95e3389a-92cc-4831-8858-5da7772f2959\") " pod="openstack/ceilometer-0"
Dec 02 09:46:32 crc kubenswrapper[4781]: I1202 09:46:32.484403 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95e3389a-92cc-4831-8858-5da7772f2959-log-httpd\") pod \"ceilometer-0\" (UID: \"95e3389a-92cc-4831-8858-5da7772f2959\") " pod="openstack/ceilometer-0"
Dec 02 09:46:32 crc kubenswrapper[4781]: I1202 09:46:32.484831 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95e3389a-92cc-4831-8858-5da7772f2959-scripts\") pod \"ceilometer-0\" (UID: \"95e3389a-92cc-4831-8858-5da7772f2959\") " pod="openstack/ceilometer-0"
Dec 02 09:46:32 crc kubenswrapper[4781]: I1202 09:46:32.484912 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95e3389a-92cc-4831-8858-5da7772f2959-run-httpd\") pod \"ceilometer-0\" (UID: \"95e3389a-92cc-4831-8858-5da7772f2959\") " pod="openstack/ceilometer-0"
Dec 02 09:46:32 crc kubenswrapper[4781]: I1202 09:46:32.484976 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95e3389a-92cc-4831-8858-5da7772f2959-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"95e3389a-92cc-4831-8858-5da7772f2959\") " pod="openstack/ceilometer-0"
Dec 02 09:46:32 crc kubenswrapper[4781]: I1202 09:46:32.485092 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95e3389a-92cc-4831-8858-5da7772f2959-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"95e3389a-92cc-4831-8858-5da7772f2959\") " pod="openstack/ceilometer-0"
Dec 02 09:46:32 crc kubenswrapper[4781]: I1202 09:46:32.485134 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7vbl\" (UniqueName: \"kubernetes.io/projected/95e3389a-92cc-4831-8858-5da7772f2959-kube-api-access-f7vbl\") pod \"ceilometer-0\" (UID: \"95e3389a-92cc-4831-8858-5da7772f2959\") " pod="openstack/ceilometer-0"
Dec 02 09:46:32 crc kubenswrapper[4781]: I1202 09:46:32.587245 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95e3389a-92cc-4831-8858-5da7772f2959-scripts\") pod \"ceilometer-0\" (UID: \"95e3389a-92cc-4831-8858-5da7772f2959\") " pod="openstack/ceilometer-0"
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95e3389a-92cc-4831-8858-5da7772f2959-run-httpd\") pod \"ceilometer-0\" (UID: \"95e3389a-92cc-4831-8858-5da7772f2959\") " pod="openstack/ceilometer-0" Dec 02 09:46:32 crc kubenswrapper[4781]: I1202 09:46:32.587332 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95e3389a-92cc-4831-8858-5da7772f2959-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"95e3389a-92cc-4831-8858-5da7772f2959\") " pod="openstack/ceilometer-0" Dec 02 09:46:32 crc kubenswrapper[4781]: I1202 09:46:32.587377 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95e3389a-92cc-4831-8858-5da7772f2959-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"95e3389a-92cc-4831-8858-5da7772f2959\") " pod="openstack/ceilometer-0" Dec 02 09:46:32 crc kubenswrapper[4781]: I1202 09:46:32.587402 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7vbl\" (UniqueName: \"kubernetes.io/projected/95e3389a-92cc-4831-8858-5da7772f2959-kube-api-access-f7vbl\") pod \"ceilometer-0\" (UID: \"95e3389a-92cc-4831-8858-5da7772f2959\") " pod="openstack/ceilometer-0" Dec 02 09:46:32 crc kubenswrapper[4781]: I1202 09:46:32.587437 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95e3389a-92cc-4831-8858-5da7772f2959-config-data\") pod \"ceilometer-0\" (UID: \"95e3389a-92cc-4831-8858-5da7772f2959\") " pod="openstack/ceilometer-0" Dec 02 09:46:32 crc kubenswrapper[4781]: I1202 09:46:32.587471 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95e3389a-92cc-4831-8858-5da7772f2959-log-httpd\") pod \"ceilometer-0\" (UID: \"95e3389a-92cc-4831-8858-5da7772f2959\") " pod="openstack/ceilometer-0" Dec 02 09:46:32 crc kubenswrapper[4781]: I1202 09:46:32.587807 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95e3389a-92cc-4831-8858-5da7772f2959-run-httpd\") pod \"ceilometer-0\" (UID: \"95e3389a-92cc-4831-8858-5da7772f2959\") " pod="openstack/ceilometer-0" Dec 02 09:46:32 crc kubenswrapper[4781]: I1202 09:46:32.587895 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95e3389a-92cc-4831-8858-5da7772f2959-log-httpd\") pod \"ceilometer-0\" (UID: \"95e3389a-92cc-4831-8858-5da7772f2959\") " pod="openstack/ceilometer-0" Dec 02 09:46:32 crc kubenswrapper[4781]: I1202 09:46:32.592671 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95e3389a-92cc-4831-8858-5da7772f2959-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"95e3389a-92cc-4831-8858-5da7772f2959\") " pod="openstack/ceilometer-0" Dec 02 09:46:32 crc kubenswrapper[4781]: I1202 09:46:32.592690 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95e3389a-92cc-4831-8858-5da7772f2959-scripts\") pod \"ceilometer-0\" (UID: \"95e3389a-92cc-4831-8858-5da7772f2959\") " pod="openstack/ceilometer-0" Dec 02 09:46:32 crc kubenswrapper[4781]: I1202 09:46:32.592934 4781 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95e3389a-92cc-4831-8858-5da7772f2959-config-data\") pod \"ceilometer-0\" (UID: \"95e3389a-92cc-4831-8858-5da7772f2959\") " pod="openstack/ceilometer-0" Dec 02 09:46:32 crc kubenswrapper[4781]: I1202 09:46:32.593330 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95e3389a-92cc-4831-8858-5da7772f2959-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"95e3389a-92cc-4831-8858-5da7772f2959\") " pod="openstack/ceilometer-0" Dec 02 09:46:32 crc kubenswrapper[4781]: I1202 09:46:32.607475 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7vbl\" (UniqueName: \"kubernetes.io/projected/95e3389a-92cc-4831-8858-5da7772f2959-kube-api-access-f7vbl\") pod \"ceilometer-0\" (UID: \"95e3389a-92cc-4831-8858-5da7772f2959\") " pod="openstack/ceilometer-0" Dec 02 09:46:32 crc kubenswrapper[4781]: I1202 09:46:32.658375 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 09:46:32 crc kubenswrapper[4781]: I1202 09:46:32.989108 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zh57t" podUID="5a492eac-186c-4c73-9f70-59f989ea3169" containerName="registry-server" containerID="cri-o://f0f82bf597683b60ccb48e90f5d5ce15b005ad488a72ad27ec9dd0545da1c60c" gracePeriod=2 Dec 02 09:46:33 crc kubenswrapper[4781]: I1202 09:46:33.088813 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 09:46:33 crc kubenswrapper[4781]: I1202 09:46:33.513223 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cde75d3-6c25-47b6-85fb-d7fac5c2c733" path="/var/lib/kubelet/pods/9cde75d3-6c25-47b6-85fb-d7fac5c2c733/volumes" Dec 02 09:46:34 crc kubenswrapper[4781]: I1202 09:46:33.999736 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95e3389a-92cc-4831-8858-5da7772f2959","Type":"ContainerStarted","Data":"d8a303e40797780ca6ae9d10b7849bd11a54f91c210bb07d679cbfcf7ebc5882"} Dec 02 09:46:34 crc kubenswrapper[4781]: I1202 09:46:34.003253 4781 generic.go:334] "Generic (PLEG): container finished" podID="5a492eac-186c-4c73-9f70-59f989ea3169" containerID="f0f82bf597683b60ccb48e90f5d5ce15b005ad488a72ad27ec9dd0545da1c60c" exitCode=0 Dec 02 09:46:34 crc kubenswrapper[4781]: I1202 09:46:34.003288 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zh57t" event={"ID":"5a492eac-186c-4c73-9f70-59f989ea3169","Type":"ContainerDied","Data":"f0f82bf597683b60ccb48e90f5d5ce15b005ad488a72ad27ec9dd0545da1c60c"} Dec 02 09:46:34 crc kubenswrapper[4781]: I1202 09:46:34.885558 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zh57t" Dec 02 09:46:35 crc kubenswrapper[4781]: I1202 09:46:35.012712 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95e3389a-92cc-4831-8858-5da7772f2959","Type":"ContainerStarted","Data":"cb4a1519e147fad8ebc667748bedaf5f2d5c96a80748c234dd3df2f8b0df1cb0"} Dec 02 09:46:35 crc kubenswrapper[4781]: I1202 09:46:35.014579 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zh57t" event={"ID":"5a492eac-186c-4c73-9f70-59f989ea3169","Type":"ContainerDied","Data":"a71347466b2b6547c76ea2ca5398b465bfabcfea7d5c39a988a00fcbfeb31c46"} Dec 02 09:46:35 crc kubenswrapper[4781]: I1202 09:46:35.014615 4781 scope.go:117] "RemoveContainer" containerID="f0f82bf597683b60ccb48e90f5d5ce15b005ad488a72ad27ec9dd0545da1c60c" Dec 02 09:46:35 crc kubenswrapper[4781]: I1202 09:46:35.014656 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zh57t" Dec 02 09:46:35 crc kubenswrapper[4781]: I1202 09:46:35.035229 4781 scope.go:117] "RemoveContainer" containerID="e5a83d8263639d2c65728dc121187c09b8e961fbd4c07829ea9aa572c9f49a62" Dec 02 09:46:35 crc kubenswrapper[4781]: I1202 09:46:35.055645 4781 scope.go:117] "RemoveContainer" containerID="3ed58ad9d1a447eac383cdca6a5079aae04ebfc3236873e91bccc56237476d07" Dec 02 09:46:35 crc kubenswrapper[4781]: I1202 09:46:35.072183 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a492eac-186c-4c73-9f70-59f989ea3169-catalog-content\") pod \"5a492eac-186c-4c73-9f70-59f989ea3169\" (UID: \"5a492eac-186c-4c73-9f70-59f989ea3169\") " Dec 02 09:46:35 crc kubenswrapper[4781]: I1202 09:46:35.072335 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a492eac-186c-4c73-9f70-59f989ea3169-utilities\") pod \"5a492eac-186c-4c73-9f70-59f989ea3169\" (UID: \"5a492eac-186c-4c73-9f70-59f989ea3169\") " Dec 02 09:46:35 crc kubenswrapper[4781]: I1202 09:46:35.072447 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6592\" (UniqueName: \"kubernetes.io/projected/5a492eac-186c-4c73-9f70-59f989ea3169-kube-api-access-c6592\") pod \"5a492eac-186c-4c73-9f70-59f989ea3169\" (UID: \"5a492eac-186c-4c73-9f70-59f989ea3169\") " Dec 02 09:46:35 crc kubenswrapper[4781]: I1202 09:46:35.072908 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a492eac-186c-4c73-9f70-59f989ea3169-utilities" (OuterVolumeSpecName: "utilities") pod "5a492eac-186c-4c73-9f70-59f989ea3169" (UID: "5a492eac-186c-4c73-9f70-59f989ea3169"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:46:35 crc kubenswrapper[4781]: I1202 09:46:35.073231 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a492eac-186c-4c73-9f70-59f989ea3169-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:35 crc kubenswrapper[4781]: I1202 09:46:35.081152 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a492eac-186c-4c73-9f70-59f989ea3169-kube-api-access-c6592" (OuterVolumeSpecName: "kube-api-access-c6592") pod "5a492eac-186c-4c73-9f70-59f989ea3169" (UID: "5a492eac-186c-4c73-9f70-59f989ea3169"). 
InnerVolumeSpecName "kube-api-access-c6592". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:46:35 crc kubenswrapper[4781]: I1202 09:46:35.174803 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6592\" (UniqueName: \"kubernetes.io/projected/5a492eac-186c-4c73-9f70-59f989ea3169-kube-api-access-c6592\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:35 crc kubenswrapper[4781]: I1202 09:46:35.176624 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a492eac-186c-4c73-9f70-59f989ea3169-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a492eac-186c-4c73-9f70-59f989ea3169" (UID: "5a492eac-186c-4c73-9f70-59f989ea3169"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:46:35 crc kubenswrapper[4781]: I1202 09:46:35.276767 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a492eac-186c-4c73-9f70-59f989ea3169-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:35 crc kubenswrapper[4781]: I1202 09:46:35.354782 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zh57t"] Dec 02 09:46:35 crc kubenswrapper[4781]: I1202 09:46:35.365428 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zh57t"] Dec 02 09:46:35 crc kubenswrapper[4781]: I1202 09:46:35.509772 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a492eac-186c-4c73-9f70-59f989ea3169" path="/var/lib/kubelet/pods/5a492eac-186c-4c73-9f70-59f989ea3169/volumes" Dec 02 09:46:40 crc kubenswrapper[4781]: I1202 09:46:40.065945 4781 generic.go:334] "Generic (PLEG): container finished" podID="1b2440b7-d6a5-426e-ac38-2f5fd531935b" containerID="82211f6afee1c1af66fd6f8198a3ea506b21910b79cc1625c9807ff23dac2d98" exitCode=0 Dec 02 09:46:40 crc kubenswrapper[4781]: I1202 09:46:40.066032 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77d7d4d6c6-nc7g8" event={"ID":"1b2440b7-d6a5-426e-ac38-2f5fd531935b","Type":"ContainerDied","Data":"82211f6afee1c1af66fd6f8198a3ea506b21910b79cc1625c9807ff23dac2d98"} Dec 02 09:46:40 crc kubenswrapper[4781]: I1202 09:46:40.109046 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 09:46:41 crc kubenswrapper[4781]: I1202 09:46:41.617367 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-77d7d4d6c6-nc7g8" Dec 02 09:46:41 crc kubenswrapper[4781]: I1202 09:46:41.793646 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b2440b7-d6a5-426e-ac38-2f5fd531935b-ovndb-tls-certs\") pod \"1b2440b7-d6a5-426e-ac38-2f5fd531935b\" (UID: \"1b2440b7-d6a5-426e-ac38-2f5fd531935b\") " Dec 02 09:46:41 crc kubenswrapper[4781]: I1202 09:46:41.794984 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b2440b7-d6a5-426e-ac38-2f5fd531935b-combined-ca-bundle\") pod \"1b2440b7-d6a5-426e-ac38-2f5fd531935b\" (UID: \"1b2440b7-d6a5-426e-ac38-2f5fd531935b\") " Dec 02 09:46:41 crc kubenswrapper[4781]: I1202 09:46:41.795085 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pp9f2\" (UniqueName: \"kubernetes.io/projected/1b2440b7-d6a5-426e-ac38-2f5fd531935b-kube-api-access-pp9f2\") pod \"1b2440b7-d6a5-426e-ac38-2f5fd531935b\" (UID: \"1b2440b7-d6a5-426e-ac38-2f5fd531935b\") " Dec 02 09:46:41 crc kubenswrapper[4781]: I1202 09:46:41.795246 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1b2440b7-d6a5-426e-ac38-2f5fd531935b-httpd-config\") pod \"1b2440b7-d6a5-426e-ac38-2f5fd531935b\" (UID: \"1b2440b7-d6a5-426e-ac38-2f5fd531935b\") " Dec 02 09:46:41 crc kubenswrapper[4781]: I1202 09:46:41.795681 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1b2440b7-d6a5-426e-ac38-2f5fd531935b-config\") pod \"1b2440b7-d6a5-426e-ac38-2f5fd531935b\" (UID: \"1b2440b7-d6a5-426e-ac38-2f5fd531935b\") " Dec 02 09:46:41 crc kubenswrapper[4781]: I1202 09:46:41.802182 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b2440b7-d6a5-426e-ac38-2f5fd531935b-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "1b2440b7-d6a5-426e-ac38-2f5fd531935b" (UID: "1b2440b7-d6a5-426e-ac38-2f5fd531935b"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:46:41 crc kubenswrapper[4781]: I1202 09:46:41.802284 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b2440b7-d6a5-426e-ac38-2f5fd531935b-kube-api-access-pp9f2" (OuterVolumeSpecName: "kube-api-access-pp9f2") pod "1b2440b7-d6a5-426e-ac38-2f5fd531935b" (UID: "1b2440b7-d6a5-426e-ac38-2f5fd531935b"). InnerVolumeSpecName "kube-api-access-pp9f2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:46:41 crc kubenswrapper[4781]: I1202 09:46:41.855494 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b2440b7-d6a5-426e-ac38-2f5fd531935b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b2440b7-d6a5-426e-ac38-2f5fd531935b" (UID: "1b2440b7-d6a5-426e-ac38-2f5fd531935b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:46:41 crc kubenswrapper[4781]: I1202 09:46:41.856141 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b2440b7-d6a5-426e-ac38-2f5fd531935b-config" (OuterVolumeSpecName: "config") pod "1b2440b7-d6a5-426e-ac38-2f5fd531935b" (UID: "1b2440b7-d6a5-426e-ac38-2f5fd531935b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:46:41 crc kubenswrapper[4781]: I1202 09:46:41.885610 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b2440b7-d6a5-426e-ac38-2f5fd531935b-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "1b2440b7-d6a5-426e-ac38-2f5fd531935b" (UID: "1b2440b7-d6a5-426e-ac38-2f5fd531935b"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:46:41 crc kubenswrapper[4781]: I1202 09:46:41.900084 4781 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b2440b7-d6a5-426e-ac38-2f5fd531935b-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:41 crc kubenswrapper[4781]: I1202 09:46:41.900117 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b2440b7-d6a5-426e-ac38-2f5fd531935b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:41 crc kubenswrapper[4781]: I1202 09:46:41.900127 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pp9f2\" (UniqueName: \"kubernetes.io/projected/1b2440b7-d6a5-426e-ac38-2f5fd531935b-kube-api-access-pp9f2\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:41 crc kubenswrapper[4781]: I1202 09:46:41.900140 4781 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1b2440b7-d6a5-426e-ac38-2f5fd531935b-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:41 crc kubenswrapper[4781]: I1202 09:46:41.900148 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/1b2440b7-d6a5-426e-ac38-2f5fd531935b-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:42 crc kubenswrapper[4781]: I1202 09:46:42.084594 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77d7d4d6c6-nc7g8" event={"ID":"1b2440b7-d6a5-426e-ac38-2f5fd531935b","Type":"ContainerDied","Data":"16ec681a6edfab8763b60f5c2859f4f379b359aa19021a312b58e8a9e5938c41"} Dec 02 09:46:42 crc kubenswrapper[4781]: I1202 09:46:42.084668 4781 scope.go:117] "RemoveContainer" containerID="9053393ea9f95fe2fac898b4da31f97b12804d5409b12558ee2677ee81437aec" Dec 02 09:46:42 crc kubenswrapper[4781]: I1202 09:46:42.084714 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-77d7d4d6c6-nc7g8" Dec 02 09:46:42 crc kubenswrapper[4781]: I1202 09:46:42.101435 4781 scope.go:117] "RemoveContainer" containerID="82211f6afee1c1af66fd6f8198a3ea506b21910b79cc1625c9807ff23dac2d98" Dec 02 09:46:42 crc kubenswrapper[4781]: I1202 09:46:42.132069 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-77d7d4d6c6-nc7g8"] Dec 02 09:46:42 crc kubenswrapper[4781]: I1202 09:46:42.142628 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-77d7d4d6c6-nc7g8"] Dec 02 09:46:43 crc kubenswrapper[4781]: I1202 09:46:43.098696 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95e3389a-92cc-4831-8858-5da7772f2959","Type":"ContainerStarted","Data":"5bda05124b967e04f9fcccbe72d9368877102a78eac02f7b39cfbe069a8ea529"} Dec 02 09:46:43 crc kubenswrapper[4781]: I1202 09:46:43.514895 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b2440b7-d6a5-426e-ac38-2f5fd531935b" path="/var/lib/kubelet/pods/1b2440b7-d6a5-426e-ac38-2f5fd531935b/volumes" Dec 02 09:46:44 crc kubenswrapper[4781]: I1202 09:46:44.115457 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95e3389a-92cc-4831-8858-5da7772f2959","Type":"ContainerStarted","Data":"f1dcc75cdeca2c04eac4b0ebfaea67b8a67f915db3660e9108a251ffb5310a63"} Dec 02 09:46:46 crc kubenswrapper[4781]: I1202 09:46:46.135246 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95e3389a-92cc-4831-8858-5da7772f2959","Type":"ContainerStarted","Data":"f564d268a0d5280d6852e4c1f9b3ac4799e23fbf8f2f28109237ae70a4d58b33"} Dec 02 09:46:46 crc kubenswrapper[4781]: I1202 09:46:46.135599 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="95e3389a-92cc-4831-8858-5da7772f2959" containerName="ceilometer-central-agent" containerID="cri-o://cb4a1519e147fad8ebc667748bedaf5f2d5c96a80748c234dd3df2f8b0df1cb0" gracePeriod=30 Dec 02 09:46:46 crc kubenswrapper[4781]: I1202 09:46:46.135680 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 09:46:46 crc kubenswrapper[4781]: I1202 09:46:46.135785 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="95e3389a-92cc-4831-8858-5da7772f2959" containerName="proxy-httpd" containerID="cri-o://f564d268a0d5280d6852e4c1f9b3ac4799e23fbf8f2f28109237ae70a4d58b33" gracePeriod=30 Dec 02 09:46:46 crc kubenswrapper[4781]: I1202 09:46:46.135831 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="95e3389a-92cc-4831-8858-5da7772f2959" containerName="sg-core" containerID="cri-o://f1dcc75cdeca2c04eac4b0ebfaea67b8a67f915db3660e9108a251ffb5310a63" gracePeriod=30 Dec 02 09:46:46 crc kubenswrapper[4781]: I1202 09:46:46.135863 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="95e3389a-92cc-4831-8858-5da7772f2959" containerName="ceilometer-notification-agent" containerID="cri-o://5bda05124b967e04f9fcccbe72d9368877102a78eac02f7b39cfbe069a8ea529" gracePeriod=30 Dec 02 09:46:46 crc kubenswrapper[4781]: I1202 09:46:46.163387 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.815598681 podStartE2EDuration="14.163363944s" podCreationTimestamp="2025-12-02 09:46:32 +0000 
UTC" firstStartedPulling="2025-12-02 09:46:33.093477583 +0000 UTC m=+1555.917351462" lastFinishedPulling="2025-12-02 09:46:45.441242846 +0000 UTC m=+1568.265116725" observedRunningTime="2025-12-02 09:46:46.15910249 +0000 UTC m=+1568.982976369" watchObservedRunningTime="2025-12-02 09:46:46.163363944 +0000 UTC m=+1568.987237823" Dec 02 09:46:47 crc kubenswrapper[4781]: I1202 09:46:47.147473 4781 generic.go:334] "Generic (PLEG): container finished" podID="95e3389a-92cc-4831-8858-5da7772f2959" containerID="f564d268a0d5280d6852e4c1f9b3ac4799e23fbf8f2f28109237ae70a4d58b33" exitCode=0 Dec 02 09:46:47 crc kubenswrapper[4781]: I1202 09:46:47.147811 4781 generic.go:334] "Generic (PLEG): container finished" podID="95e3389a-92cc-4831-8858-5da7772f2959" containerID="f1dcc75cdeca2c04eac4b0ebfaea67b8a67f915db3660e9108a251ffb5310a63" exitCode=2 Dec 02 09:46:47 crc kubenswrapper[4781]: I1202 09:46:47.147824 4781 generic.go:334] "Generic (PLEG): container finished" podID="95e3389a-92cc-4831-8858-5da7772f2959" containerID="5bda05124b967e04f9fcccbe72d9368877102a78eac02f7b39cfbe069a8ea529" exitCode=0 Dec 02 09:46:47 crc kubenswrapper[4781]: I1202 09:46:47.147543 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95e3389a-92cc-4831-8858-5da7772f2959","Type":"ContainerDied","Data":"f564d268a0d5280d6852e4c1f9b3ac4799e23fbf8f2f28109237ae70a4d58b33"} Dec 02 09:46:47 crc kubenswrapper[4781]: I1202 09:46:47.147859 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95e3389a-92cc-4831-8858-5da7772f2959","Type":"ContainerDied","Data":"f1dcc75cdeca2c04eac4b0ebfaea67b8a67f915db3660e9108a251ffb5310a63"} Dec 02 09:46:47 crc kubenswrapper[4781]: I1202 09:46:47.147875 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95e3389a-92cc-4831-8858-5da7772f2959","Type":"ContainerDied","Data":"5bda05124b967e04f9fcccbe72d9368877102a78eac02f7b39cfbe069a8ea529"} Dec 02 09:46:53 crc kubenswrapper[4781]: I1202 09:46:53.210008 4781 generic.go:334] "Generic (PLEG): container finished" podID="95e3389a-92cc-4831-8858-5da7772f2959" containerID="cb4a1519e147fad8ebc667748bedaf5f2d5c96a80748c234dd3df2f8b0df1cb0" exitCode=0 Dec 02 09:46:53 crc kubenswrapper[4781]: I1202 09:46:53.210207 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95e3389a-92cc-4831-8858-5da7772f2959","Type":"ContainerDied","Data":"cb4a1519e147fad8ebc667748bedaf5f2d5c96a80748c234dd3df2f8b0df1cb0"} Dec 02 09:46:53 crc kubenswrapper[4781]: I1202 09:46:53.388040 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 09:46:53 crc kubenswrapper[4781]: I1202 09:46:53.507976 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95e3389a-92cc-4831-8858-5da7772f2959-run-httpd\") pod \"95e3389a-92cc-4831-8858-5da7772f2959\" (UID: \"95e3389a-92cc-4831-8858-5da7772f2959\") " Dec 02 09:46:53 crc kubenswrapper[4781]: I1202 09:46:53.508081 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95e3389a-92cc-4831-8858-5da7772f2959-scripts\") pod \"95e3389a-92cc-4831-8858-5da7772f2959\" (UID: \"95e3389a-92cc-4831-8858-5da7772f2959\") " Dec 02 09:46:53 crc kubenswrapper[4781]: I1202 09:46:53.508170 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95e3389a-92cc-4831-8858-5da7772f2959-config-data\") pod \"95e3389a-92cc-4831-8858-5da7772f2959\" (UID: \"95e3389a-92cc-4831-8858-5da7772f2959\") " Dec 02 09:46:53 crc kubenswrapper[4781]: I1202 09:46:53.508201 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7vbl\" (UniqueName: \"kubernetes.io/projected/95e3389a-92cc-4831-8858-5da7772f2959-kube-api-access-f7vbl\") pod \"95e3389a-92cc-4831-8858-5da7772f2959\" (UID: \"95e3389a-92cc-4831-8858-5da7772f2959\") " Dec 02 09:46:53 crc kubenswrapper[4781]: I1202 09:46:53.508228 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95e3389a-92cc-4831-8858-5da7772f2959-sg-core-conf-yaml\") pod \"95e3389a-92cc-4831-8858-5da7772f2959\" (UID: \"95e3389a-92cc-4831-8858-5da7772f2959\") " Dec 02 09:46:53 crc kubenswrapper[4781]: I1202 09:46:53.508267 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95e3389a-92cc-4831-8858-5da7772f2959-combined-ca-bundle\") pod \"95e3389a-92cc-4831-8858-5da7772f2959\" (UID: \"95e3389a-92cc-4831-8858-5da7772f2959\") " Dec 02 09:46:53 crc kubenswrapper[4781]: I1202 09:46:53.508365 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95e3389a-92cc-4831-8858-5da7772f2959-log-httpd\") pod \"95e3389a-92cc-4831-8858-5da7772f2959\" (UID: \"95e3389a-92cc-4831-8858-5da7772f2959\") " Dec 02 09:46:53 crc kubenswrapper[4781]: I1202 09:46:53.510731 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95e3389a-92cc-4831-8858-5da7772f2959-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "95e3389a-92cc-4831-8858-5da7772f2959" (UID: "95e3389a-92cc-4831-8858-5da7772f2959"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:46:53 crc kubenswrapper[4781]: I1202 09:46:53.511848 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95e3389a-92cc-4831-8858-5da7772f2959-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "95e3389a-92cc-4831-8858-5da7772f2959" (UID: "95e3389a-92cc-4831-8858-5da7772f2959"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:46:53 crc kubenswrapper[4781]: I1202 09:46:53.516462 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95e3389a-92cc-4831-8858-5da7772f2959-scripts" (OuterVolumeSpecName: "scripts") pod "95e3389a-92cc-4831-8858-5da7772f2959" (UID: "95e3389a-92cc-4831-8858-5da7772f2959"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:46:53 crc kubenswrapper[4781]: I1202 09:46:53.516530 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95e3389a-92cc-4831-8858-5da7772f2959-kube-api-access-f7vbl" (OuterVolumeSpecName: "kube-api-access-f7vbl") pod "95e3389a-92cc-4831-8858-5da7772f2959" (UID: "95e3389a-92cc-4831-8858-5da7772f2959"). InnerVolumeSpecName "kube-api-access-f7vbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:46:53 crc kubenswrapper[4781]: I1202 09:46:53.548099 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95e3389a-92cc-4831-8858-5da7772f2959-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "95e3389a-92cc-4831-8858-5da7772f2959" (UID: "95e3389a-92cc-4831-8858-5da7772f2959"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:46:53 crc kubenswrapper[4781]: I1202 09:46:53.590023 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95e3389a-92cc-4831-8858-5da7772f2959-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95e3389a-92cc-4831-8858-5da7772f2959" (UID: "95e3389a-92cc-4831-8858-5da7772f2959"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:46:53 crc kubenswrapper[4781]: I1202 09:46:53.611544 4781 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95e3389a-92cc-4831-8858-5da7772f2959-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:53 crc kubenswrapper[4781]: I1202 09:46:53.611588 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95e3389a-92cc-4831-8858-5da7772f2959-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:53 crc kubenswrapper[4781]: I1202 09:46:53.611602 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7vbl\" (UniqueName: \"kubernetes.io/projected/95e3389a-92cc-4831-8858-5da7772f2959-kube-api-access-f7vbl\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:53 crc kubenswrapper[4781]: I1202 09:46:53.611616 4781 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95e3389a-92cc-4831-8858-5da7772f2959-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:53 crc kubenswrapper[4781]: I1202 09:46:53.611628 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95e3389a-92cc-4831-8858-5da7772f2959-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:53 crc kubenswrapper[4781]: I1202 09:46:53.611638 4781 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95e3389a-92cc-4831-8858-5da7772f2959-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:53 crc kubenswrapper[4781]: I1202 09:46:53.622833 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/95e3389a-92cc-4831-8858-5da7772f2959-config-data" (OuterVolumeSpecName: "config-data") pod "95e3389a-92cc-4831-8858-5da7772f2959" (UID: "95e3389a-92cc-4831-8858-5da7772f2959"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:46:53 crc kubenswrapper[4781]: I1202 09:46:53.713670 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95e3389a-92cc-4831-8858-5da7772f2959-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.220257 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95e3389a-92cc-4831-8858-5da7772f2959","Type":"ContainerDied","Data":"d8a303e40797780ca6ae9d10b7849bd11a54f91c210bb07d679cbfcf7ebc5882"} Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.220599 4781 scope.go:117] "RemoveContainer" containerID="f564d268a0d5280d6852e4c1f9b3ac4799e23fbf8f2f28109237ae70a4d58b33" Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.220307 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.249973 4781 scope.go:117] "RemoveContainer" containerID="f1dcc75cdeca2c04eac4b0ebfaea67b8a67f915db3660e9108a251ffb5310a63" Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.268068 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.275767 4781 scope.go:117] "RemoveContainer" containerID="5bda05124b967e04f9fcccbe72d9368877102a78eac02f7b39cfbe069a8ea529" Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.279786 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.292549 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 09:46:54 crc kubenswrapper[4781]: E1202 09:46:54.293127 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95e3389a-92cc-4831-8858-5da7772f2959" containerName="ceilometer-notification-agent" Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.293229 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="95e3389a-92cc-4831-8858-5da7772f2959" containerName="ceilometer-notification-agent" Dec 02 09:46:54 crc kubenswrapper[4781]: E1202 09:46:54.293290 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a492eac-186c-4c73-9f70-59f989ea3169" containerName="registry-server" Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.293350 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a492eac-186c-4c73-9f70-59f989ea3169" containerName="registry-server" Dec 02 09:46:54 crc kubenswrapper[4781]: E1202 09:46:54.293413 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b2440b7-d6a5-426e-ac38-2f5fd531935b" containerName="neutron-api" Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.293469 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b2440b7-d6a5-426e-ac38-2f5fd531935b" containerName="neutron-api" Dec 02 09:46:54 crc kubenswrapper[4781]: E1202 09:46:54.293545 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a492eac-186c-4c73-9f70-59f989ea3169" containerName="extract-utilities" Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.293596 4781 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5a492eac-186c-4c73-9f70-59f989ea3169" containerName="extract-utilities" Dec 02 09:46:54 crc kubenswrapper[4781]: E1202 09:46:54.293664 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95e3389a-92cc-4831-8858-5da7772f2959" containerName="ceilometer-central-agent" Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.293715 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="95e3389a-92cc-4831-8858-5da7772f2959" containerName="ceilometer-central-agent" Dec 02 09:46:54 crc kubenswrapper[4781]: E1202 09:46:54.293775 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95e3389a-92cc-4831-8858-5da7772f2959" containerName="proxy-httpd" Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.293830 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="95e3389a-92cc-4831-8858-5da7772f2959" containerName="proxy-httpd" Dec 02 09:46:54 crc kubenswrapper[4781]: E1202 09:46:54.293903 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95e3389a-92cc-4831-8858-5da7772f2959" containerName="sg-core" Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.293973 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="95e3389a-92cc-4831-8858-5da7772f2959" containerName="sg-core" Dec 02 09:46:54 crc kubenswrapper[4781]: E1202 09:46:54.294039 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a492eac-186c-4c73-9f70-59f989ea3169" containerName="extract-content" Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.294096 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a492eac-186c-4c73-9f70-59f989ea3169" containerName="extract-content" Dec 02 09:46:54 crc kubenswrapper[4781]: E1202 09:46:54.294151 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b2440b7-d6a5-426e-ac38-2f5fd531935b" containerName="neutron-httpd" Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.294200 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b2440b7-d6a5-426e-ac38-2f5fd531935b" containerName="neutron-httpd" Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.294415 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="95e3389a-92cc-4831-8858-5da7772f2959" containerName="proxy-httpd" Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.294493 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="95e3389a-92cc-4831-8858-5da7772f2959" containerName="sg-core" Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.294555 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b2440b7-d6a5-426e-ac38-2f5fd531935b" containerName="neutron-api" Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.294617 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="95e3389a-92cc-4831-8858-5da7772f2959" containerName="ceilometer-central-agent" Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.294672 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b2440b7-d6a5-426e-ac38-2f5fd531935b" containerName="neutron-httpd" Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.294732 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a492eac-186c-4c73-9f70-59f989ea3169" containerName="registry-server" Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.294792 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="95e3389a-92cc-4831-8858-5da7772f2959" containerName="ceilometer-notification-agent" Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.296802 4781 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.299919 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.300230 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.305875 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.322063 4781 scope.go:117] "RemoveContainer" containerID="cb4a1519e147fad8ebc667748bedaf5f2d5c96a80748c234dd3df2f8b0df1cb0" Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.323489 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d5c7898c-1f5a-4f97-9fcd-21991492c115-run-httpd\") pod \"ceilometer-0\" (UID: \"d5c7898c-1f5a-4f97-9fcd-21991492c115\") " pod="openstack/ceilometer-0" Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.323698 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5c7898c-1f5a-4f97-9fcd-21991492c115-config-data\") pod \"ceilometer-0\" (UID: \"d5c7898c-1f5a-4f97-9fcd-21991492c115\") " pod="openstack/ceilometer-0" Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.323721 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d5c7898c-1f5a-4f97-9fcd-21991492c115-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d5c7898c-1f5a-4f97-9fcd-21991492c115\") " pod="openstack/ceilometer-0" Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.323783 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g9w2\" (UniqueName: \"kubernetes.io/projected/d5c7898c-1f5a-4f97-9fcd-21991492c115-kube-api-access-2g9w2\") pod \"ceilometer-0\" (UID: \"d5c7898c-1f5a-4f97-9fcd-21991492c115\") " pod="openstack/ceilometer-0" Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.323800 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5c7898c-1f5a-4f97-9fcd-21991492c115-scripts\") pod \"ceilometer-0\" (UID: \"d5c7898c-1f5a-4f97-9fcd-21991492c115\") " pod="openstack/ceilometer-0" Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.323971 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d5c7898c-1f5a-4f97-9fcd-21991492c115-log-httpd\") pod \"ceilometer-0\" (UID: \"d5c7898c-1f5a-4f97-9fcd-21991492c115\") " pod="openstack/ceilometer-0" Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.324063 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5c7898c-1f5a-4f97-9fcd-21991492c115-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d5c7898c-1f5a-4f97-9fcd-21991492c115\") " pod="openstack/ceilometer-0" Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.425958 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2g9w2\" (UniqueName: 
\"kubernetes.io/projected/d5c7898c-1f5a-4f97-9fcd-21991492c115-kube-api-access-2g9w2\") pod \"ceilometer-0\" (UID: \"d5c7898c-1f5a-4f97-9fcd-21991492c115\") " pod="openstack/ceilometer-0" Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.426008 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5c7898c-1f5a-4f97-9fcd-21991492c115-scripts\") pod \"ceilometer-0\" (UID: \"d5c7898c-1f5a-4f97-9fcd-21991492c115\") " pod="openstack/ceilometer-0" Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.426051 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d5c7898c-1f5a-4f97-9fcd-21991492c115-log-httpd\") pod \"ceilometer-0\" (UID: \"d5c7898c-1f5a-4f97-9fcd-21991492c115\") " pod="openstack/ceilometer-0" Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.426096 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5c7898c-1f5a-4f97-9fcd-21991492c115-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d5c7898c-1f5a-4f97-9fcd-21991492c115\") " pod="openstack/ceilometer-0" Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.426132 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d5c7898c-1f5a-4f97-9fcd-21991492c115-run-httpd\") pod \"ceilometer-0\" (UID: \"d5c7898c-1f5a-4f97-9fcd-21991492c115\") " pod="openstack/ceilometer-0" Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.426202 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5c7898c-1f5a-4f97-9fcd-21991492c115-config-data\") pod \"ceilometer-0\" (UID: \"d5c7898c-1f5a-4f97-9fcd-21991492c115\") " pod="openstack/ceilometer-0" Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.426218 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d5c7898c-1f5a-4f97-9fcd-21991492c115-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d5c7898c-1f5a-4f97-9fcd-21991492c115\") " pod="openstack/ceilometer-0" Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.428951 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d5c7898c-1f5a-4f97-9fcd-21991492c115-run-httpd\") pod \"ceilometer-0\" (UID: \"d5c7898c-1f5a-4f97-9fcd-21991492c115\") " pod="openstack/ceilometer-0" Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.428957 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d5c7898c-1f5a-4f97-9fcd-21991492c115-log-httpd\") pod \"ceilometer-0\" (UID: \"d5c7898c-1f5a-4f97-9fcd-21991492c115\") " pod="openstack/ceilometer-0" Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.440527 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d5c7898c-1f5a-4f97-9fcd-21991492c115-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d5c7898c-1f5a-4f97-9fcd-21991492c115\") " pod="openstack/ceilometer-0" Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.440558 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5c7898c-1f5a-4f97-9fcd-21991492c115-scripts\") 
pod \"ceilometer-0\" (UID: \"d5c7898c-1f5a-4f97-9fcd-21991492c115\") " pod="openstack/ceilometer-0" Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.440565 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5c7898c-1f5a-4f97-9fcd-21991492c115-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d5c7898c-1f5a-4f97-9fcd-21991492c115\") " pod="openstack/ceilometer-0" Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.441063 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5c7898c-1f5a-4f97-9fcd-21991492c115-config-data\") pod \"ceilometer-0\" (UID: \"d5c7898c-1f5a-4f97-9fcd-21991492c115\") " pod="openstack/ceilometer-0" Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.443362 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g9w2\" (UniqueName: \"kubernetes.io/projected/d5c7898c-1f5a-4f97-9fcd-21991492c115-kube-api-access-2g9w2\") pod \"ceilometer-0\" (UID: \"d5c7898c-1f5a-4f97-9fcd-21991492c115\") " pod="openstack/ceilometer-0" Dec 02 09:46:54 crc kubenswrapper[4781]: I1202 09:46:54.623706 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 09:46:55 crc kubenswrapper[4781]: I1202 09:46:55.086616 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 09:46:55 crc kubenswrapper[4781]: I1202 09:46:55.229254 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d5c7898c-1f5a-4f97-9fcd-21991492c115","Type":"ContainerStarted","Data":"0d2ca335579b031cbd494da0737f7774bafb0f278f77752a5505b1062bda1454"} Dec 02 09:46:55 crc kubenswrapper[4781]: I1202 09:46:55.511144 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95e3389a-92cc-4831-8858-5da7772f2959" path="/var/lib/kubelet/pods/95e3389a-92cc-4831-8858-5da7772f2959/volumes" Dec 02 09:46:57 crc kubenswrapper[4781]: I1202 09:46:57.249505 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d5c7898c-1f5a-4f97-9fcd-21991492c115","Type":"ContainerStarted","Data":"5e7147043fd04079f22ffd9cf49997c09c3e59821342c6875532387b8a17e062"} Dec 02 09:46:57 crc kubenswrapper[4781]: I1202 09:46:57.250066 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d5c7898c-1f5a-4f97-9fcd-21991492c115","Type":"ContainerStarted","Data":"d913115dbd4535bb6fe470aacae8599befbf3b3c5f90c67486b7e6b5a64042ad"} Dec 02 09:46:58 crc kubenswrapper[4781]: I1202 09:46:58.260041 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d5c7898c-1f5a-4f97-9fcd-21991492c115","Type":"ContainerStarted","Data":"d7f9f98f9981d817aa9d1b3300dd784c5c9b35b92acff849138ef40a4dea5dc7"} Dec 02 09:47:00 crc kubenswrapper[4781]: I1202 09:47:00.411946 4781 patch_prober.go:28] interesting pod/machine-config-daemon-pzntm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 09:47:00 crc kubenswrapper[4781]: I1202 09:47:00.412008 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 09:47:00 crc kubenswrapper[4781]: I1202 09:47:00.485960 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 09:47:01 crc kubenswrapper[4781]: I1202 09:47:01.288137 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d5c7898c-1f5a-4f97-9fcd-21991492c115","Type":"ContainerStarted","Data":"abf805839251d5d25830ef2daf977d580d7247ba7d4efa67e6ce450c76268386"} Dec 02 09:47:01 crc kubenswrapper[4781]: I1202 09:47:01.288302 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d5c7898c-1f5a-4f97-9fcd-21991492c115" containerName="ceilometer-central-agent" containerID="cri-o://d913115dbd4535bb6fe470aacae8599befbf3b3c5f90c67486b7e6b5a64042ad" gracePeriod=30 Dec 02 09:47:01 crc kubenswrapper[4781]: I1202 09:47:01.288365 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d5c7898c-1f5a-4f97-9fcd-21991492c115" containerName="ceilometer-notification-agent" containerID="cri-o://5e7147043fd04079f22ffd9cf49997c09c3e59821342c6875532387b8a17e062" gracePeriod=30 Dec 02 09:47:01 crc kubenswrapper[4781]: I1202 09:47:01.288342 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d5c7898c-1f5a-4f97-9fcd-21991492c115" containerName="proxy-httpd" containerID="cri-o://abf805839251d5d25830ef2daf977d580d7247ba7d4efa67e6ce450c76268386" gracePeriod=30 Dec 02 09:47:01 crc kubenswrapper[4781]: I1202 09:47:01.288399 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d5c7898c-1f5a-4f97-9fcd-21991492c115" containerName="sg-core" containerID="cri-o://d7f9f98f9981d817aa9d1b3300dd784c5c9b35b92acff849138ef40a4dea5dc7" gracePeriod=30 Dec 02 09:47:01 crc kubenswrapper[4781]: I1202 09:47:01.289129 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 09:47:01 crc kubenswrapper[4781]: I1202 09:47:01.314630 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.03968253 podStartE2EDuration="7.314612355s" podCreationTimestamp="2025-12-02 09:46:54 +0000 UTC" firstStartedPulling="2025-12-02 09:46:55.090117063 +0000 UTC m=+1577.913990952" lastFinishedPulling="2025-12-02 09:47:00.365046898 +0000 UTC m=+1583.188920777" observedRunningTime="2025-12-02 09:47:01.312205542 +0000 UTC m=+1584.136079421" watchObservedRunningTime="2025-12-02 09:47:01.314612355 +0000 UTC m=+1584.138486234" Dec 02 09:47:02 crc kubenswrapper[4781]: I1202 09:47:02.298712 4781 generic.go:334] "Generic (PLEG): container finished" podID="d5c7898c-1f5a-4f97-9fcd-21991492c115" containerID="abf805839251d5d25830ef2daf977d580d7247ba7d4efa67e6ce450c76268386" exitCode=0 Dec 02 09:47:02 crc kubenswrapper[4781]: I1202 09:47:02.299026 4781 generic.go:334] "Generic (PLEG): container finished" podID="d5c7898c-1f5a-4f97-9fcd-21991492c115" containerID="d7f9f98f9981d817aa9d1b3300dd784c5c9b35b92acff849138ef40a4dea5dc7" exitCode=2 Dec 02 09:47:02 crc kubenswrapper[4781]: I1202 09:47:02.299037 4781 generic.go:334] "Generic (PLEG): container finished" podID="d5c7898c-1f5a-4f97-9fcd-21991492c115" containerID="5e7147043fd04079f22ffd9cf49997c09c3e59821342c6875532387b8a17e062" exitCode=0 Dec 02 09:47:02 crc kubenswrapper[4781]: I1202 
09:47:02.298789 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d5c7898c-1f5a-4f97-9fcd-21991492c115","Type":"ContainerDied","Data":"abf805839251d5d25830ef2daf977d580d7247ba7d4efa67e6ce450c76268386"}
Dec 02 09:47:02 crc kubenswrapper[4781]: I1202 09:47:02.299103 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d5c7898c-1f5a-4f97-9fcd-21991492c115","Type":"ContainerDied","Data":"d7f9f98f9981d817aa9d1b3300dd784c5c9b35b92acff849138ef40a4dea5dc7"}
Dec 02 09:47:02 crc kubenswrapper[4781]: I1202 09:47:02.299117 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d5c7898c-1f5a-4f97-9fcd-21991492c115","Type":"ContainerDied","Data":"5e7147043fd04079f22ffd9cf49997c09c3e59821342c6875532387b8a17e062"}
Dec 02 09:47:02 crc kubenswrapper[4781]: I1202 09:47:02.300573 4781 generic.go:334] "Generic (PLEG): container finished" podID="41de14f6-deed-478b-9d75-ad94ab88ee05" containerID="65ce496472d4caa27e5c6be55c9ff97db07abac06c00e5140f86690226b00b55" exitCode=0
Dec 02 09:47:02 crc kubenswrapper[4781]: I1202 09:47:02.300608 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-972b4" event={"ID":"41de14f6-deed-478b-9d75-ad94ab88ee05","Type":"ContainerDied","Data":"65ce496472d4caa27e5c6be55c9ff97db07abac06c00e5140f86690226b00b55"}
Dec 02 09:47:03 crc kubenswrapper[4781]: I1202 09:47:03.660205 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-972b4"
Dec 02 09:47:03 crc kubenswrapper[4781]: I1202 09:47:03.803354 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41de14f6-deed-478b-9d75-ad94ab88ee05-scripts\") pod \"41de14f6-deed-478b-9d75-ad94ab88ee05\" (UID: \"41de14f6-deed-478b-9d75-ad94ab88ee05\") "
Dec 02 09:47:03 crc kubenswrapper[4781]: I1202 09:47:03.803739 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76sqz\" (UniqueName: \"kubernetes.io/projected/41de14f6-deed-478b-9d75-ad94ab88ee05-kube-api-access-76sqz\") pod \"41de14f6-deed-478b-9d75-ad94ab88ee05\" (UID: \"41de14f6-deed-478b-9d75-ad94ab88ee05\") "
Dec 02 09:47:03 crc kubenswrapper[4781]: I1202 09:47:03.803917 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41de14f6-deed-478b-9d75-ad94ab88ee05-config-data\") pod \"41de14f6-deed-478b-9d75-ad94ab88ee05\" (UID: \"41de14f6-deed-478b-9d75-ad94ab88ee05\") "
Dec 02 09:47:03 crc kubenswrapper[4781]: I1202 09:47:03.803988 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41de14f6-deed-478b-9d75-ad94ab88ee05-combined-ca-bundle\") pod \"41de14f6-deed-478b-9d75-ad94ab88ee05\" (UID: \"41de14f6-deed-478b-9d75-ad94ab88ee05\") "
Dec 02 09:47:03 crc kubenswrapper[4781]: I1202 09:47:03.812420 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41de14f6-deed-478b-9d75-ad94ab88ee05-scripts" (OuterVolumeSpecName: "scripts") pod "41de14f6-deed-478b-9d75-ad94ab88ee05" (UID: "41de14f6-deed-478b-9d75-ad94ab88ee05"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 09:47:03 crc kubenswrapper[4781]: I1202 09:47:03.812890 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41de14f6-deed-478b-9d75-ad94ab88ee05-kube-api-access-76sqz" (OuterVolumeSpecName: "kube-api-access-76sqz") pod "41de14f6-deed-478b-9d75-ad94ab88ee05" (UID: "41de14f6-deed-478b-9d75-ad94ab88ee05"). InnerVolumeSpecName "kube-api-access-76sqz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 09:47:03 crc kubenswrapper[4781]: I1202 09:47:03.834572 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41de14f6-deed-478b-9d75-ad94ab88ee05-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "41de14f6-deed-478b-9d75-ad94ab88ee05" (UID: "41de14f6-deed-478b-9d75-ad94ab88ee05"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 09:47:03 crc kubenswrapper[4781]: I1202 09:47:03.838739 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41de14f6-deed-478b-9d75-ad94ab88ee05-config-data" (OuterVolumeSpecName: "config-data") pod "41de14f6-deed-478b-9d75-ad94ab88ee05" (UID: "41de14f6-deed-478b-9d75-ad94ab88ee05"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 09:47:03 crc kubenswrapper[4781]: I1202 09:47:03.906412 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41de14f6-deed-478b-9d75-ad94ab88ee05-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 09:47:03 crc kubenswrapper[4781]: I1202 09:47:03.906447 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41de14f6-deed-478b-9d75-ad94ab88ee05-scripts\") on node \"crc\" DevicePath \"\""
Dec 02 09:47:03 crc kubenswrapper[4781]: I1202 09:47:03.906457 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76sqz\" (UniqueName: \"kubernetes.io/projected/41de14f6-deed-478b-9d75-ad94ab88ee05-kube-api-access-76sqz\") on node \"crc\" DevicePath \"\""
Dec 02 09:47:03 crc kubenswrapper[4781]: I1202 09:47:03.906471 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41de14f6-deed-478b-9d75-ad94ab88ee05-config-data\") on node \"crc\" DevicePath \"\""
Dec 02 09:47:04 crc kubenswrapper[4781]: I1202 09:47:04.336221 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-972b4" event={"ID":"41de14f6-deed-478b-9d75-ad94ab88ee05","Type":"ContainerDied","Data":"d9c49c5f80b52e6242da4338ac28cd670a85d5ddbf4b66ec818f0dd84eb46fd9"}
Dec 02 09:47:04 crc kubenswrapper[4781]: I1202 09:47:04.336263 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9c49c5f80b52e6242da4338ac28cd670a85d5ddbf4b66ec818f0dd84eb46fd9"
Dec 02 09:47:04 crc kubenswrapper[4781]: I1202 09:47:04.336341 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-972b4"
Dec 02 09:47:04 crc kubenswrapper[4781]: I1202 09:47:04.430902 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Dec 02 09:47:04 crc kubenswrapper[4781]: E1202 09:47:04.431390 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41de14f6-deed-478b-9d75-ad94ab88ee05" containerName="nova-cell0-conductor-db-sync"
Dec 02 09:47:04 crc kubenswrapper[4781]: I1202 09:47:04.431418 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="41de14f6-deed-478b-9d75-ad94ab88ee05" containerName="nova-cell0-conductor-db-sync"
Dec 02 09:47:04 crc kubenswrapper[4781]: I1202 09:47:04.431696 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="41de14f6-deed-478b-9d75-ad94ab88ee05" containerName="nova-cell0-conductor-db-sync"
Dec 02 09:47:04 crc kubenswrapper[4781]: I1202 09:47:04.432522 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Dec 02 09:47:04 crc kubenswrapper[4781]: I1202 09:47:04.436102 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-n5wrr"
Dec 02 09:47:04 crc kubenswrapper[4781]: I1202 09:47:04.436462 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Dec 02 09:47:04 crc kubenswrapper[4781]: I1202 09:47:04.440689 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Dec 02 09:47:04 crc kubenswrapper[4781]: I1202 09:47:04.528652 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56b3a068-08e6-4567-b09b-4050ad8f1a65-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"56b3a068-08e6-4567-b09b-4050ad8f1a65\") " pod="openstack/nova-cell0-conductor-0"
Dec 02 09:47:04 crc kubenswrapper[4781]: I1202 09:47:04.528738 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvxz8\" (UniqueName: \"kubernetes.io/projected/56b3a068-08e6-4567-b09b-4050ad8f1a65-kube-api-access-wvxz8\") pod \"nova-cell0-conductor-0\" (UID: \"56b3a068-08e6-4567-b09b-4050ad8f1a65\") " pod="openstack/nova-cell0-conductor-0"
Dec 02 09:47:04 crc kubenswrapper[4781]: I1202 09:47:04.528760 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56b3a068-08e6-4567-b09b-4050ad8f1a65-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"56b3a068-08e6-4567-b09b-4050ad8f1a65\") " pod="openstack/nova-cell0-conductor-0"
Dec 02 09:47:04 crc kubenswrapper[4781]: I1202 09:47:04.630383 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvxz8\" (UniqueName: \"kubernetes.io/projected/56b3a068-08e6-4567-b09b-4050ad8f1a65-kube-api-access-wvxz8\") pod \"nova-cell0-conductor-0\" (UID: \"56b3a068-08e6-4567-b09b-4050ad8f1a65\") " pod="openstack/nova-cell0-conductor-0"
Dec 02 09:47:04 crc kubenswrapper[4781]: I1202 09:47:04.630743 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56b3a068-08e6-4567-b09b-4050ad8f1a65-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"56b3a068-08e6-4567-b09b-4050ad8f1a65\") " pod="openstack/nova-cell0-conductor-0"
Dec 02 09:47:04 crc kubenswrapper[4781]: I1202 09:47:04.631004 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56b3a068-08e6-4567-b09b-4050ad8f1a65-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"56b3a068-08e6-4567-b09b-4050ad8f1a65\") " pod="openstack/nova-cell0-conductor-0"
Dec 02 09:47:04 crc kubenswrapper[4781]: I1202 09:47:04.646855 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56b3a068-08e6-4567-b09b-4050ad8f1a65-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"56b3a068-08e6-4567-b09b-4050ad8f1a65\") " pod="openstack/nova-cell0-conductor-0"
Dec 02 09:47:04 crc kubenswrapper[4781]: I1202 09:47:04.646939 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56b3a068-08e6-4567-b09b-4050ad8f1a65-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"56b3a068-08e6-4567-b09b-4050ad8f1a65\") " pod="openstack/nova-cell0-conductor-0"
Dec 02 09:47:04 crc kubenswrapper[4781]: I1202 09:47:04.652697 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvxz8\" (UniqueName: \"kubernetes.io/projected/56b3a068-08e6-4567-b09b-4050ad8f1a65-kube-api-access-wvxz8\") pod \"nova-cell0-conductor-0\" (UID: \"56b3a068-08e6-4567-b09b-4050ad8f1a65\") " pod="openstack/nova-cell0-conductor-0"
Dec 02 09:47:04 crc kubenswrapper[4781]: I1202 09:47:04.758504 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Dec 02 09:47:05 crc kubenswrapper[4781]: I1202 09:47:05.263722 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Dec 02 09:47:05 crc kubenswrapper[4781]: W1202 09:47:05.272190 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56b3a068_08e6_4567_b09b_4050ad8f1a65.slice/crio-97a67c9b01d46291b8fe0b535a8bd2e47298b69eb631b4f3de9005c42d631668 WatchSource:0}: Error finding container 97a67c9b01d46291b8fe0b535a8bd2e47298b69eb631b4f3de9005c42d631668: Status 404 returned error can't find the container with id 97a67c9b01d46291b8fe0b535a8bd2e47298b69eb631b4f3de9005c42d631668
Dec 02 09:47:05 crc kubenswrapper[4781]: I1202 09:47:05.346530 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"56b3a068-08e6-4567-b09b-4050ad8f1a65","Type":"ContainerStarted","Data":"97a67c9b01d46291b8fe0b535a8bd2e47298b69eb631b4f3de9005c42d631668"}
Dec 02 09:47:06 crc kubenswrapper[4781]: I1202 09:47:06.360105 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Dec 02 09:47:06 crc kubenswrapper[4781]: I1202 09:47:06.360538 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"56b3a068-08e6-4567-b09b-4050ad8f1a65","Type":"ContainerStarted","Data":"c2ed26f0bead9f88f8a3064bb965ebc66db5673ca4cde91ec06b7a3c312cba49"}
Dec 02 09:47:06 crc kubenswrapper[4781]: I1202 09:47:06.365524 4781 generic.go:334] "Generic (PLEG): container finished" podID="d5c7898c-1f5a-4f97-9fcd-21991492c115" containerID="d913115dbd4535bb6fe470aacae8599befbf3b3c5f90c67486b7e6b5a64042ad" exitCode=0
Dec 02 09:47:06 crc kubenswrapper[4781]: I1202 09:47:06.365571 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d5c7898c-1f5a-4f97-9fcd-21991492c115","Type":"ContainerDied","Data":"d913115dbd4535bb6fe470aacae8599befbf3b3c5f90c67486b7e6b5a64042ad"}
Dec 02 09:47:06 crc kubenswrapper[4781]: I1202 09:47:06.542439 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 02 09:47:06 crc kubenswrapper[4781]: I1202 09:47:06.571401 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.571380318 podStartE2EDuration="2.571380318s" podCreationTimestamp="2025-12-02 09:47:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:47:06.381276591 +0000 UTC m=+1589.205150470" watchObservedRunningTime="2025-12-02 09:47:06.571380318 +0000 UTC m=+1589.395254197"
Dec 02 09:47:06 crc kubenswrapper[4781]: I1202 09:47:06.670369 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d5c7898c-1f5a-4f97-9fcd-21991492c115-run-httpd\") pod \"d5c7898c-1f5a-4f97-9fcd-21991492c115\" (UID: \"d5c7898c-1f5a-4f97-9fcd-21991492c115\") "
Dec 02 09:47:06 crc kubenswrapper[4781]: I1202 09:47:06.670456 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d5c7898c-1f5a-4f97-9fcd-21991492c115-sg-core-conf-yaml\") pod \"d5c7898c-1f5a-4f97-9fcd-21991492c115\" (UID: \"d5c7898c-1f5a-4f97-9fcd-21991492c115\") "
Dec 02 09:47:06 crc kubenswrapper[4781]: I1202 09:47:06.670552 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5c7898c-1f5a-4f97-9fcd-21991492c115-config-data\") pod \"d5c7898c-1f5a-4f97-9fcd-21991492c115\" (UID: \"d5c7898c-1f5a-4f97-9fcd-21991492c115\") "
Dec 02 09:47:06 crc kubenswrapper[4781]: I1202 09:47:06.670626 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5c7898c-1f5a-4f97-9fcd-21991492c115-combined-ca-bundle\") pod \"d5c7898c-1f5a-4f97-9fcd-21991492c115\" (UID: \"d5c7898c-1f5a-4f97-9fcd-21991492c115\") "
Dec 02 09:47:06 crc kubenswrapper[4781]: I1202 09:47:06.670688 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2g9w2\" (UniqueName: \"kubernetes.io/projected/d5c7898c-1f5a-4f97-9fcd-21991492c115-kube-api-access-2g9w2\") pod \"d5c7898c-1f5a-4f97-9fcd-21991492c115\" (UID: \"d5c7898c-1f5a-4f97-9fcd-21991492c115\") "
Dec 02 09:47:06 crc kubenswrapper[4781]: I1202 09:47:06.670747 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d5c7898c-1f5a-4f97-9fcd-21991492c115-log-httpd\") pod \"d5c7898c-1f5a-4f97-9fcd-21991492c115\" (UID: \"d5c7898c-1f5a-4f97-9fcd-21991492c115\") "
Dec 02 09:47:06 crc kubenswrapper[4781]: I1202 09:47:06.670782 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5c7898c-1f5a-4f97-9fcd-21991492c115-scripts\") pod \"d5c7898c-1f5a-4f97-9fcd-21991492c115\" (UID: \"d5c7898c-1f5a-4f97-9fcd-21991492c115\") "
Dec 02 09:47:06 crc kubenswrapper[4781]: I1202 09:47:06.672361 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5c7898c-1f5a-4f97-9fcd-21991492c115-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d5c7898c-1f5a-4f97-9fcd-21991492c115" (UID: "d5c7898c-1f5a-4f97-9fcd-21991492c115"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 09:47:06 crc kubenswrapper[4781]: I1202 09:47:06.673821 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5c7898c-1f5a-4f97-9fcd-21991492c115-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d5c7898c-1f5a-4f97-9fcd-21991492c115" (UID: "d5c7898c-1f5a-4f97-9fcd-21991492c115"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 09:47:06 crc kubenswrapper[4781]: I1202 09:47:06.683098 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5c7898c-1f5a-4f97-9fcd-21991492c115-scripts" (OuterVolumeSpecName: "scripts") pod "d5c7898c-1f5a-4f97-9fcd-21991492c115" (UID: "d5c7898c-1f5a-4f97-9fcd-21991492c115"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 09:47:06 crc kubenswrapper[4781]: I1202 09:47:06.683287 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5c7898c-1f5a-4f97-9fcd-21991492c115-kube-api-access-2g9w2" (OuterVolumeSpecName: "kube-api-access-2g9w2") pod "d5c7898c-1f5a-4f97-9fcd-21991492c115" (UID: "d5c7898c-1f5a-4f97-9fcd-21991492c115"). InnerVolumeSpecName "kube-api-access-2g9w2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 09:47:06 crc kubenswrapper[4781]: I1202 09:47:06.708412 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5c7898c-1f5a-4f97-9fcd-21991492c115-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d5c7898c-1f5a-4f97-9fcd-21991492c115" (UID: "d5c7898c-1f5a-4f97-9fcd-21991492c115"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 09:47:06 crc kubenswrapper[4781]: I1202 09:47:06.757837 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5c7898c-1f5a-4f97-9fcd-21991492c115-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d5c7898c-1f5a-4f97-9fcd-21991492c115" (UID: "d5c7898c-1f5a-4f97-9fcd-21991492c115"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 09:47:06 crc kubenswrapper[4781]: I1202 09:47:06.773349 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2g9w2\" (UniqueName: \"kubernetes.io/projected/d5c7898c-1f5a-4f97-9fcd-21991492c115-kube-api-access-2g9w2\") on node \"crc\" DevicePath \"\""
Dec 02 09:47:06 crc kubenswrapper[4781]: I1202 09:47:06.773469 4781 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d5c7898c-1f5a-4f97-9fcd-21991492c115-log-httpd\") on node \"crc\" DevicePath \"\""
Dec 02 09:47:06 crc kubenswrapper[4781]: I1202 09:47:06.773488 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5c7898c-1f5a-4f97-9fcd-21991492c115-scripts\") on node \"crc\" DevicePath \"\""
Dec 02 09:47:06 crc kubenswrapper[4781]: I1202 09:47:06.773500 4781 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d5c7898c-1f5a-4f97-9fcd-21991492c115-run-httpd\") on node \"crc\" DevicePath \"\""
Dec 02 09:47:06 crc kubenswrapper[4781]: I1202 09:47:06.773513 4781 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d5c7898c-1f5a-4f97-9fcd-21991492c115-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Dec 02 09:47:06 crc kubenswrapper[4781]: I1202 09:47:06.773526 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5c7898c-1f5a-4f97-9fcd-21991492c115-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 09:47:06 crc kubenswrapper[4781]: I1202 09:47:06.783011 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5c7898c-1f5a-4f97-9fcd-21991492c115-config-data" (OuterVolumeSpecName: "config-data") pod "d5c7898c-1f5a-4f97-9fcd-21991492c115" (UID: "d5c7898c-1f5a-4f97-9fcd-21991492c115"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 09:47:06 crc kubenswrapper[4781]: I1202 09:47:06.876896 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5c7898c-1f5a-4f97-9fcd-21991492c115-config-data\") on node \"crc\" DevicePath \"\""
Dec 02 09:47:07 crc kubenswrapper[4781]: I1202 09:47:07.376707 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 02 09:47:07 crc kubenswrapper[4781]: I1202 09:47:07.377057 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d5c7898c-1f5a-4f97-9fcd-21991492c115","Type":"ContainerDied","Data":"0d2ca335579b031cbd494da0737f7774bafb0f278f77752a5505b1062bda1454"}
Dec 02 09:47:07 crc kubenswrapper[4781]: I1202 09:47:07.377096 4781 scope.go:117] "RemoveContainer" containerID="abf805839251d5d25830ef2daf977d580d7247ba7d4efa67e6ce450c76268386"
Dec 02 09:47:07 crc kubenswrapper[4781]: I1202 09:47:07.398852 4781 scope.go:117] "RemoveContainer" containerID="d7f9f98f9981d817aa9d1b3300dd784c5c9b35b92acff849138ef40a4dea5dc7"
Dec 02 09:47:07 crc kubenswrapper[4781]: I1202 09:47:07.416423 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 02 09:47:07 crc kubenswrapper[4781]: I1202 09:47:07.448604 4781 scope.go:117] "RemoveContainer" containerID="5e7147043fd04079f22ffd9cf49997c09c3e59821342c6875532387b8a17e062"
Dec 02 09:47:07 crc kubenswrapper[4781]: I1202 09:47:07.473056 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Dec 02 09:47:07 crc kubenswrapper[4781]: I1202 09:47:07.487324 4781 scope.go:117] "RemoveContainer" containerID="d913115dbd4535bb6fe470aacae8599befbf3b3c5f90c67486b7e6b5a64042ad"
Dec 02 09:47:07 crc kubenswrapper[4781]: I1202 09:47:07.488202 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 02 09:47:07 crc kubenswrapper[4781]: E1202 09:47:07.489079 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5c7898c-1f5a-4f97-9fcd-21991492c115" containerName="ceilometer-central-agent"
Dec 02 09:47:07 crc kubenswrapper[4781]: I1202 09:47:07.489228 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5c7898c-1f5a-4f97-9fcd-21991492c115" containerName="ceilometer-central-agent"
Dec 02 09:47:07 crc kubenswrapper[4781]: E1202 09:47:07.489342 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5c7898c-1f5a-4f97-9fcd-21991492c115" containerName="ceilometer-notification-agent"
Dec 02 09:47:07 crc kubenswrapper[4781]: I1202 09:47:07.489409 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5c7898c-1f5a-4f97-9fcd-21991492c115" containerName="ceilometer-notification-agent"
Dec 02 09:47:07 crc kubenswrapper[4781]: E1202 09:47:07.489557 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5c7898c-1f5a-4f97-9fcd-21991492c115" containerName="sg-core"
Dec 02 09:47:07 crc kubenswrapper[4781]: I1202 09:47:07.490085 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5c7898c-1f5a-4f97-9fcd-21991492c115" containerName="sg-core"
Dec 02 09:47:07 crc kubenswrapper[4781]: E1202 09:47:07.490243 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5c7898c-1f5a-4f97-9fcd-21991492c115" containerName="proxy-httpd"
Dec 02 09:47:07 crc kubenswrapper[4781]: I1202 09:47:07.490328 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5c7898c-1f5a-4f97-9fcd-21991492c115" containerName="proxy-httpd"
Dec 02 09:47:07 crc kubenswrapper[4781]: I1202 09:47:07.491509 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5c7898c-1f5a-4f97-9fcd-21991492c115" containerName="ceilometer-notification-agent"
Dec 02 09:47:07 crc kubenswrapper[4781]: I1202 09:47:07.491547 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5c7898c-1f5a-4f97-9fcd-21991492c115" containerName="sg-core"
Dec 02 09:47:07 crc kubenswrapper[4781]: I1202 09:47:07.491575 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5c7898c-1f5a-4f97-9fcd-21991492c115" containerName="ceilometer-central-agent"
Dec 02 09:47:07 crc kubenswrapper[4781]: I1202 09:47:07.491596 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5c7898c-1f5a-4f97-9fcd-21991492c115" containerName="proxy-httpd"
Dec 02 09:47:07 crc kubenswrapper[4781]: I1202 09:47:07.501513 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 02 09:47:07 crc kubenswrapper[4781]: I1202 09:47:07.512162 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 02 09:47:07 crc kubenswrapper[4781]: I1202 09:47:07.512164 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 02 09:47:07 crc kubenswrapper[4781]: I1202 09:47:07.516898 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5c7898c-1f5a-4f97-9fcd-21991492c115" path="/var/lib/kubelet/pods/d5c7898c-1f5a-4f97-9fcd-21991492c115/volumes"
Dec 02 09:47:07 crc kubenswrapper[4781]: I1202 09:47:07.518045 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 02 09:47:07 crc kubenswrapper[4781]: I1202 09:47:07.593731 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj7c9\" (UniqueName: \"kubernetes.io/projected/2f695df8-9072-4528-891a-78e212a41af6-kube-api-access-pj7c9\") pod \"ceilometer-0\" (UID: \"2f695df8-9072-4528-891a-78e212a41af6\") " pod="openstack/ceilometer-0"
Dec 02 09:47:07 crc kubenswrapper[4781]: I1202 09:47:07.593912 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f695df8-9072-4528-891a-78e212a41af6-run-httpd\") pod \"ceilometer-0\" (UID: \"2f695df8-9072-4528-891a-78e212a41af6\") " pod="openstack/ceilometer-0"
Dec 02 09:47:07 crc kubenswrapper[4781]: I1202 09:47:07.593971 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f695df8-9072-4528-891a-78e212a41af6-config-data\") pod \"ceilometer-0\" (UID: \"2f695df8-9072-4528-891a-78e212a41af6\") " pod="openstack/ceilometer-0"
Dec 02 09:47:07 crc kubenswrapper[4781]: I1202 09:47:07.594354 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f695df8-9072-4528-891a-78e212a41af6-log-httpd\") pod \"ceilometer-0\" (UID: \"2f695df8-9072-4528-891a-78e212a41af6\") " pod="openstack/ceilometer-0"
Dec 02 09:47:07 crc kubenswrapper[4781]: I1202 09:47:07.594387 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2f695df8-9072-4528-891a-78e212a41af6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2f695df8-9072-4528-891a-78e212a41af6\") " pod="openstack/ceilometer-0"
Dec 02 09:47:07 crc kubenswrapper[4781]: I1202 09:47:07.594533 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f695df8-9072-4528-891a-78e212a41af6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2f695df8-9072-4528-891a-78e212a41af6\") " pod="openstack/ceilometer-0"
Dec 02 09:47:07 crc kubenswrapper[4781]: I1202 09:47:07.594586 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f695df8-9072-4528-891a-78e212a41af6-scripts\") pod \"ceilometer-0\" (UID: \"2f695df8-9072-4528-891a-78e212a41af6\") " pod="openstack/ceilometer-0"
Dec 02 09:47:07 crc kubenswrapper[4781]: I1202 09:47:07.697214 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f695df8-9072-4528-891a-78e212a41af6-run-httpd\") pod \"ceilometer-0\" (UID: \"2f695df8-9072-4528-891a-78e212a41af6\") " pod="openstack/ceilometer-0"
Dec 02 09:47:07 crc kubenswrapper[4781]: I1202 09:47:07.697281 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f695df8-9072-4528-891a-78e212a41af6-config-data\") pod \"ceilometer-0\" (UID: \"2f695df8-9072-4528-891a-78e212a41af6\") " pod="openstack/ceilometer-0"
Dec 02 09:47:07 crc kubenswrapper[4781]: I1202 09:47:07.697334 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f695df8-9072-4528-891a-78e212a41af6-log-httpd\") pod \"ceilometer-0\" (UID: \"2f695df8-9072-4528-891a-78e212a41af6\") " pod="openstack/ceilometer-0"
Dec 02 09:47:07 crc kubenswrapper[4781]: I1202 09:47:07.697354 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2f695df8-9072-4528-891a-78e212a41af6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2f695df8-9072-4528-891a-78e212a41af6\") " pod="openstack/ceilometer-0"
Dec 02 09:47:07 crc kubenswrapper[4781]: I1202 09:47:07.697431 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f695df8-9072-4528-891a-78e212a41af6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2f695df8-9072-4528-891a-78e212a41af6\") " pod="openstack/ceilometer-0"
Dec 02 09:47:07 crc kubenswrapper[4781]: I1202 09:47:07.697461 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f695df8-9072-4528-891a-78e212a41af6-scripts\") pod \"ceilometer-0\" (UID: \"2f695df8-9072-4528-891a-78e212a41af6\") " pod="openstack/ceilometer-0"
Dec 02 09:47:07 crc kubenswrapper[4781]: I1202 09:47:07.697526 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj7c9\" (UniqueName: \"kubernetes.io/projected/2f695df8-9072-4528-891a-78e212a41af6-kube-api-access-pj7c9\") pod \"ceilometer-0\" (UID: \"2f695df8-9072-4528-891a-78e212a41af6\") " pod="openstack/ceilometer-0"
Dec 02 09:47:07 crc kubenswrapper[4781]: I1202 09:47:07.698375 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f695df8-9072-4528-891a-78e212a41af6-run-httpd\") pod \"ceilometer-0\" (UID: \"2f695df8-9072-4528-891a-78e212a41af6\") " pod="openstack/ceilometer-0"
Dec 02 09:47:07 crc kubenswrapper[4781]: I1202 09:47:07.699538 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f695df8-9072-4528-891a-78e212a41af6-log-httpd\") pod \"ceilometer-0\" (UID: \"2f695df8-9072-4528-891a-78e212a41af6\") " pod="openstack/ceilometer-0"
Dec 02 09:47:07 crc kubenswrapper[4781]: I1202 09:47:07.708822 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f695df8-9072-4528-891a-78e212a41af6-scripts\") pod \"ceilometer-0\" (UID: \"2f695df8-9072-4528-891a-78e212a41af6\") " pod="openstack/ceilometer-0"
Dec 02 09:47:07 crc kubenswrapper[4781]: I1202 09:47:07.712397 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2f695df8-9072-4528-891a-78e212a41af6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2f695df8-9072-4528-891a-78e212a41af6\") " pod="openstack/ceilometer-0"
Dec 02 09:47:07 crc kubenswrapper[4781]: I1202 09:47:07.715149 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f695df8-9072-4528-891a-78e212a41af6-config-data\") pod \"ceilometer-0\" (UID: \"2f695df8-9072-4528-891a-78e212a41af6\") " pod="openstack/ceilometer-0"
Dec 02 09:47:07 crc kubenswrapper[4781]: I1202 09:47:07.715807 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f695df8-9072-4528-891a-78e212a41af6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2f695df8-9072-4528-891a-78e212a41af6\") " pod="openstack/ceilometer-0"
Dec 02 09:47:07 crc kubenswrapper[4781]: I1202 09:47:07.717117 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj7c9\" (UniqueName: \"kubernetes.io/projected/2f695df8-9072-4528-891a-78e212a41af6-kube-api-access-pj7c9\") pod \"ceilometer-0\" (UID: \"2f695df8-9072-4528-891a-78e212a41af6\") " pod="openstack/ceilometer-0"
Dec 02 09:47:07 crc kubenswrapper[4781]: I1202 09:47:07.834759 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 02 09:47:08 crc kubenswrapper[4781]: I1202 09:47:08.346985 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 02 09:47:08 crc kubenswrapper[4781]: I1202 09:47:08.388080 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f695df8-9072-4528-891a-78e212a41af6","Type":"ContainerStarted","Data":"4e28d630ed8d8741790a7f07941812017e9516bd93973592045a4d3216e873be"}
Dec 02 09:47:09 crc kubenswrapper[4781]: I1202 09:47:09.400617 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f695df8-9072-4528-891a-78e212a41af6","Type":"ContainerStarted","Data":"58170e2dbf418d5cf29396500c9d93623007dc60a7a701597b03f9c102048e26"}
Dec 02 09:47:10 crc kubenswrapper[4781]: I1202 09:47:10.422355 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f695df8-9072-4528-891a-78e212a41af6","Type":"ContainerStarted","Data":"2a54635b26f067dc174fe21eb06cef545561aa10ac6a76511f58cce7f9584c06"}
Dec 02 09:47:11 crc kubenswrapper[4781]: I1202 09:47:11.434014 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f695df8-9072-4528-891a-78e212a41af6","Type":"ContainerStarted","Data":"0cb7c95978209aeffebb9731c513878b8ce90cc6f16d59fa6bfcd78ee73d4af4"}
Dec 02 09:47:14 crc kubenswrapper[4781]: I1202 09:47:14.474462 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f695df8-9072-4528-891a-78e212a41af6","Type":"ContainerStarted","Data":"e83d966537f5027fcedea16bcd5203bfe0c7321ba4c7f0f64f9bbc8cff4124ef"}
Dec 02 09:47:14 crc kubenswrapper[4781]: I1202 09:47:14.475264 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 02 09:47:14 crc kubenswrapper[4781]: I1202 09:47:14.512693 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.568760724 podStartE2EDuration="7.512667105s" podCreationTimestamp="2025-12-02 09:47:07 +0000 UTC" firstStartedPulling="2025-12-02 09:47:08.358620356 +0000 UTC m=+1591.182494235" lastFinishedPulling="2025-12-02 09:47:13.302526747 +0000 UTC m=+1596.126400616" observedRunningTime="2025-12-02 09:47:14.503585033 +0000 UTC m=+1597.327458912" watchObservedRunningTime="2025-12-02 09:47:14.512667105 +0000 UTC m=+1597.336540984"
Dec 02 09:47:14 crc kubenswrapper[4781]: I1202 09:47:14.787654 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.252901 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-fwtp2"]
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.254549 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-fwtp2"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.262074 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.262277 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.302854 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-fwtp2"]
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.347062 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5c50078-d02c-43a0-83b1-d92b6f5a6e0e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-fwtp2\" (UID: \"f5c50078-d02c-43a0-83b1-d92b6f5a6e0e\") " pod="openstack/nova-cell0-cell-mapping-fwtp2"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.347143 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5c50078-d02c-43a0-83b1-d92b6f5a6e0e-config-data\") pod \"nova-cell0-cell-mapping-fwtp2\" (UID: \"f5c50078-d02c-43a0-83b1-d92b6f5a6e0e\") " pod="openstack/nova-cell0-cell-mapping-fwtp2"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.347221 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjb9l\" (UniqueName: \"kubernetes.io/projected/f5c50078-d02c-43a0-83b1-d92b6f5a6e0e-kube-api-access-zjb9l\") pod \"nova-cell0-cell-mapping-fwtp2\" (UID: \"f5c50078-d02c-43a0-83b1-d92b6f5a6e0e\") " pod="openstack/nova-cell0-cell-mapping-fwtp2"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.347241 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5c50078-d02c-43a0-83b1-d92b6f5a6e0e-scripts\") pod \"nova-cell0-cell-mapping-fwtp2\" (UID: \"f5c50078-d02c-43a0-83b1-d92b6f5a6e0e\") " pod="openstack/nova-cell0-cell-mapping-fwtp2"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.441819 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.443351 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.450821 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5c50078-d02c-43a0-83b1-d92b6f5a6e0e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-fwtp2\" (UID: \"f5c50078-d02c-43a0-83b1-d92b6f5a6e0e\") " pod="openstack/nova-cell0-cell-mapping-fwtp2"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.450903 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5c50078-d02c-43a0-83b1-d92b6f5a6e0e-config-data\") pod \"nova-cell0-cell-mapping-fwtp2\" (UID: \"f5c50078-d02c-43a0-83b1-d92b6f5a6e0e\") " pod="openstack/nova-cell0-cell-mapping-fwtp2"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.451021 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjb9l\" (UniqueName: \"kubernetes.io/projected/f5c50078-d02c-43a0-83b1-d92b6f5a6e0e-kube-api-access-zjb9l\") pod \"nova-cell0-cell-mapping-fwtp2\" (UID: \"f5c50078-d02c-43a0-83b1-d92b6f5a6e0e\") " pod="openstack/nova-cell0-cell-mapping-fwtp2"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.451049 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5c50078-d02c-43a0-83b1-d92b6f5a6e0e-scripts\") pod \"nova-cell0-cell-mapping-fwtp2\" (UID: \"f5c50078-d02c-43a0-83b1-d92b6f5a6e0e\") " pod="openstack/nova-cell0-cell-mapping-fwtp2"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.451647 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.461287 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5c50078-d02c-43a0-83b1-d92b6f5a6e0e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-fwtp2\" (UID: \"f5c50078-d02c-43a0-83b1-d92b6f5a6e0e\") " pod="openstack/nova-cell0-cell-mapping-fwtp2"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.463958 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.464602 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5c50078-d02c-43a0-83b1-d92b6f5a6e0e-scripts\") pod \"nova-cell0-cell-mapping-fwtp2\" (UID: \"f5c50078-d02c-43a0-83b1-d92b6f5a6e0e\") " pod="openstack/nova-cell0-cell-mapping-fwtp2"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.465496 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.471474 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.492327 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5c50078-d02c-43a0-83b1-d92b6f5a6e0e-config-data\") pod \"nova-cell0-cell-mapping-fwtp2\" (UID: \"f5c50078-d02c-43a0-83b1-d92b6f5a6e0e\") " pod="openstack/nova-cell0-cell-mapping-fwtp2"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.493683 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.615786 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hg8k\" (UniqueName: \"kubernetes.io/projected/c59b1260-94b3-4f66-942d-d215e1c7715c-kube-api-access-6hg8k\") pod \"nova-cell1-novncproxy-0\" (UID: \"c59b1260-94b3-4f66-942d-d215e1c7715c\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.616075 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c59b1260-94b3-4f66-942d-d215e1c7715c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c59b1260-94b3-4f66-942d-d215e1c7715c\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.616271 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c59b1260-94b3-4f66-942d-d215e1c7715c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c59b1260-94b3-4f66-942d-d215e1c7715c\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.619978 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjb9l\" (UniqueName: \"kubernetes.io/projected/f5c50078-d02c-43a0-83b1-d92b6f5a6e0e-kube-api-access-zjb9l\") pod \"nova-cell0-cell-mapping-fwtp2\" (UID: \"f5c50078-d02c-43a0-83b1-d92b6f5a6e0e\") " pod="openstack/nova-cell0-cell-mapping-fwtp2"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.721160 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hg8k\" (UniqueName: \"kubernetes.io/projected/c59b1260-94b3-4f66-942d-d215e1c7715c-kube-api-access-6hg8k\") pod \"nova-cell1-novncproxy-0\" (UID: \"c59b1260-94b3-4f66-942d-d215e1c7715c\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.721250 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c59b1260-94b3-4f66-942d-d215e1c7715c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c59b1260-94b3-4f66-942d-d215e1c7715c\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.721327 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c59b1260-94b3-4f66-942d-d215e1c7715c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c59b1260-94b3-4f66-942d-d215e1c7715c\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.750236 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.751693 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c59b1260-94b3-4f66-942d-d215e1c7715c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c59b1260-94b3-4f66-942d-d215e1c7715c\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.781647 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hg8k\" (UniqueName: \"kubernetes.io/projected/c59b1260-94b3-4f66-942d-d215e1c7715c-kube-api-access-6hg8k\") pod \"nova-cell1-novncproxy-0\" (UID: \"c59b1260-94b3-4f66-942d-d215e1c7715c\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.804682 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.816797 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.844541 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.845844 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a\") " pod="openstack/nova-metadata-0"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.845904 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a-config-data\") pod \"nova-metadata-0\" (UID: \"16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a\") " pod="openstack/nova-metadata-0"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.845951 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f72c2f91-025d-405a-8a67-fe72c205dda3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f72c2f91-025d-405a-8a67-fe72c205dda3\") " pod="openstack/nova-api-0"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.845965 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f72c2f91-025d-405a-8a67-fe72c205dda3-config-data\") pod \"nova-api-0\" (UID: \"f72c2f91-025d-405a-8a67-fe72c205dda3\") " pod="openstack/nova-api-0"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.846002 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f72c2f91-025d-405a-8a67-fe72c205dda3-logs\") pod \"nova-api-0\" (UID: \"f72c2f91-025d-405a-8a67-fe72c205dda3\") " pod="openstack/nova-api-0"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.846014 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a-logs\") pod \"nova-metadata-0\" (UID: \"16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a\") " pod="openstack/nova-metadata-0"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.846119 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hxxl\" (UniqueName: \"kubernetes.io/projected/f72c2f91-025d-405a-8a67-fe72c205dda3-kube-api-access-5hxxl\") pod \"nova-api-0\" (UID: \"f72c2f91-025d-405a-8a67-fe72c205dda3\") " pod="openstack/nova-api-0"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.846144 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfr8j\" (UniqueName: \"kubernetes.io/projected/16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a-kube-api-access-qfr8j\") pod \"nova-metadata-0\" (UID: \"16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a\") " pod="openstack/nova-metadata-0"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.859234 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c59b1260-94b3-4f66-942d-d215e1c7715c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c59b1260-94b3-4f66-942d-d215e1c7715c\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.898555 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-fwtp2"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.925962 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.947590 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a\") " pod="openstack/nova-metadata-0"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.947638 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a-config-data\") pod \"nova-metadata-0\" (UID: \"16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a\") " pod="openstack/nova-metadata-0"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.947667 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f72c2f91-025d-405a-8a67-fe72c205dda3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f72c2f91-025d-405a-8a67-fe72c205dda3\") " pod="openstack/nova-api-0"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.947686 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f72c2f91-025d-405a-8a67-fe72c205dda3-config-data\") pod \"nova-api-0\" (UID: \"f72c2f91-025d-405a-8a67-fe72c205dda3\") " pod="openstack/nova-api-0"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.947719 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f72c2f91-025d-405a-8a67-fe72c205dda3-logs\") pod \"nova-api-0\" (UID: \"f72c2f91-025d-405a-8a67-fe72c205dda3\") " pod="openstack/nova-api-0"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.947733 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a-logs\") pod \"nova-metadata-0\" (UID: \"16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a\") " pod="openstack/nova-metadata-0"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.947811 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hxxl\" (UniqueName: \"kubernetes.io/projected/f72c2f91-025d-405a-8a67-fe72c205dda3-kube-api-access-5hxxl\") pod \"nova-api-0\" (UID: \"f72c2f91-025d-405a-8a67-fe72c205dda3\") " pod="openstack/nova-api-0"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.947845 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfr8j\" (UniqueName: \"kubernetes.io/projected/16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a-kube-api-access-qfr8j\") pod \"nova-metadata-0\" (UID: \"16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a\") " pod="openstack/nova-metadata-0"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.949044 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a-logs\") pod \"nova-metadata-0\" (UID: \"16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a\") " pod="openstack/nova-metadata-0"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.949297 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f72c2f91-025d-405a-8a67-fe72c205dda3-logs\") pod \"nova-api-0\" (UID: \"f72c2f91-025d-405a-8a67-fe72c205dda3\") " pod="openstack/nova-api-0"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.959188 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.963677 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f72c2f91-025d-405a-8a67-fe72c205dda3-config-data\") pod \"nova-api-0\" (UID: \"f72c2f91-025d-405a-8a67-fe72c205dda3\") " pod="openstack/nova-api-0"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.970670 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a-config-data\") pod \"nova-metadata-0\" (UID: \"16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a\") " pod="openstack/nova-metadata-0"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.974499 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f72c2f91-025d-405a-8a67-fe72c205dda3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f72c2f91-025d-405a-8a67-fe72c205dda3\") " pod="openstack/nova-api-0"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.975182 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a\") " pod="openstack/nova-metadata-0"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.976480 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.980732 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 02 09:47:15 crc kubenswrapper[4781]: I1202 09:47:15.999358 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Dec 02 09:47:16 crc kubenswrapper[4781]: I1202 09:47:15.999581 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 02 09:47:16 crc kubenswrapper[4781]: I1202 09:47:16.032350 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hxxl\" (UniqueName: \"kubernetes.io/projected/f72c2f91-025d-405a-8a67-fe72c205dda3-kube-api-access-5hxxl\") pod \"nova-api-0\" (UID: \"f72c2f91-025d-405a-8a67-fe72c205dda3\") " pod="openstack/nova-api-0"
Dec 02 09:47:16 crc kubenswrapper[4781]: I1202 09:47:16.042995 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-pxpg7"]
Dec 02 09:47:16 crc kubenswrapper[4781]: I1202 09:47:16.045418 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-pxpg7"
Dec 02 09:47:16 crc kubenswrapper[4781]: I1202 09:47:16.045584 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 02 09:47:16 crc kubenswrapper[4781]: I1202 09:47:16.046737 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfr8j\" (UniqueName: \"kubernetes.io/projected/16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a-kube-api-access-qfr8j\") pod \"nova-metadata-0\" (UID: \"16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a\") " pod="openstack/nova-metadata-0"
Dec 02 09:47:16 crc kubenswrapper[4781]: I1202 09:47:16.054412 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9045104f-d244-43d4-850a-68e15766bd31-dns-svc\") pod \"dnsmasq-dns-757b4f8459-pxpg7\" (UID: \"9045104f-d244-43d4-850a-68e15766bd31\") " pod="openstack/dnsmasq-dns-757b4f8459-pxpg7"
Dec 02 09:47:16 crc kubenswrapper[4781]: I1202 09:47:16.054471 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9045104f-d244-43d4-850a-68e15766bd31-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-pxpg7\" (UID: \"9045104f-d244-43d4-850a-68e15766bd31\") " pod="openstack/dnsmasq-dns-757b4f8459-pxpg7"
Dec 02 09:47:16 crc kubenswrapper[4781]: I1202 09:47:16.054490 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9045104f-d244-43d4-850a-68e15766bd31-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-pxpg7\" (UID: \"9045104f-d244-43d4-850a-68e15766bd31\") " pod="openstack/dnsmasq-dns-757b4f8459-pxpg7"
Dec 02 09:47:16 crc kubenswrapper[4781]: I1202 09:47:16.054510 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9045104f-d244-43d4-850a-68e15766bd31-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-pxpg7\" (UID: \"9045104f-d244-43d4-850a-68e15766bd31\") " pod="openstack/dnsmasq-dns-757b4f8459-pxpg7"
Dec 02 09:47:16 crc kubenswrapper[4781]: I1202 09:47:16.054542 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6n2w\" (UniqueName: \"kubernetes.io/projected/4ac5dc21-2c35-4977-9ed8-c1af566af6d5-kube-api-access-g6n2w\") pod \"nova-scheduler-0\" (UID: \"4ac5dc21-2c35-4977-9ed8-c1af566af6d5\") " pod="openstack/nova-scheduler-0"
Dec 02 09:47:16 crc kubenswrapper[4781]: I1202 09:47:16.054582 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ac5dc21-2c35-4977-9ed8-c1af566af6d5-config-data\") pod \"nova-scheduler-0\" (UID: \"4ac5dc21-2c35-4977-9ed8-c1af566af6d5\") " pod="openstack/nova-scheduler-0"
Dec 02 09:47:16 crc kubenswrapper[4781]: I1202 09:47:16.054619 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ac5dc21-2c35-4977-9ed8-c1af566af6d5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4ac5dc21-2c35-4977-9ed8-c1af566af6d5\") " pod="openstack/nova-scheduler-0"
Dec 02 09:47:16 crc kubenswrapper[4781]: I1202 09:47:16.054692 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9045104f-d244-43d4-850a-68e15766bd31-config\") pod \"dnsmasq-dns-757b4f8459-pxpg7\" (UID: \"9045104f-d244-43d4-850a-68e15766bd31\") " pod="openstack/dnsmasq-dns-757b4f8459-pxpg7"
Dec 02 09:47:16 crc kubenswrapper[4781]: I1202 09:47:16.058367 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-pxpg7"]
Dec 02 09:47:16 crc kubenswrapper[4781]: I1202 09:47:16.161196 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfvfk\" (UniqueName: \"kubernetes.io/projected/9045104f-d244-43d4-850a-68e15766bd31-kube-api-access-jfvfk\") pod \"dnsmasq-dns-757b4f8459-pxpg7\" (UID: \"9045104f-d244-43d4-850a-68e15766bd31\") " pod="openstack/dnsmasq-dns-757b4f8459-pxpg7"
Dec 02 09:47:16 crc kubenswrapper[4781]: I1202 09:47:16.162071 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9045104f-d244-43d4-850a-68e15766bd31-dns-svc\") pod \"dnsmasq-dns-757b4f8459-pxpg7\" (UID: \"9045104f-d244-43d4-850a-68e15766bd31\") " pod="openstack/dnsmasq-dns-757b4f8459-pxpg7"
Dec 02 09:47:16 crc kubenswrapper[4781]: I1202 09:47:16.162170 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9045104f-d244-43d4-850a-68e15766bd31-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-pxpg7\" (UID: \"9045104f-d244-43d4-850a-68e15766bd31\") " pod="openstack/dnsmasq-dns-757b4f8459-pxpg7"
Dec 02 09:47:16 crc kubenswrapper[4781]: I1202 09:47:16.162196 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9045104f-d244-43d4-850a-68e15766bd31-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-pxpg7\" (UID: \"9045104f-d244-43d4-850a-68e15766bd31\") " pod="openstack/dnsmasq-dns-757b4f8459-pxpg7"
Dec 02 09:47:16 crc kubenswrapper[4781]: I1202 09:47:16.162229 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9045104f-d244-43d4-850a-68e15766bd31-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-pxpg7\" (UID: \"9045104f-d244-43d4-850a-68e15766bd31\") " pod="openstack/dnsmasq-dns-757b4f8459-pxpg7"
Dec 02 09:47:16 crc kubenswrapper[4781]: I1202 09:47:16.162273 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6n2w\" (UniqueName: \"kubernetes.io/projected/4ac5dc21-2c35-4977-9ed8-c1af566af6d5-kube-api-access-g6n2w\") pod \"nova-scheduler-0\" (UID: \"4ac5dc21-2c35-4977-9ed8-c1af566af6d5\") " pod="openstack/nova-scheduler-0"
Dec 02 09:47:16 crc kubenswrapper[4781]: I1202 09:47:16.162358 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ac5dc21-2c35-4977-9ed8-c1af566af6d5-config-data\") pod \"nova-scheduler-0\" (UID: \"4ac5dc21-2c35-4977-9ed8-c1af566af6d5\") " pod="openstack/nova-scheduler-0"
Dec 02 09:47:16 crc kubenswrapper[4781]: I1202 09:47:16.162435 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ac5dc21-2c35-4977-9ed8-c1af566af6d5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4ac5dc21-2c35-4977-9ed8-c1af566af6d5\") " pod="openstack/nova-scheduler-0"
Dec 02 09:47:16 crc kubenswrapper[4781]: I1202 09:47:16.162553 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9045104f-d244-43d4-850a-68e15766bd31-config\") pod \"dnsmasq-dns-757b4f8459-pxpg7\" (UID: \"9045104f-d244-43d4-850a-68e15766bd31\") " pod="openstack/dnsmasq-dns-757b4f8459-pxpg7"
Dec 02 09:47:16 crc kubenswrapper[4781]: I1202 09:47:16.164066 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9045104f-d244-43d4-850a-68e15766bd31-config\") pod \"dnsmasq-dns-757b4f8459-pxpg7\" (UID: \"9045104f-d244-43d4-850a-68e15766bd31\") " pod="openstack/dnsmasq-dns-757b4f8459-pxpg7"
Dec 02 09:47:16 crc kubenswrapper[4781]: I1202 09:47:16.164708 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9045104f-d244-43d4-850a-68e15766bd31-dns-svc\") pod \"dnsmasq-dns-757b4f8459-pxpg7\" (UID: \"9045104f-d244-43d4-850a-68e15766bd31\") " pod="openstack/dnsmasq-dns-757b4f8459-pxpg7"
Dec 02 09:47:16 crc kubenswrapper[4781]: I1202 09:47:16.166042 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9045104f-d244-43d4-850a-68e15766bd31-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-pxpg7\" (UID: \"9045104f-d244-43d4-850a-68e15766bd31\") " pod="openstack/dnsmasq-dns-757b4f8459-pxpg7"
Dec 02 09:47:16 crc kubenswrapper[4781]: I1202 09:47:16.166627 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9045104f-d244-43d4-850a-68e15766bd31-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-pxpg7\" (UID: \"9045104f-d244-43d4-850a-68e15766bd31\") " pod="openstack/dnsmasq-dns-757b4f8459-pxpg7"
Dec 02 09:47:16 crc kubenswrapper[4781]: I1202 09:47:16.173368 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ac5dc21-2c35-4977-9ed8-c1af566af6d5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4ac5dc21-2c35-4977-9ed8-c1af566af6d5\") " pod="openstack/nova-scheduler-0"
Dec 02 09:47:16 crc kubenswrapper[4781]: I1202 09:47:16.173824 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ac5dc21-2c35-4977-9ed8-c1af566af6d5-config-data\") pod \"nova-scheduler-0\" (UID: \"4ac5dc21-2c35-4977-9ed8-c1af566af6d5\") " pod="openstack/nova-scheduler-0"
Dec 02 09:47:16
crc kubenswrapper[4781]: I1202 09:47:16.177683 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9045104f-d244-43d4-850a-68e15766bd31-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-pxpg7\" (UID: \"9045104f-d244-43d4-850a-68e15766bd31\") " pod="openstack/dnsmasq-dns-757b4f8459-pxpg7" Dec 02 09:47:16 crc kubenswrapper[4781]: I1202 09:47:16.187599 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 09:47:16 crc kubenswrapper[4781]: I1202 09:47:16.209851 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6n2w\" (UniqueName: \"kubernetes.io/projected/4ac5dc21-2c35-4977-9ed8-c1af566af6d5-kube-api-access-g6n2w\") pod \"nova-scheduler-0\" (UID: \"4ac5dc21-2c35-4977-9ed8-c1af566af6d5\") " pod="openstack/nova-scheduler-0" Dec 02 09:47:16 crc kubenswrapper[4781]: I1202 09:47:16.265547 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfvfk\" (UniqueName: \"kubernetes.io/projected/9045104f-d244-43d4-850a-68e15766bd31-kube-api-access-jfvfk\") pod \"dnsmasq-dns-757b4f8459-pxpg7\" (UID: \"9045104f-d244-43d4-850a-68e15766bd31\") " pod="openstack/dnsmasq-dns-757b4f8459-pxpg7" Dec 02 09:47:16 crc kubenswrapper[4781]: I1202 09:47:16.296728 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfvfk\" (UniqueName: \"kubernetes.io/projected/9045104f-d244-43d4-850a-68e15766bd31-kube-api-access-jfvfk\") pod \"dnsmasq-dns-757b4f8459-pxpg7\" (UID: \"9045104f-d244-43d4-850a-68e15766bd31\") " pod="openstack/dnsmasq-dns-757b4f8459-pxpg7" Dec 02 09:47:16 crc kubenswrapper[4781]: I1202 09:47:16.353984 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 09:47:16 crc kubenswrapper[4781]: I1202 09:47:16.392443 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-pxpg7" Dec 02 09:47:16 crc kubenswrapper[4781]: I1202 09:47:16.693557 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-fwtp2"] Dec 02 09:47:16 crc kubenswrapper[4781]: I1202 09:47:16.858710 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 09:47:16 crc kubenswrapper[4781]: I1202 09:47:16.947426 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 09:47:17 crc kubenswrapper[4781]: I1202 09:47:17.072671 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qftr6"] Dec 02 09:47:17 crc kubenswrapper[4781]: I1202 09:47:17.093061 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-qftr6" Dec 02 09:47:17 crc kubenswrapper[4781]: I1202 09:47:17.103326 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 02 09:47:17 crc kubenswrapper[4781]: I1202 09:47:17.103971 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 02 09:47:17 crc kubenswrapper[4781]: I1202 09:47:17.110256 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qftr6"] Dec 02 09:47:17 crc kubenswrapper[4781]: W1202 09:47:17.123020 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9045104f_d244_43d4_850a_68e15766bd31.slice/crio-b8e52ceac0961ddb152a960f8e02e797eb35a80fef11d57d6dafa4b4a933be16 WatchSource:0}: Error finding container b8e52ceac0961ddb152a960f8e02e797eb35a80fef11d57d6dafa4b4a933be16: Status 404 returned error can't find the container with id b8e52ceac0961ddb152a960f8e02e797eb35a80fef11d57d6dafa4b4a933be16 Dec 02 09:47:17 crc kubenswrapper[4781]: I1202 09:47:17.156498 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 09:47:17 crc kubenswrapper[4781]: I1202 09:47:17.169621 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-pxpg7"] Dec 02 09:47:17 crc kubenswrapper[4781]: I1202 09:47:17.180472 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 09:47:17 crc kubenswrapper[4781]: I1202 09:47:17.194996 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9215fee-a493-48f3-a67c-21fbb4256f55-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-qftr6\" (UID: \"a9215fee-a493-48f3-a67c-21fbb4256f55\") " pod="openstack/nova-cell1-conductor-db-sync-qftr6" Dec 02 09:47:17 crc kubenswrapper[4781]: I1202 09:47:17.195097 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9215fee-a493-48f3-a67c-21fbb4256f55-config-data\") pod \"nova-cell1-conductor-db-sync-qftr6\" (UID: \"a9215fee-a493-48f3-a67c-21fbb4256f55\") " pod="openstack/nova-cell1-conductor-db-sync-qftr6" Dec 02 09:47:17 crc kubenswrapper[4781]: I1202 09:47:17.195191 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fss6d\" (UniqueName: \"kubernetes.io/projected/a9215fee-a493-48f3-a67c-21fbb4256f55-kube-api-access-fss6d\") pod \"nova-cell1-conductor-db-sync-qftr6\" (UID: \"a9215fee-a493-48f3-a67c-21fbb4256f55\") " pod="openstack/nova-cell1-conductor-db-sync-qftr6" Dec 02 09:47:17 crc kubenswrapper[4781]: I1202 09:47:17.195290 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9215fee-a493-48f3-a67c-21fbb4256f55-scripts\") pod \"nova-cell1-conductor-db-sync-qftr6\" (UID: \"a9215fee-a493-48f3-a67c-21fbb4256f55\") " pod="openstack/nova-cell1-conductor-db-sync-qftr6" Dec 02 09:47:17 crc kubenswrapper[4781]: I1202 09:47:17.302274 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9215fee-a493-48f3-a67c-21fbb4256f55-combined-ca-bundle\") pod 
\"nova-cell1-conductor-db-sync-qftr6\" (UID: \"a9215fee-a493-48f3-a67c-21fbb4256f55\") " pod="openstack/nova-cell1-conductor-db-sync-qftr6" Dec 02 09:47:17 crc kubenswrapper[4781]: I1202 09:47:17.302418 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9215fee-a493-48f3-a67c-21fbb4256f55-config-data\") pod \"nova-cell1-conductor-db-sync-qftr6\" (UID: \"a9215fee-a493-48f3-a67c-21fbb4256f55\") " pod="openstack/nova-cell1-conductor-db-sync-qftr6" Dec 02 09:47:17 crc kubenswrapper[4781]: I1202 09:47:17.302581 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fss6d\" (UniqueName: \"kubernetes.io/projected/a9215fee-a493-48f3-a67c-21fbb4256f55-kube-api-access-fss6d\") pod \"nova-cell1-conductor-db-sync-qftr6\" (UID: \"a9215fee-a493-48f3-a67c-21fbb4256f55\") " pod="openstack/nova-cell1-conductor-db-sync-qftr6" Dec 02 09:47:17 crc kubenswrapper[4781]: I1202 09:47:17.303042 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9215fee-a493-48f3-a67c-21fbb4256f55-scripts\") pod \"nova-cell1-conductor-db-sync-qftr6\" (UID: \"a9215fee-a493-48f3-a67c-21fbb4256f55\") " pod="openstack/nova-cell1-conductor-db-sync-qftr6" Dec 02 09:47:17 crc kubenswrapper[4781]: I1202 09:47:17.314646 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9215fee-a493-48f3-a67c-21fbb4256f55-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-qftr6\" (UID: \"a9215fee-a493-48f3-a67c-21fbb4256f55\") " pod="openstack/nova-cell1-conductor-db-sync-qftr6" Dec 02 09:47:17 crc kubenswrapper[4781]: I1202 09:47:17.318346 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9215fee-a493-48f3-a67c-21fbb4256f55-scripts\") pod \"nova-cell1-conductor-db-sync-qftr6\" (UID: \"a9215fee-a493-48f3-a67c-21fbb4256f55\") " pod="openstack/nova-cell1-conductor-db-sync-qftr6" Dec 02 09:47:17 crc kubenswrapper[4781]: I1202 09:47:17.321345 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9215fee-a493-48f3-a67c-21fbb4256f55-config-data\") pod \"nova-cell1-conductor-db-sync-qftr6\" (UID: \"a9215fee-a493-48f3-a67c-21fbb4256f55\") " pod="openstack/nova-cell1-conductor-db-sync-qftr6" Dec 02 09:47:17 crc kubenswrapper[4781]: I1202 09:47:17.324078 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fss6d\" (UniqueName: \"kubernetes.io/projected/a9215fee-a493-48f3-a67c-21fbb4256f55-kube-api-access-fss6d\") pod \"nova-cell1-conductor-db-sync-qftr6\" (UID: \"a9215fee-a493-48f3-a67c-21fbb4256f55\") " pod="openstack/nova-cell1-conductor-db-sync-qftr6" Dec 02 09:47:17 crc kubenswrapper[4781]: I1202 09:47:17.445567 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-qftr6" Dec 02 09:47:17 crc kubenswrapper[4781]: I1202 09:47:17.629587 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-fwtp2" event={"ID":"f5c50078-d02c-43a0-83b1-d92b6f5a6e0e","Type":"ContainerStarted","Data":"c012f2716641543f3e3e57641a96191662a00eb552327ca589e567ba6003f551"} Dec 02 09:47:17 crc kubenswrapper[4781]: I1202 09:47:17.629899 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-fwtp2" event={"ID":"f5c50078-d02c-43a0-83b1-d92b6f5a6e0e","Type":"ContainerStarted","Data":"4fd12bf5b42179e24b25cc6a8669c914f206c22feba9cbe5076ca77af1e6d004"} Dec 02 09:47:17 crc kubenswrapper[4781]: I1202 09:47:17.635491 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a","Type":"ContainerStarted","Data":"fcf2d1e2eee66f7ba52ada4d1aca41296fa4d5cb12bff4e270dd806aabe38a57"} Dec 02 09:47:17 crc kubenswrapper[4781]: I1202 09:47:17.642280 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f72c2f91-025d-405a-8a67-fe72c205dda3","Type":"ContainerStarted","Data":"efccadb6d1d9fa00e19c9f342bda9fb1b1452ec43330925dfc26395522fe2d06"} Dec 02 09:47:17 crc kubenswrapper[4781]: I1202 09:47:17.644773 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"c59b1260-94b3-4f66-942d-d215e1c7715c","Type":"ContainerStarted","Data":"cedbbcc1bffdb90b713b804adbf7844254a09d6c42b1ab8b9ae30b98df36285f"} Dec 02 09:47:17 crc kubenswrapper[4781]: I1202 09:47:17.649396 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4ac5dc21-2c35-4977-9ed8-c1af566af6d5","Type":"ContainerStarted","Data":"e7256e8aced0d99c60cb2c05e925a19cdcbe4e6fb789ee6f1b6612fe55f54181"} Dec 02 09:47:17 crc kubenswrapper[4781]: I1202 09:47:17.654042 4781 generic.go:334] "Generic (PLEG): container finished" podID="9045104f-d244-43d4-850a-68e15766bd31" containerID="0d95ae09fa7e13e098dea5878b660c0f6cb0dc9aa2d9615fd43a6be972073afc" exitCode=0 Dec 02 09:47:17 crc kubenswrapper[4781]: I1202 09:47:17.654133 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-pxpg7" event={"ID":"9045104f-d244-43d4-850a-68e15766bd31","Type":"ContainerDied","Data":"0d95ae09fa7e13e098dea5878b660c0f6cb0dc9aa2d9615fd43a6be972073afc"} Dec 02 09:47:17 crc kubenswrapper[4781]: I1202 09:47:17.654164 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-pxpg7" event={"ID":"9045104f-d244-43d4-850a-68e15766bd31","Type":"ContainerStarted","Data":"b8e52ceac0961ddb152a960f8e02e797eb35a80fef11d57d6dafa4b4a933be16"} Dec 02 09:47:17 crc kubenswrapper[4781]: I1202 09:47:17.685514 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-fwtp2" podStartSLOduration=2.6854954429999998 podStartE2EDuration="2.685495443s" podCreationTimestamp="2025-12-02 09:47:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:47:17.67251787 +0000 UTC m=+1600.496391749" watchObservedRunningTime="2025-12-02 09:47:17.685495443 +0000 UTC m=+1600.509369322" Dec 02 09:47:18 crc kubenswrapper[4781]: I1202 09:47:18.135774 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qftr6"] Dec 
02 09:47:18 crc kubenswrapper[4781]: W1202 09:47:18.146474 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9215fee_a493_48f3_a67c_21fbb4256f55.slice/crio-1ffce4a66ac18dba6098a240c5715e5ca18441bb96cdf55e337f6433524b82c7 WatchSource:0}: Error finding container 1ffce4a66ac18dba6098a240c5715e5ca18441bb96cdf55e337f6433524b82c7: Status 404 returned error can't find the container with id 1ffce4a66ac18dba6098a240c5715e5ca18441bb96cdf55e337f6433524b82c7 Dec 02 09:47:18 crc kubenswrapper[4781]: I1202 09:47:18.672574 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-qftr6" event={"ID":"a9215fee-a493-48f3-a67c-21fbb4256f55","Type":"ContainerStarted","Data":"45532ac872502b7cd761561dde6edf9b00f15e4588cc2b06e6896031fbbb55ce"} Dec 02 09:47:18 crc kubenswrapper[4781]: I1202 09:47:18.672624 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-qftr6" event={"ID":"a9215fee-a493-48f3-a67c-21fbb4256f55","Type":"ContainerStarted","Data":"1ffce4a66ac18dba6098a240c5715e5ca18441bb96cdf55e337f6433524b82c7"} Dec 02 09:47:18 crc kubenswrapper[4781]: I1202 09:47:18.680791 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-pxpg7" event={"ID":"9045104f-d244-43d4-850a-68e15766bd31","Type":"ContainerStarted","Data":"5006a11108b361c8fefabb1ec4b44136301695beac4953b4a8b8747be8007e00"} Dec 02 09:47:18 crc kubenswrapper[4781]: I1202 09:47:18.681137 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757b4f8459-pxpg7" Dec 02 09:47:18 crc kubenswrapper[4781]: I1202 09:47:18.708993 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757b4f8459-pxpg7" podStartSLOduration=3.7089725380000003 podStartE2EDuration="3.708972538s" podCreationTimestamp="2025-12-02 09:47:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:47:18.707808396 +0000 UTC m=+1601.531682285" watchObservedRunningTime="2025-12-02 09:47:18.708972538 +0000 UTC m=+1601.532846427" Dec 02 09:47:19 crc kubenswrapper[4781]: I1202 09:47:19.743249 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 09:47:19 crc kubenswrapper[4781]: I1202 09:47:19.761547 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 09:47:19 crc kubenswrapper[4781]: I1202 09:47:19.774582 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-qftr6" podStartSLOduration=2.774557584 podStartE2EDuration="2.774557584s" podCreationTimestamp="2025-12-02 09:47:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:47:19.740418268 +0000 UTC m=+1602.564292157" watchObservedRunningTime="2025-12-02 09:47:19.774557584 +0000 UTC m=+1602.598431463" Dec 02 09:47:21 crc kubenswrapper[4781]: I1202 09:47:21.739523 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4ac5dc21-2c35-4977-9ed8-c1af566af6d5","Type":"ContainerStarted","Data":"cd07165d651b1dce117b33839af42ca0d835d0228337caea6f8550ed6b595e13"} Dec 02 09:47:21 crc kubenswrapper[4781]: I1202 09:47:21.742321 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a","Type":"ContainerStarted","Data":"97b9ed60bb35c6c89587bac81f51c63e7a90c5f69f28dd0c0e327305537db1be"} Dec 02 09:47:21 crc kubenswrapper[4781]: I1202 09:47:21.742357 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a","Type":"ContainerStarted","Data":"4e640be2564cfee313d747374008eb2b2a9fb52ecb1c9eb6f1f7bc76090fe5cc"} Dec 02 09:47:21 crc kubenswrapper[4781]: I1202 09:47:21.742567 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a" containerName="nova-metadata-log" containerID="cri-o://4e640be2564cfee313d747374008eb2b2a9fb52ecb1c9eb6f1f7bc76090fe5cc" gracePeriod=30 Dec 02 09:47:21 crc kubenswrapper[4781]: I1202 09:47:21.742589 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a" containerName="nova-metadata-metadata" containerID="cri-o://97b9ed60bb35c6c89587bac81f51c63e7a90c5f69f28dd0c0e327305537db1be" gracePeriod=30 Dec 02 09:47:21 crc kubenswrapper[4781]: I1202 09:47:21.745657 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f72c2f91-025d-405a-8a67-fe72c205dda3","Type":"ContainerStarted","Data":"6b9b2a3954a76af1cbc651468cdc17f030862d0c4ac64f79707a46f71cff97eb"} Dec 02 09:47:21 crc kubenswrapper[4781]: I1202 09:47:21.745687 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f72c2f91-025d-405a-8a67-fe72c205dda3","Type":"ContainerStarted","Data":"e5a6a08ea4277e7a9dbdb745fae02870478c7f2f98529ba563faf2a92ece598c"} Dec 02 09:47:21 crc kubenswrapper[4781]: I1202 09:47:21.749185 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="c59b1260-94b3-4f66-942d-d215e1c7715c" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://810b05d2e8b4be8ecf90799f992b8146399c1c08ece6e23e5d12240846e828f2" gracePeriod=30 Dec 02 09:47:21 crc kubenswrapper[4781]: I1202 09:47:21.749301 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"c59b1260-94b3-4f66-942d-d215e1c7715c","Type":"ContainerStarted","Data":"810b05d2e8b4be8ecf90799f992b8146399c1c08ece6e23e5d12240846e828f2"} Dec 02 09:47:21 crc kubenswrapper[4781]: I1202 09:47:21.778604 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.969901906 podStartE2EDuration="6.778586062s" podCreationTimestamp="2025-12-02 09:47:15 +0000 UTC" firstStartedPulling="2025-12-02 09:47:17.187974944 +0000 UTC m=+1600.011848823" lastFinishedPulling="2025-12-02 09:47:20.9966591 +0000 UTC m=+1603.820532979" observedRunningTime="2025-12-02 09:47:21.763891754 +0000 UTC m=+1604.587765663" watchObservedRunningTime="2025-12-02 09:47:21.778586062 +0000 UTC m=+1604.602459941" Dec 02 09:47:21 crc kubenswrapper[4781]: I1202 09:47:21.800600 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.668094703 podStartE2EDuration="6.800580589s" podCreationTimestamp="2025-12-02 09:47:15 +0000 UTC" firstStartedPulling="2025-12-02 09:47:16.864907494 +0000 UTC m=+1599.688781373" lastFinishedPulling="2025-12-02 09:47:20.99739339 +0000 UTC m=+1603.821267259" 
observedRunningTime="2025-12-02 09:47:21.793496747 +0000 UTC m=+1604.617370636" watchObservedRunningTime="2025-12-02 09:47:21.800580589 +0000 UTC m=+1604.624454468" Dec 02 09:47:21 crc kubenswrapper[4781]: I1202 09:47:21.844319 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.814998392 podStartE2EDuration="6.844296823s" podCreationTimestamp="2025-12-02 09:47:15 +0000 UTC" firstStartedPulling="2025-12-02 09:47:16.967370439 +0000 UTC m=+1599.791244318" lastFinishedPulling="2025-12-02 09:47:20.99666887 +0000 UTC m=+1603.820542749" observedRunningTime="2025-12-02 09:47:21.823727715 +0000 UTC m=+1604.647601594" watchObservedRunningTime="2025-12-02 09:47:21.844296823 +0000 UTC m=+1604.668170712" Dec 02 09:47:21 crc kubenswrapper[4781]: I1202 09:47:21.847907 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.989472437 podStartE2EDuration="6.847892679s" podCreationTimestamp="2025-12-02 09:47:15 +0000 UTC" firstStartedPulling="2025-12-02 09:47:17.143535501 +0000 UTC m=+1599.967409380" lastFinishedPulling="2025-12-02 09:47:21.001955743 +0000 UTC m=+1603.825829622" observedRunningTime="2025-12-02 09:47:21.839629626 +0000 UTC m=+1604.663503505" watchObservedRunningTime="2025-12-02 09:47:21.847892679 +0000 UTC m=+1604.671766558" Dec 02 09:47:22 crc kubenswrapper[4781]: I1202 09:47:22.766208 4781 generic.go:334] "Generic (PLEG): container finished" podID="16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a" containerID="4e640be2564cfee313d747374008eb2b2a9fb52ecb1c9eb6f1f7bc76090fe5cc" exitCode=143 Dec 02 09:47:22 crc kubenswrapper[4781]: I1202 09:47:22.766900 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a","Type":"ContainerDied","Data":"4e640be2564cfee313d747374008eb2b2a9fb52ecb1c9eb6f1f7bc76090fe5cc"} Dec 02 09:47:25 crc kubenswrapper[4781]: I1202 09:47:25.626595 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9t28j"] Dec 02 09:47:25 crc kubenswrapper[4781]: I1202 09:47:25.629367 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9t28j" Dec 02 09:47:25 crc kubenswrapper[4781]: I1202 09:47:25.643279 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9t28j"] Dec 02 09:47:25 crc kubenswrapper[4781]: I1202 09:47:25.739331 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1-catalog-content\") pod \"certified-operators-9t28j\" (UID: \"bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1\") " pod="openshift-marketplace/certified-operators-9t28j" Dec 02 09:47:25 crc kubenswrapper[4781]: I1202 09:47:25.739380 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gc5g8\" (UniqueName: \"kubernetes.io/projected/bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1-kube-api-access-gc5g8\") pod \"certified-operators-9t28j\" (UID: \"bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1\") " pod="openshift-marketplace/certified-operators-9t28j" Dec 02 09:47:25 crc kubenswrapper[4781]: I1202 09:47:25.739474 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1-utilities\") pod \"certified-operators-9t28j\" (UID: \"bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1\") " pod="openshift-marketplace/certified-operators-9t28j" Dec 02 09:47:25 crc kubenswrapper[4781]: I1202 09:47:25.842374 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1-utilities\") pod \"certified-operators-9t28j\" (UID: \"bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1\") " pod="openshift-marketplace/certified-operators-9t28j" Dec 02 09:47:25 crc kubenswrapper[4781]: I1202 09:47:25.842568 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1-catalog-content\") pod \"certified-operators-9t28j\" (UID: \"bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1\") " pod="openshift-marketplace/certified-operators-9t28j" Dec 02 09:47:25 crc kubenswrapper[4781]: I1202 09:47:25.842603 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gc5g8\" (UniqueName: \"kubernetes.io/projected/bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1-kube-api-access-gc5g8\") pod \"certified-operators-9t28j\" (UID: \"bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1\") " pod="openshift-marketplace/certified-operators-9t28j" Dec 02 09:47:25 crc kubenswrapper[4781]: I1202 09:47:25.842904 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1-utilities\") pod \"certified-operators-9t28j\" (UID: \"bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1\") " pod="openshift-marketplace/certified-operators-9t28j" Dec 02 09:47:25 crc kubenswrapper[4781]: I1202 09:47:25.844732 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1-catalog-content\") pod \"certified-operators-9t28j\" (UID: \"bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1\") " pod="openshift-marketplace/certified-operators-9t28j" Dec 02 09:47:25 crc kubenswrapper[4781]: I1202 09:47:25.863722 4781 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-gc5g8\" (UniqueName: \"kubernetes.io/projected/bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1-kube-api-access-gc5g8\") pod \"certified-operators-9t28j\" (UID: \"bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1\") " pod="openshift-marketplace/certified-operators-9t28j" Dec 02 09:47:25 crc kubenswrapper[4781]: I1202 09:47:25.948745 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9t28j" Dec 02 09:47:25 crc kubenswrapper[4781]: I1202 09:47:25.960371 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 02 09:47:26 crc kubenswrapper[4781]: I1202 09:47:26.046649 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 09:47:26 crc kubenswrapper[4781]: I1202 09:47:26.046709 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 09:47:26 crc kubenswrapper[4781]: I1202 09:47:26.187832 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 09:47:26 crc kubenswrapper[4781]: I1202 09:47:26.187906 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 09:47:26 crc kubenswrapper[4781]: I1202 09:47:26.355056 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 02 09:47:26 crc kubenswrapper[4781]: I1202 09:47:26.355090 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 02 09:47:26 crc kubenswrapper[4781]: I1202 09:47:26.394017 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 02 09:47:26 crc kubenswrapper[4781]: I1202 09:47:26.403885 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757b4f8459-pxpg7" Dec 02 09:47:26 crc kubenswrapper[4781]: I1202 09:47:26.484362 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9t28j"] Dec 02 09:47:26 crc kubenswrapper[4781]: I1202 09:47:26.510191 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-lqtxt"] Dec 02 09:47:26 crc kubenswrapper[4781]: I1202 09:47:26.510477 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-lqtxt" podUID="4007cb57-06af-4119-bf9e-a2e2aa8509cd" containerName="dnsmasq-dns" containerID="cri-o://f541f64d2be699a6fa47b38e1efc60aa3cc5fc6eeda0d14bf712ec07ee964e40" gracePeriod=10 Dec 02 09:47:26 crc kubenswrapper[4781]: I1202 09:47:26.806019 4781 generic.go:334] "Generic (PLEG): container finished" podID="f5c50078-d02c-43a0-83b1-d92b6f5a6e0e" containerID="c012f2716641543f3e3e57641a96191662a00eb552327ca589e567ba6003f551" exitCode=0 Dec 02 09:47:26 crc kubenswrapper[4781]: I1202 09:47:26.806101 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-fwtp2" event={"ID":"f5c50078-d02c-43a0-83b1-d92b6f5a6e0e","Type":"ContainerDied","Data":"c012f2716641543f3e3e57641a96191662a00eb552327ca589e567ba6003f551"} Dec 02 09:47:26 crc kubenswrapper[4781]: I1202 09:47:26.807820 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9t28j" 
event={"ID":"bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1","Type":"ContainerStarted","Data":"822b0f87ea8f6586ba7228d19c8e01cb7a360c2af60ba74b162f72578e2eb8fc"} Dec 02 09:47:26 crc kubenswrapper[4781]: I1202 09:47:26.839492 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 02 09:47:27 crc kubenswrapper[4781]: I1202 09:47:27.130095 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f72c2f91-025d-405a-8a67-fe72c205dda3" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.188:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 09:47:27 crc kubenswrapper[4781]: I1202 09:47:27.130096 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f72c2f91-025d-405a-8a67-fe72c205dda3" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.188:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 09:47:27 crc kubenswrapper[4781]: I1202 09:47:27.628161 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-lqtxt" Dec 02 09:47:27 crc kubenswrapper[4781]: I1202 09:47:27.789126 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4007cb57-06af-4119-bf9e-a2e2aa8509cd-dns-swift-storage-0\") pod \"4007cb57-06af-4119-bf9e-a2e2aa8509cd\" (UID: \"4007cb57-06af-4119-bf9e-a2e2aa8509cd\") " Dec 02 09:47:27 crc kubenswrapper[4781]: I1202 09:47:27.789164 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4007cb57-06af-4119-bf9e-a2e2aa8509cd-dns-svc\") pod \"4007cb57-06af-4119-bf9e-a2e2aa8509cd\" (UID: \"4007cb57-06af-4119-bf9e-a2e2aa8509cd\") " Dec 02 09:47:27 crc kubenswrapper[4781]: I1202 09:47:27.789411 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwgpg\" (UniqueName: \"kubernetes.io/projected/4007cb57-06af-4119-bf9e-a2e2aa8509cd-kube-api-access-pwgpg\") pod \"4007cb57-06af-4119-bf9e-a2e2aa8509cd\" (UID: \"4007cb57-06af-4119-bf9e-a2e2aa8509cd\") " Dec 02 09:47:27 crc kubenswrapper[4781]: I1202 09:47:27.789435 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4007cb57-06af-4119-bf9e-a2e2aa8509cd-ovsdbserver-nb\") pod \"4007cb57-06af-4119-bf9e-a2e2aa8509cd\" (UID: \"4007cb57-06af-4119-bf9e-a2e2aa8509cd\") " Dec 02 09:47:27 crc kubenswrapper[4781]: I1202 09:47:27.789530 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4007cb57-06af-4119-bf9e-a2e2aa8509cd-config\") pod \"4007cb57-06af-4119-bf9e-a2e2aa8509cd\" (UID: \"4007cb57-06af-4119-bf9e-a2e2aa8509cd\") " Dec 02 09:47:27 crc kubenswrapper[4781]: I1202 09:47:27.789901 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4007cb57-06af-4119-bf9e-a2e2aa8509cd-ovsdbserver-sb\") pod \"4007cb57-06af-4119-bf9e-a2e2aa8509cd\" (UID: \"4007cb57-06af-4119-bf9e-a2e2aa8509cd\") " Dec 02 09:47:27 crc kubenswrapper[4781]: I1202 09:47:27.794183 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/4007cb57-06af-4119-bf9e-a2e2aa8509cd-kube-api-access-pwgpg" (OuterVolumeSpecName: "kube-api-access-pwgpg") pod "4007cb57-06af-4119-bf9e-a2e2aa8509cd" (UID: "4007cb57-06af-4119-bf9e-a2e2aa8509cd"). InnerVolumeSpecName "kube-api-access-pwgpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:47:27 crc kubenswrapper[4781]: I1202 09:47:27.835272 4781 generic.go:334] "Generic (PLEG): container finished" podID="4007cb57-06af-4119-bf9e-a2e2aa8509cd" containerID="f541f64d2be699a6fa47b38e1efc60aa3cc5fc6eeda0d14bf712ec07ee964e40" exitCode=0 Dec 02 09:47:27 crc kubenswrapper[4781]: I1202 09:47:27.835306 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-lqtxt" Dec 02 09:47:27 crc kubenswrapper[4781]: I1202 09:47:27.835330 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-lqtxt" event={"ID":"4007cb57-06af-4119-bf9e-a2e2aa8509cd","Type":"ContainerDied","Data":"f541f64d2be699a6fa47b38e1efc60aa3cc5fc6eeda0d14bf712ec07ee964e40"} Dec 02 09:47:27 crc kubenswrapper[4781]: I1202 09:47:27.837065 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-lqtxt" event={"ID":"4007cb57-06af-4119-bf9e-a2e2aa8509cd","Type":"ContainerDied","Data":"37e2ce29ae4b30ec8644dbb833531a95ede00720d3361708dcfb876227092abe"} Dec 02 09:47:27 crc kubenswrapper[4781]: I1202 09:47:27.837104 4781 scope.go:117] "RemoveContainer" containerID="f541f64d2be699a6fa47b38e1efc60aa3cc5fc6eeda0d14bf712ec07ee964e40" Dec 02 09:47:27 crc kubenswrapper[4781]: I1202 09:47:27.840316 4781 generic.go:334] "Generic (PLEG): container finished" podID="bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1" containerID="816dcd4f6e147cd7b6f262132e53efda737a302fc411362ac395e30030c70698" exitCode=0 Dec 02 09:47:27 crc kubenswrapper[4781]: I1202 09:47:27.840433 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9t28j" event={"ID":"bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1","Type":"ContainerDied","Data":"816dcd4f6e147cd7b6f262132e53efda737a302fc411362ac395e30030c70698"} Dec 02 09:47:27 crc kubenswrapper[4781]: I1202 09:47:27.873473 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4007cb57-06af-4119-bf9e-a2e2aa8509cd-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4007cb57-06af-4119-bf9e-a2e2aa8509cd" (UID: "4007cb57-06af-4119-bf9e-a2e2aa8509cd"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:47:27 crc kubenswrapper[4781]: I1202 09:47:27.878322 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4007cb57-06af-4119-bf9e-a2e2aa8509cd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4007cb57-06af-4119-bf9e-a2e2aa8509cd" (UID: "4007cb57-06af-4119-bf9e-a2e2aa8509cd"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:47:27 crc kubenswrapper[4781]: I1202 09:47:27.882186 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4007cb57-06af-4119-bf9e-a2e2aa8509cd-config" (OuterVolumeSpecName: "config") pod "4007cb57-06af-4119-bf9e-a2e2aa8509cd" (UID: "4007cb57-06af-4119-bf9e-a2e2aa8509cd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:47:27 crc kubenswrapper[4781]: I1202 09:47:27.893847 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwgpg\" (UniqueName: \"kubernetes.io/projected/4007cb57-06af-4119-bf9e-a2e2aa8509cd-kube-api-access-pwgpg\") on node \"crc\" DevicePath \"\"" Dec 02 09:47:27 crc kubenswrapper[4781]: I1202 09:47:27.893884 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4007cb57-06af-4119-bf9e-a2e2aa8509cd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 09:47:27 crc kubenswrapper[4781]: I1202 09:47:27.893895 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4007cb57-06af-4119-bf9e-a2e2aa8509cd-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:47:27 crc kubenswrapper[4781]: I1202 09:47:27.893907 4781 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4007cb57-06af-4119-bf9e-a2e2aa8509cd-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 09:47:27 crc kubenswrapper[4781]: I1202 09:47:27.913437 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4007cb57-06af-4119-bf9e-a2e2aa8509cd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4007cb57-06af-4119-bf9e-a2e2aa8509cd" (UID: "4007cb57-06af-4119-bf9e-a2e2aa8509cd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:47:27 crc kubenswrapper[4781]: I1202 09:47:27.913622 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4007cb57-06af-4119-bf9e-a2e2aa8509cd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4007cb57-06af-4119-bf9e-a2e2aa8509cd" (UID: "4007cb57-06af-4119-bf9e-a2e2aa8509cd"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:47:27 crc kubenswrapper[4781]: I1202 09:47:27.968591 4781 scope.go:117] "RemoveContainer" containerID="2f87f7693723109ba430fcd9f33232231cceb212e84d31e5fc709dc6e794e5ae" Dec 02 09:47:27 crc kubenswrapper[4781]: I1202 09:47:27.998406 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4007cb57-06af-4119-bf9e-a2e2aa8509cd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 09:47:27 crc kubenswrapper[4781]: I1202 09:47:27.998486 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4007cb57-06af-4119-bf9e-a2e2aa8509cd-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 09:47:28 crc kubenswrapper[4781]: I1202 09:47:28.004159 4781 scope.go:117] "RemoveContainer" containerID="f541f64d2be699a6fa47b38e1efc60aa3cc5fc6eeda0d14bf712ec07ee964e40" Dec 02 09:47:28 crc kubenswrapper[4781]: E1202 09:47:28.020594 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f541f64d2be699a6fa47b38e1efc60aa3cc5fc6eeda0d14bf712ec07ee964e40\": container with ID starting with f541f64d2be699a6fa47b38e1efc60aa3cc5fc6eeda0d14bf712ec07ee964e40 not found: ID does not exist" containerID="f541f64d2be699a6fa47b38e1efc60aa3cc5fc6eeda0d14bf712ec07ee964e40" Dec 02 09:47:28 crc kubenswrapper[4781]: I1202 09:47:28.020663 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f541f64d2be699a6fa47b38e1efc60aa3cc5fc6eeda0d14bf712ec07ee964e40"} err="failed to get container status \"f541f64d2be699a6fa47b38e1efc60aa3cc5fc6eeda0d14bf712ec07ee964e40\": rpc error: code = NotFound desc = could not find container \"f541f64d2be699a6fa47b38e1efc60aa3cc5fc6eeda0d14bf712ec07ee964e40\": container with ID starting with f541f64d2be699a6fa47b38e1efc60aa3cc5fc6eeda0d14bf712ec07ee964e40 not found: ID does not exist" Dec 02 09:47:28 crc kubenswrapper[4781]: I1202 09:47:28.020695 4781 scope.go:117] "RemoveContainer" containerID="2f87f7693723109ba430fcd9f33232231cceb212e84d31e5fc709dc6e794e5ae" Dec 02 09:47:28 crc kubenswrapper[4781]: E1202 09:47:28.021652 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f87f7693723109ba430fcd9f33232231cceb212e84d31e5fc709dc6e794e5ae\": container with ID starting with 2f87f7693723109ba430fcd9f33232231cceb212e84d31e5fc709dc6e794e5ae not found: ID does not exist" containerID="2f87f7693723109ba430fcd9f33232231cceb212e84d31e5fc709dc6e794e5ae" Dec 02 09:47:28 crc kubenswrapper[4781]: I1202 09:47:28.021703 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f87f7693723109ba430fcd9f33232231cceb212e84d31e5fc709dc6e794e5ae"} err="failed to get container status \"2f87f7693723109ba430fcd9f33232231cceb212e84d31e5fc709dc6e794e5ae\": rpc error: code = NotFound desc = could not find container \"2f87f7693723109ba430fcd9f33232231cceb212e84d31e5fc709dc6e794e5ae\": container with ID starting with 2f87f7693723109ba430fcd9f33232231cceb212e84d31e5fc709dc6e794e5ae not found: ID does not exist" Dec 02 09:47:28 crc kubenswrapper[4781]: I1202 09:47:28.240370 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-lqtxt"] Dec 02 09:47:28 crc kubenswrapper[4781]: I1202 09:47:28.264023 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-5c9776ccc5-lqtxt"] Dec 02 09:47:28 crc kubenswrapper[4781]: I1202 09:47:28.277607 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-fwtp2" Dec 02 09:47:28 crc kubenswrapper[4781]: I1202 09:47:28.410736 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjb9l\" (UniqueName: \"kubernetes.io/projected/f5c50078-d02c-43a0-83b1-d92b6f5a6e0e-kube-api-access-zjb9l\") pod \"f5c50078-d02c-43a0-83b1-d92b6f5a6e0e\" (UID: \"f5c50078-d02c-43a0-83b1-d92b6f5a6e0e\") " Dec 02 09:47:28 crc kubenswrapper[4781]: I1202 09:47:28.410843 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5c50078-d02c-43a0-83b1-d92b6f5a6e0e-scripts\") pod \"f5c50078-d02c-43a0-83b1-d92b6f5a6e0e\" (UID: \"f5c50078-d02c-43a0-83b1-d92b6f5a6e0e\") " Dec 02 09:47:28 crc kubenswrapper[4781]: I1202 09:47:28.411040 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5c50078-d02c-43a0-83b1-d92b6f5a6e0e-config-data\") pod \"f5c50078-d02c-43a0-83b1-d92b6f5a6e0e\" (UID: \"f5c50078-d02c-43a0-83b1-d92b6f5a6e0e\") " Dec 02 09:47:28 crc kubenswrapper[4781]: I1202 09:47:28.411153 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5c50078-d02c-43a0-83b1-d92b6f5a6e0e-combined-ca-bundle\") pod \"f5c50078-d02c-43a0-83b1-d92b6f5a6e0e\" (UID: \"f5c50078-d02c-43a0-83b1-d92b6f5a6e0e\") " Dec 02 09:47:28 crc kubenswrapper[4781]: I1202 09:47:28.415618 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5c50078-d02c-43a0-83b1-d92b6f5a6e0e-kube-api-access-zjb9l" (OuterVolumeSpecName: "kube-api-access-zjb9l") pod "f5c50078-d02c-43a0-83b1-d92b6f5a6e0e" (UID: "f5c50078-d02c-43a0-83b1-d92b6f5a6e0e"). InnerVolumeSpecName "kube-api-access-zjb9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:47:28 crc kubenswrapper[4781]: I1202 09:47:28.415638 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5c50078-d02c-43a0-83b1-d92b6f5a6e0e-scripts" (OuterVolumeSpecName: "scripts") pod "f5c50078-d02c-43a0-83b1-d92b6f5a6e0e" (UID: "f5c50078-d02c-43a0-83b1-d92b6f5a6e0e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:47:28 crc kubenswrapper[4781]: I1202 09:47:28.441016 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5c50078-d02c-43a0-83b1-d92b6f5a6e0e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f5c50078-d02c-43a0-83b1-d92b6f5a6e0e" (UID: "f5c50078-d02c-43a0-83b1-d92b6f5a6e0e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:47:28 crc kubenswrapper[4781]: I1202 09:47:28.441087 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5c50078-d02c-43a0-83b1-d92b6f5a6e0e-config-data" (OuterVolumeSpecName: "config-data") pod "f5c50078-d02c-43a0-83b1-d92b6f5a6e0e" (UID: "f5c50078-d02c-43a0-83b1-d92b6f5a6e0e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:47:28 crc kubenswrapper[4781]: I1202 09:47:28.512884 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5c50078-d02c-43a0-83b1-d92b6f5a6e0e-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:47:28 crc kubenswrapper[4781]: I1202 09:47:28.513217 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5c50078-d02c-43a0-83b1-d92b6f5a6e0e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:47:28 crc kubenswrapper[4781]: I1202 09:47:28.513232 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjb9l\" (UniqueName: \"kubernetes.io/projected/f5c50078-d02c-43a0-83b1-d92b6f5a6e0e-kube-api-access-zjb9l\") on node \"crc\" DevicePath \"\"" Dec 02 09:47:28 crc kubenswrapper[4781]: I1202 09:47:28.513244 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5c50078-d02c-43a0-83b1-d92b6f5a6e0e-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:47:28 crc kubenswrapper[4781]: I1202 09:47:28.854084 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-fwtp2" event={"ID":"f5c50078-d02c-43a0-83b1-d92b6f5a6e0e","Type":"ContainerDied","Data":"4fd12bf5b42179e24b25cc6a8669c914f206c22feba9cbe5076ca77af1e6d004"} Dec 02 09:47:28 crc kubenswrapper[4781]: I1202 09:47:28.854123 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4fd12bf5b42179e24b25cc6a8669c914f206c22feba9cbe5076ca77af1e6d004" Dec 02 09:47:28 crc kubenswrapper[4781]: I1202 09:47:28.854187 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-fwtp2" Dec 02 09:47:28 crc kubenswrapper[4781]: I1202 09:47:28.860968 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9t28j" event={"ID":"bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1","Type":"ContainerStarted","Data":"a07f2c8719be7b238972b19d0788250771aeb593492d7ee9dd6ea04e274dd11f"} Dec 02 09:47:29 crc kubenswrapper[4781]: I1202 09:47:29.031031 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 09:47:29 crc kubenswrapper[4781]: I1202 09:47:29.031307 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f72c2f91-025d-405a-8a67-fe72c205dda3" containerName="nova-api-log" containerID="cri-o://e5a6a08ea4277e7a9dbdb745fae02870478c7f2f98529ba563faf2a92ece598c" gracePeriod=30 Dec 02 09:47:29 crc kubenswrapper[4781]: I1202 09:47:29.031393 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f72c2f91-025d-405a-8a67-fe72c205dda3" containerName="nova-api-api" containerID="cri-o://6b9b2a3954a76af1cbc651468cdc17f030862d0c4ac64f79707a46f71cff97eb" gracePeriod=30 Dec 02 09:47:29 crc kubenswrapper[4781]: I1202 09:47:29.044488 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 09:47:29 crc kubenswrapper[4781]: I1202 09:47:29.044971 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="4ac5dc21-2c35-4977-9ed8-c1af566af6d5" containerName="nova-scheduler-scheduler" containerID="cri-o://cd07165d651b1dce117b33839af42ca0d835d0228337caea6f8550ed6b595e13" gracePeriod=30 Dec 02 09:47:29 crc kubenswrapper[4781]: I1202 
09:47:29.520875 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4007cb57-06af-4119-bf9e-a2e2aa8509cd" path="/var/lib/kubelet/pods/4007cb57-06af-4119-bf9e-a2e2aa8509cd/volumes" Dec 02 09:47:29 crc kubenswrapper[4781]: I1202 09:47:29.872070 4781 generic.go:334] "Generic (PLEG): container finished" podID="bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1" containerID="a07f2c8719be7b238972b19d0788250771aeb593492d7ee9dd6ea04e274dd11f" exitCode=0 Dec 02 09:47:29 crc kubenswrapper[4781]: I1202 09:47:29.872138 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9t28j" event={"ID":"bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1","Type":"ContainerDied","Data":"a07f2c8719be7b238972b19d0788250771aeb593492d7ee9dd6ea04e274dd11f"} Dec 02 09:47:29 crc kubenswrapper[4781]: I1202 09:47:29.874934 4781 generic.go:334] "Generic (PLEG): container finished" podID="f72c2f91-025d-405a-8a67-fe72c205dda3" containerID="e5a6a08ea4277e7a9dbdb745fae02870478c7f2f98529ba563faf2a92ece598c" exitCode=143 Dec 02 09:47:29 crc kubenswrapper[4781]: I1202 09:47:29.874967 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f72c2f91-025d-405a-8a67-fe72c205dda3","Type":"ContainerDied","Data":"e5a6a08ea4277e7a9dbdb745fae02870478c7f2f98529ba563faf2a92ece598c"} Dec 02 09:47:30 crc kubenswrapper[4781]: I1202 09:47:30.412039 4781 patch_prober.go:28] interesting pod/machine-config-daemon-pzntm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 09:47:30 crc kubenswrapper[4781]: I1202 09:47:30.412129 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 09:47:31 crc kubenswrapper[4781]: E1202 09:47:31.359610 4781 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cd07165d651b1dce117b33839af42ca0d835d0228337caea6f8550ed6b595e13" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 02 09:47:31 crc kubenswrapper[4781]: E1202 09:47:31.361741 4781 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cd07165d651b1dce117b33839af42ca0d835d0228337caea6f8550ed6b595e13" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 02 09:47:31 crc kubenswrapper[4781]: E1202 09:47:31.365965 4781 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cd07165d651b1dce117b33839af42ca0d835d0228337caea6f8550ed6b595e13" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 02 09:47:31 crc kubenswrapper[4781]: E1202 09:47:31.366043 4781 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" 
podUID="4ac5dc21-2c35-4977-9ed8-c1af566af6d5" containerName="nova-scheduler-scheduler" Dec 02 09:47:31 crc kubenswrapper[4781]: I1202 09:47:31.897024 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9t28j" event={"ID":"bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1","Type":"ContainerStarted","Data":"8647453b2102fa649b05023dade2b39b33f6485b73e04331b7e2c73f96181914"} Dec 02 09:47:31 crc kubenswrapper[4781]: I1202 09:47:31.919982 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9t28j" podStartSLOduration=3.8075363920000003 podStartE2EDuration="6.919956555s" podCreationTimestamp="2025-12-02 09:47:25 +0000 UTC" firstStartedPulling="2025-12-02 09:47:27.844212706 +0000 UTC m=+1610.668086585" lastFinishedPulling="2025-12-02 09:47:30.956632869 +0000 UTC m=+1613.780506748" observedRunningTime="2025-12-02 09:47:31.914667831 +0000 UTC m=+1614.738541710" watchObservedRunningTime="2025-12-02 09:47:31.919956555 +0000 UTC m=+1614.743830434" Dec 02 09:47:33 crc kubenswrapper[4781]: I1202 09:47:33.537422 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 09:47:33 crc kubenswrapper[4781]: I1202 09:47:33.605865 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ac5dc21-2c35-4977-9ed8-c1af566af6d5-config-data\") pod \"4ac5dc21-2c35-4977-9ed8-c1af566af6d5\" (UID: \"4ac5dc21-2c35-4977-9ed8-c1af566af6d5\") " Dec 02 09:47:33 crc kubenswrapper[4781]: I1202 09:47:33.606260 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ac5dc21-2c35-4977-9ed8-c1af566af6d5-combined-ca-bundle\") pod \"4ac5dc21-2c35-4977-9ed8-c1af566af6d5\" (UID: \"4ac5dc21-2c35-4977-9ed8-c1af566af6d5\") " Dec 02 09:47:33 crc kubenswrapper[4781]: I1202 09:47:33.606321 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6n2w\" (UniqueName: \"kubernetes.io/projected/4ac5dc21-2c35-4977-9ed8-c1af566af6d5-kube-api-access-g6n2w\") pod \"4ac5dc21-2c35-4977-9ed8-c1af566af6d5\" (UID: \"4ac5dc21-2c35-4977-9ed8-c1af566af6d5\") " Dec 02 09:47:33 crc kubenswrapper[4781]: I1202 09:47:33.613103 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ac5dc21-2c35-4977-9ed8-c1af566af6d5-kube-api-access-g6n2w" (OuterVolumeSpecName: "kube-api-access-g6n2w") pod "4ac5dc21-2c35-4977-9ed8-c1af566af6d5" (UID: "4ac5dc21-2c35-4977-9ed8-c1af566af6d5"). InnerVolumeSpecName "kube-api-access-g6n2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:47:33 crc kubenswrapper[4781]: I1202 09:47:33.643834 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ac5dc21-2c35-4977-9ed8-c1af566af6d5-config-data" (OuterVolumeSpecName: "config-data") pod "4ac5dc21-2c35-4977-9ed8-c1af566af6d5" (UID: "4ac5dc21-2c35-4977-9ed8-c1af566af6d5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:47:33 crc kubenswrapper[4781]: I1202 09:47:33.654537 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ac5dc21-2c35-4977-9ed8-c1af566af6d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4ac5dc21-2c35-4977-9ed8-c1af566af6d5" (UID: "4ac5dc21-2c35-4977-9ed8-c1af566af6d5"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:47:33 crc kubenswrapper[4781]: I1202 09:47:33.708297 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ac5dc21-2c35-4977-9ed8-c1af566af6d5-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:47:33 crc kubenswrapper[4781]: I1202 09:47:33.708332 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ac5dc21-2c35-4977-9ed8-c1af566af6d5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:47:33 crc kubenswrapper[4781]: I1202 09:47:33.708344 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6n2w\" (UniqueName: \"kubernetes.io/projected/4ac5dc21-2c35-4977-9ed8-c1af566af6d5-kube-api-access-g6n2w\") on node \"crc\" DevicePath \"\"" Dec 02 09:47:33 crc kubenswrapper[4781]: I1202 09:47:33.915159 4781 generic.go:334] "Generic (PLEG): container finished" podID="4ac5dc21-2c35-4977-9ed8-c1af566af6d5" containerID="cd07165d651b1dce117b33839af42ca0d835d0228337caea6f8550ed6b595e13" exitCode=0 Dec 02 09:47:33 crc kubenswrapper[4781]: I1202 09:47:33.915217 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4ac5dc21-2c35-4977-9ed8-c1af566af6d5","Type":"ContainerDied","Data":"cd07165d651b1dce117b33839af42ca0d835d0228337caea6f8550ed6b595e13"} Dec 02 09:47:33 crc kubenswrapper[4781]: I1202 09:47:33.915264 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 09:47:33 crc kubenswrapper[4781]: I1202 09:47:33.915284 4781 scope.go:117] "RemoveContainer" containerID="cd07165d651b1dce117b33839af42ca0d835d0228337caea6f8550ed6b595e13" Dec 02 09:47:33 crc kubenswrapper[4781]: I1202 09:47:33.915272 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4ac5dc21-2c35-4977-9ed8-c1af566af6d5","Type":"ContainerDied","Data":"e7256e8aced0d99c60cb2c05e925a19cdcbe4e6fb789ee6f1b6612fe55f54181"} Dec 02 09:47:33 crc kubenswrapper[4781]: I1202 09:47:33.917219 4781 generic.go:334] "Generic (PLEG): container finished" podID="a9215fee-a493-48f3-a67c-21fbb4256f55" containerID="45532ac872502b7cd761561dde6edf9b00f15e4588cc2b06e6896031fbbb55ce" exitCode=0 Dec 02 09:47:33 crc kubenswrapper[4781]: I1202 09:47:33.917251 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-qftr6" event={"ID":"a9215fee-a493-48f3-a67c-21fbb4256f55","Type":"ContainerDied","Data":"45532ac872502b7cd761561dde6edf9b00f15e4588cc2b06e6896031fbbb55ce"} Dec 02 09:47:33 crc kubenswrapper[4781]: I1202 09:47:33.920105 4781 generic.go:334] "Generic (PLEG): container finished" podID="f72c2f91-025d-405a-8a67-fe72c205dda3" containerID="6b9b2a3954a76af1cbc651468cdc17f030862d0c4ac64f79707a46f71cff97eb" exitCode=0 Dec 02 09:47:33 crc kubenswrapper[4781]: I1202 09:47:33.920142 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f72c2f91-025d-405a-8a67-fe72c205dda3","Type":"ContainerDied","Data":"6b9b2a3954a76af1cbc651468cdc17f030862d0c4ac64f79707a46f71cff97eb"} Dec 02 09:47:33 crc kubenswrapper[4781]: I1202 09:47:33.940978 4781 scope.go:117] "RemoveContainer" containerID="cd07165d651b1dce117b33839af42ca0d835d0228337caea6f8550ed6b595e13" Dec 02 09:47:33 crc kubenswrapper[4781]: E1202 09:47:33.941409 4781 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"cd07165d651b1dce117b33839af42ca0d835d0228337caea6f8550ed6b595e13\": container with ID starting with cd07165d651b1dce117b33839af42ca0d835d0228337caea6f8550ed6b595e13 not found: ID does not exist" containerID="cd07165d651b1dce117b33839af42ca0d835d0228337caea6f8550ed6b595e13" Dec 02 09:47:33 crc kubenswrapper[4781]: I1202 09:47:33.941446 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd07165d651b1dce117b33839af42ca0d835d0228337caea6f8550ed6b595e13"} err="failed to get container status \"cd07165d651b1dce117b33839af42ca0d835d0228337caea6f8550ed6b595e13\": rpc error: code = NotFound desc = could not find container \"cd07165d651b1dce117b33839af42ca0d835d0228337caea6f8550ed6b595e13\": container with ID starting with cd07165d651b1dce117b33839af42ca0d835d0228337caea6f8550ed6b595e13 not found: ID does not exist" Dec 02 09:47:33 crc kubenswrapper[4781]: I1202 09:47:33.956834 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 09:47:33 crc kubenswrapper[4781]: I1202 09:47:33.965997 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 09:47:33 crc kubenswrapper[4781]: I1202 09:47:33.975519 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 09:47:33 crc kubenswrapper[4781]: E1202 09:47:33.976368 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4007cb57-06af-4119-bf9e-a2e2aa8509cd" containerName="dnsmasq-dns" Dec 02 09:47:33 crc kubenswrapper[4781]: I1202 09:47:33.976391 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="4007cb57-06af-4119-bf9e-a2e2aa8509cd" containerName="dnsmasq-dns" Dec 02 09:47:33 crc kubenswrapper[4781]: E1202 09:47:33.976427 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4007cb57-06af-4119-bf9e-a2e2aa8509cd" containerName="init" Dec 02 09:47:33 crc kubenswrapper[4781]: I1202 09:47:33.976436 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="4007cb57-06af-4119-bf9e-a2e2aa8509cd" containerName="init" Dec 02 09:47:33 crc kubenswrapper[4781]: E1202 09:47:33.976462 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5c50078-d02c-43a0-83b1-d92b6f5a6e0e" containerName="nova-manage" Dec 02 09:47:33 crc kubenswrapper[4781]: I1202 09:47:33.976472 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5c50078-d02c-43a0-83b1-d92b6f5a6e0e" containerName="nova-manage" Dec 02 09:47:33 crc kubenswrapper[4781]: E1202 09:47:33.976483 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ac5dc21-2c35-4977-9ed8-c1af566af6d5" containerName="nova-scheduler-scheduler" Dec 02 09:47:33 crc kubenswrapper[4781]: I1202 09:47:33.976490 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ac5dc21-2c35-4977-9ed8-c1af566af6d5" containerName="nova-scheduler-scheduler" Dec 02 09:47:33 crc kubenswrapper[4781]: I1202 09:47:33.976691 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ac5dc21-2c35-4977-9ed8-c1af566af6d5" containerName="nova-scheduler-scheduler" Dec 02 09:47:33 crc kubenswrapper[4781]: I1202 09:47:33.976711 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5c50078-d02c-43a0-83b1-d92b6f5a6e0e" containerName="nova-manage" Dec 02 09:47:33 crc kubenswrapper[4781]: I1202 09:47:33.976735 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="4007cb57-06af-4119-bf9e-a2e2aa8509cd" 
containerName="dnsmasq-dns" Dec 02 09:47:33 crc kubenswrapper[4781]: I1202 09:47:33.977540 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 09:47:33 crc kubenswrapper[4781]: I1202 09:47:33.979869 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 02 09:47:33 crc kubenswrapper[4781]: I1202 09:47:33.996623 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 09:47:34 crc kubenswrapper[4781]: I1202 09:47:34.012801 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6f5e1b7-0c56-447b-b241-56cc0fc9e7af-config-data\") pod \"nova-scheduler-0\" (UID: \"b6f5e1b7-0c56-447b-b241-56cc0fc9e7af\") " pod="openstack/nova-scheduler-0" Dec 02 09:47:34 crc kubenswrapper[4781]: I1202 09:47:34.013007 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6f5e1b7-0c56-447b-b241-56cc0fc9e7af-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b6f5e1b7-0c56-447b-b241-56cc0fc9e7af\") " pod="openstack/nova-scheduler-0" Dec 02 09:47:34 crc kubenswrapper[4781]: I1202 09:47:34.013091 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2mmb\" (UniqueName: \"kubernetes.io/projected/b6f5e1b7-0c56-447b-b241-56cc0fc9e7af-kube-api-access-l2mmb\") pod \"nova-scheduler-0\" (UID: \"b6f5e1b7-0c56-447b-b241-56cc0fc9e7af\") " pod="openstack/nova-scheduler-0" Dec 02 09:47:34 crc kubenswrapper[4781]: I1202 09:47:34.113903 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6f5e1b7-0c56-447b-b241-56cc0fc9e7af-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b6f5e1b7-0c56-447b-b241-56cc0fc9e7af\") " pod="openstack/nova-scheduler-0" Dec 02 09:47:34 crc kubenswrapper[4781]: I1202 09:47:34.114013 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2mmb\" (UniqueName: \"kubernetes.io/projected/b6f5e1b7-0c56-447b-b241-56cc0fc9e7af-kube-api-access-l2mmb\") pod \"nova-scheduler-0\" (UID: \"b6f5e1b7-0c56-447b-b241-56cc0fc9e7af\") " pod="openstack/nova-scheduler-0" Dec 02 09:47:34 crc kubenswrapper[4781]: I1202 09:47:34.114047 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6f5e1b7-0c56-447b-b241-56cc0fc9e7af-config-data\") pod \"nova-scheduler-0\" (UID: \"b6f5e1b7-0c56-447b-b241-56cc0fc9e7af\") " pod="openstack/nova-scheduler-0" Dec 02 09:47:34 crc kubenswrapper[4781]: I1202 09:47:34.119169 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6f5e1b7-0c56-447b-b241-56cc0fc9e7af-config-data\") pod \"nova-scheduler-0\" (UID: \"b6f5e1b7-0c56-447b-b241-56cc0fc9e7af\") " pod="openstack/nova-scheduler-0" Dec 02 09:47:34 crc kubenswrapper[4781]: I1202 09:47:34.121797 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6f5e1b7-0c56-447b-b241-56cc0fc9e7af-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b6f5e1b7-0c56-447b-b241-56cc0fc9e7af\") " pod="openstack/nova-scheduler-0" Dec 02 09:47:34 crc kubenswrapper[4781]: I1202 
09:47:34.133485 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2mmb\" (UniqueName: \"kubernetes.io/projected/b6f5e1b7-0c56-447b-b241-56cc0fc9e7af-kube-api-access-l2mmb\") pod \"nova-scheduler-0\" (UID: \"b6f5e1b7-0c56-447b-b241-56cc0fc9e7af\") " pod="openstack/nova-scheduler-0" Dec 02 09:47:34 crc kubenswrapper[4781]: I1202 09:47:34.296133 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 09:47:34 crc kubenswrapper[4781]: I1202 09:47:34.490315 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 09:47:34 crc kubenswrapper[4781]: I1202 09:47:34.626451 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f72c2f91-025d-405a-8a67-fe72c205dda3-combined-ca-bundle\") pod \"f72c2f91-025d-405a-8a67-fe72c205dda3\" (UID: \"f72c2f91-025d-405a-8a67-fe72c205dda3\") " Dec 02 09:47:34 crc kubenswrapper[4781]: I1202 09:47:34.626777 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f72c2f91-025d-405a-8a67-fe72c205dda3-logs\") pod \"f72c2f91-025d-405a-8a67-fe72c205dda3\" (UID: \"f72c2f91-025d-405a-8a67-fe72c205dda3\") " Dec 02 09:47:34 crc kubenswrapper[4781]: I1202 09:47:34.626820 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hxxl\" (UniqueName: \"kubernetes.io/projected/f72c2f91-025d-405a-8a67-fe72c205dda3-kube-api-access-5hxxl\") pod \"f72c2f91-025d-405a-8a67-fe72c205dda3\" (UID: \"f72c2f91-025d-405a-8a67-fe72c205dda3\") " Dec 02 09:47:34 crc kubenswrapper[4781]: I1202 09:47:34.626886 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f72c2f91-025d-405a-8a67-fe72c205dda3-config-data\") pod \"f72c2f91-025d-405a-8a67-fe72c205dda3\" (UID: \"f72c2f91-025d-405a-8a67-fe72c205dda3\") " Dec 02 09:47:34 crc kubenswrapper[4781]: I1202 09:47:34.630803 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f72c2f91-025d-405a-8a67-fe72c205dda3-logs" (OuterVolumeSpecName: "logs") pod "f72c2f91-025d-405a-8a67-fe72c205dda3" (UID: "f72c2f91-025d-405a-8a67-fe72c205dda3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:47:34 crc kubenswrapper[4781]: I1202 09:47:34.649391 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f72c2f91-025d-405a-8a67-fe72c205dda3-kube-api-access-5hxxl" (OuterVolumeSpecName: "kube-api-access-5hxxl") pod "f72c2f91-025d-405a-8a67-fe72c205dda3" (UID: "f72c2f91-025d-405a-8a67-fe72c205dda3"). InnerVolumeSpecName "kube-api-access-5hxxl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:47:34 crc kubenswrapper[4781]: I1202 09:47:34.728637 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f72c2f91-025d-405a-8a67-fe72c205dda3-logs\") on node \"crc\" DevicePath \"\"" Dec 02 09:47:34 crc kubenswrapper[4781]: I1202 09:47:34.728678 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hxxl\" (UniqueName: \"kubernetes.io/projected/f72c2f91-025d-405a-8a67-fe72c205dda3-kube-api-access-5hxxl\") on node \"crc\" DevicePath \"\"" Dec 02 09:47:34 crc kubenswrapper[4781]: I1202 09:47:34.735105 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f72c2f91-025d-405a-8a67-fe72c205dda3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f72c2f91-025d-405a-8a67-fe72c205dda3" (UID: "f72c2f91-025d-405a-8a67-fe72c205dda3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:47:34 crc kubenswrapper[4781]: I1202 09:47:34.768409 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f72c2f91-025d-405a-8a67-fe72c205dda3-config-data" (OuterVolumeSpecName: "config-data") pod "f72c2f91-025d-405a-8a67-fe72c205dda3" (UID: "f72c2f91-025d-405a-8a67-fe72c205dda3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:47:34 crc kubenswrapper[4781]: I1202 09:47:34.831820 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f72c2f91-025d-405a-8a67-fe72c205dda3-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:47:34 crc kubenswrapper[4781]: I1202 09:47:34.831867 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f72c2f91-025d-405a-8a67-fe72c205dda3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:47:34 crc kubenswrapper[4781]: I1202 09:47:34.934277 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 09:47:34 crc kubenswrapper[4781]: I1202 09:47:34.935055 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f72c2f91-025d-405a-8a67-fe72c205dda3","Type":"ContainerDied","Data":"efccadb6d1d9fa00e19c9f342bda9fb1b1452ec43330925dfc26395522fe2d06"} Dec 02 09:47:34 crc kubenswrapper[4781]: I1202 09:47:34.935114 4781 scope.go:117] "RemoveContainer" containerID="6b9b2a3954a76af1cbc651468cdc17f030862d0c4ac64f79707a46f71cff97eb" Dec 02 09:47:34 crc kubenswrapper[4781]: I1202 09:47:34.939446 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 09:47:34 crc kubenswrapper[4781]: I1202 09:47:34.961443 4781 scope.go:117] "RemoveContainer" containerID="e5a6a08ea4277e7a9dbdb745fae02870478c7f2f98529ba563faf2a92ece598c" Dec 02 09:47:34 crc kubenswrapper[4781]: I1202 09:47:34.986054 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 09:47:34 crc kubenswrapper[4781]: I1202 09:47:34.998241 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 02 09:47:35 crc kubenswrapper[4781]: I1202 09:47:35.010603 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 02 09:47:35 crc kubenswrapper[4781]: E1202 09:47:35.011043 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f72c2f91-025d-405a-8a67-fe72c205dda3" containerName="nova-api-log" Dec 02 09:47:35 crc kubenswrapper[4781]: I1202 09:47:35.011068 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f72c2f91-025d-405a-8a67-fe72c205dda3" containerName="nova-api-log" Dec 02 09:47:35 crc kubenswrapper[4781]: E1202 09:47:35.011085 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f72c2f91-025d-405a-8a67-fe72c205dda3" containerName="nova-api-api" Dec 02 09:47:35 crc kubenswrapper[4781]: I1202 09:47:35.011092 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f72c2f91-025d-405a-8a67-fe72c205dda3" containerName="nova-api-api" Dec 02 09:47:35 crc kubenswrapper[4781]: I1202 09:47:35.011308 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f72c2f91-025d-405a-8a67-fe72c205dda3" containerName="nova-api-log" Dec 02 09:47:35 crc kubenswrapper[4781]: I1202 09:47:35.011342 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f72c2f91-025d-405a-8a67-fe72c205dda3" containerName="nova-api-api" Dec 02 09:47:35 crc kubenswrapper[4781]: I1202 09:47:35.012764 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 09:47:35 crc kubenswrapper[4781]: I1202 09:47:35.018853 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 02 09:47:35 crc kubenswrapper[4781]: I1202 09:47:35.023119 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 09:47:35 crc kubenswrapper[4781]: I1202 09:47:35.050181 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d-logs\") pod \"nova-api-0\" (UID: \"e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d\") " pod="openstack/nova-api-0" Dec 02 09:47:35 crc kubenswrapper[4781]: I1202 09:47:35.050265 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d\") " pod="openstack/nova-api-0" Dec 02 09:47:35 crc kubenswrapper[4781]: I1202 09:47:35.050295 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8kxz\" (UniqueName: \"kubernetes.io/projected/e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d-kube-api-access-r8kxz\") pod \"nova-api-0\" (UID: \"e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d\") " pod="openstack/nova-api-0" Dec 02 09:47:35 crc kubenswrapper[4781]: I1202 09:47:35.050328 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d-config-data\") pod \"nova-api-0\" (UID: \"e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d\") " pod="openstack/nova-api-0" Dec 02 09:47:35 crc kubenswrapper[4781]: I1202 09:47:35.151904 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d-logs\") pod \"nova-api-0\" (UID: \"e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d\") " pod="openstack/nova-api-0" Dec 02 09:47:35 crc kubenswrapper[4781]: I1202 09:47:35.152015 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d\") " pod="openstack/nova-api-0" Dec 02 09:47:35 crc kubenswrapper[4781]: I1202 09:47:35.152051 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8kxz\" (UniqueName: \"kubernetes.io/projected/e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d-kube-api-access-r8kxz\") pod \"nova-api-0\" (UID: \"e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d\") " pod="openstack/nova-api-0" Dec 02 09:47:35 crc kubenswrapper[4781]: I1202 09:47:35.152091 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d-config-data\") pod \"nova-api-0\" (UID: \"e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d\") " pod="openstack/nova-api-0" Dec 02 09:47:35 crc kubenswrapper[4781]: I1202 09:47:35.152355 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d-logs\") pod \"nova-api-0\" (UID: \"e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d\") " 
pod="openstack/nova-api-0" Dec 02 09:47:35 crc kubenswrapper[4781]: I1202 09:47:35.157024 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d-config-data\") pod \"nova-api-0\" (UID: \"e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d\") " pod="openstack/nova-api-0" Dec 02 09:47:35 crc kubenswrapper[4781]: I1202 09:47:35.158113 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d\") " pod="openstack/nova-api-0" Dec 02 09:47:35 crc kubenswrapper[4781]: I1202 09:47:35.172813 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8kxz\" (UniqueName: \"kubernetes.io/projected/e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d-kube-api-access-r8kxz\") pod \"nova-api-0\" (UID: \"e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d\") " pod="openstack/nova-api-0" Dec 02 09:47:35 crc kubenswrapper[4781]: I1202 09:47:35.347585 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 09:47:35 crc kubenswrapper[4781]: I1202 09:47:35.420062 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-qftr6" Dec 02 09:47:35 crc kubenswrapper[4781]: I1202 09:47:35.511248 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ac5dc21-2c35-4977-9ed8-c1af566af6d5" path="/var/lib/kubelet/pods/4ac5dc21-2c35-4977-9ed8-c1af566af6d5/volumes" Dec 02 09:47:35 crc kubenswrapper[4781]: I1202 09:47:35.511808 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f72c2f91-025d-405a-8a67-fe72c205dda3" path="/var/lib/kubelet/pods/f72c2f91-025d-405a-8a67-fe72c205dda3/volumes" Dec 02 09:47:35 crc kubenswrapper[4781]: I1202 09:47:35.558620 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9215fee-a493-48f3-a67c-21fbb4256f55-config-data\") pod \"a9215fee-a493-48f3-a67c-21fbb4256f55\" (UID: \"a9215fee-a493-48f3-a67c-21fbb4256f55\") " Dec 02 09:47:35 crc kubenswrapper[4781]: I1202 09:47:35.558716 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9215fee-a493-48f3-a67c-21fbb4256f55-scripts\") pod \"a9215fee-a493-48f3-a67c-21fbb4256f55\" (UID: \"a9215fee-a493-48f3-a67c-21fbb4256f55\") " Dec 02 09:47:35 crc kubenswrapper[4781]: I1202 09:47:35.558750 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9215fee-a493-48f3-a67c-21fbb4256f55-combined-ca-bundle\") pod \"a9215fee-a493-48f3-a67c-21fbb4256f55\" (UID: \"a9215fee-a493-48f3-a67c-21fbb4256f55\") " Dec 02 09:47:35 crc kubenswrapper[4781]: I1202 09:47:35.558892 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fss6d\" (UniqueName: \"kubernetes.io/projected/a9215fee-a493-48f3-a67c-21fbb4256f55-kube-api-access-fss6d\") pod \"a9215fee-a493-48f3-a67c-21fbb4256f55\" (UID: \"a9215fee-a493-48f3-a67c-21fbb4256f55\") " Dec 02 09:47:35 crc kubenswrapper[4781]: I1202 09:47:35.566092 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9215fee-a493-48f3-a67c-21fbb4256f55-scripts" 
(OuterVolumeSpecName: "scripts") pod "a9215fee-a493-48f3-a67c-21fbb4256f55" (UID: "a9215fee-a493-48f3-a67c-21fbb4256f55"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:47:35 crc kubenswrapper[4781]: I1202 09:47:35.566094 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9215fee-a493-48f3-a67c-21fbb4256f55-kube-api-access-fss6d" (OuterVolumeSpecName: "kube-api-access-fss6d") pod "a9215fee-a493-48f3-a67c-21fbb4256f55" (UID: "a9215fee-a493-48f3-a67c-21fbb4256f55"). InnerVolumeSpecName "kube-api-access-fss6d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:47:35 crc kubenswrapper[4781]: I1202 09:47:35.593930 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9215fee-a493-48f3-a67c-21fbb4256f55-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a9215fee-a493-48f3-a67c-21fbb4256f55" (UID: "a9215fee-a493-48f3-a67c-21fbb4256f55"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:47:35 crc kubenswrapper[4781]: I1202 09:47:35.599528 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9215fee-a493-48f3-a67c-21fbb4256f55-config-data" (OuterVolumeSpecName: "config-data") pod "a9215fee-a493-48f3-a67c-21fbb4256f55" (UID: "a9215fee-a493-48f3-a67c-21fbb4256f55"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:47:35 crc kubenswrapper[4781]: I1202 09:47:35.662521 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9215fee-a493-48f3-a67c-21fbb4256f55-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:47:35 crc kubenswrapper[4781]: I1202 09:47:35.662551 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9215fee-a493-48f3-a67c-21fbb4256f55-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:47:35 crc kubenswrapper[4781]: I1202 09:47:35.662561 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fss6d\" (UniqueName: \"kubernetes.io/projected/a9215fee-a493-48f3-a67c-21fbb4256f55-kube-api-access-fss6d\") on node \"crc\" DevicePath \"\"" Dec 02 09:47:35 crc kubenswrapper[4781]: I1202 09:47:35.662569 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9215fee-a493-48f3-a67c-21fbb4256f55-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:47:35 crc kubenswrapper[4781]: I1202 09:47:35.814814 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 09:47:35 crc kubenswrapper[4781]: I1202 09:47:35.949131 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9t28j" Dec 02 09:47:35 crc kubenswrapper[4781]: I1202 09:47:35.949287 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9t28j" Dec 02 09:47:35 crc kubenswrapper[4781]: I1202 09:47:35.956219 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b6f5e1b7-0c56-447b-b241-56cc0fc9e7af","Type":"ContainerStarted","Data":"9c5e32171b18f60b0a0543666ce99fdfae3aad608a03d90c9b2bf411e50e7169"} Dec 02 09:47:35 crc kubenswrapper[4781]: I1202 09:47:35.956284 4781 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-scheduler-0" event={"ID":"b6f5e1b7-0c56-447b-b241-56cc0fc9e7af","Type":"ContainerStarted","Data":"e6d99c1d80c8ff13e675cebc9f3364728e06b28f25777c871cd86c62bd1d13f2"} Dec 02 09:47:35 crc kubenswrapper[4781]: I1202 09:47:35.983862 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-qftr6" event={"ID":"a9215fee-a493-48f3-a67c-21fbb4256f55","Type":"ContainerDied","Data":"1ffce4a66ac18dba6098a240c5715e5ca18441bb96cdf55e337f6433524b82c7"} Dec 02 09:47:35 crc kubenswrapper[4781]: I1202 09:47:35.983910 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ffce4a66ac18dba6098a240c5715e5ca18441bb96cdf55e337f6433524b82c7" Dec 02 09:47:35 crc kubenswrapper[4781]: I1202 09:47:35.984060 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-qftr6" Dec 02 09:47:35 crc kubenswrapper[4781]: I1202 09:47:35.994369 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d","Type":"ContainerStarted","Data":"3c4eaaf1abd9b3b9c92cefacc1774cc2530c9e2a2d2dbf59ab7af452f054d713"} Dec 02 09:47:36 crc kubenswrapper[4781]: I1202 09:47:36.012723 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.012687014 podStartE2EDuration="3.012687014s" podCreationTimestamp="2025-12-02 09:47:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:47:35.976460583 +0000 UTC m=+1618.800334462" watchObservedRunningTime="2025-12-02 09:47:36.012687014 +0000 UTC m=+1618.836560893" Dec 02 09:47:36 crc kubenswrapper[4781]: I1202 09:47:36.037943 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 02 09:47:36 crc kubenswrapper[4781]: E1202 09:47:36.039270 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9215fee-a493-48f3-a67c-21fbb4256f55" containerName="nova-cell1-conductor-db-sync" Dec 02 09:47:36 crc kubenswrapper[4781]: I1202 09:47:36.039297 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9215fee-a493-48f3-a67c-21fbb4256f55" containerName="nova-cell1-conductor-db-sync" Dec 02 09:47:36 crc kubenswrapper[4781]: I1202 09:47:36.039549 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9215fee-a493-48f3-a67c-21fbb4256f55" containerName="nova-cell1-conductor-db-sync" Dec 02 09:47:36 crc kubenswrapper[4781]: I1202 09:47:36.040186 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 02 09:47:36 crc kubenswrapper[4781]: I1202 09:47:36.040826 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9t28j" Dec 02 09:47:36 crc kubenswrapper[4781]: I1202 09:47:36.043827 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 02 09:47:36 crc kubenswrapper[4781]: I1202 09:47:36.066541 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 02 09:47:36 crc kubenswrapper[4781]: I1202 09:47:36.173834 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cde74fe6-799b-4da8-974d-3fefd2af69aa-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"cde74fe6-799b-4da8-974d-3fefd2af69aa\") " pod="openstack/nova-cell1-conductor-0" Dec 02 09:47:36 crc kubenswrapper[4781]: I1202 09:47:36.174271 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cde74fe6-799b-4da8-974d-3fefd2af69aa-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"cde74fe6-799b-4da8-974d-3fefd2af69aa\") " pod="openstack/nova-cell1-conductor-0" Dec 02 09:47:36 crc kubenswrapper[4781]: I1202 09:47:36.174321 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhl4s\" (UniqueName: \"kubernetes.io/projected/cde74fe6-799b-4da8-974d-3fefd2af69aa-kube-api-access-mhl4s\") pod \"nova-cell1-conductor-0\" (UID: \"cde74fe6-799b-4da8-974d-3fefd2af69aa\") " pod="openstack/nova-cell1-conductor-0" Dec 02 09:47:36 crc kubenswrapper[4781]: I1202 09:47:36.275618 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cde74fe6-799b-4da8-974d-3fefd2af69aa-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"cde74fe6-799b-4da8-974d-3fefd2af69aa\") " pod="openstack/nova-cell1-conductor-0" Dec 02 09:47:36 crc kubenswrapper[4781]: I1202 09:47:36.275683 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhl4s\" (UniqueName: \"kubernetes.io/projected/cde74fe6-799b-4da8-974d-3fefd2af69aa-kube-api-access-mhl4s\") pod \"nova-cell1-conductor-0\" (UID: \"cde74fe6-799b-4da8-974d-3fefd2af69aa\") " pod="openstack/nova-cell1-conductor-0" Dec 02 09:47:36 crc kubenswrapper[4781]: I1202 09:47:36.275808 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cde74fe6-799b-4da8-974d-3fefd2af69aa-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"cde74fe6-799b-4da8-974d-3fefd2af69aa\") " pod="openstack/nova-cell1-conductor-0" Dec 02 09:47:36 crc kubenswrapper[4781]: I1202 09:47:36.285676 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cde74fe6-799b-4da8-974d-3fefd2af69aa-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"cde74fe6-799b-4da8-974d-3fefd2af69aa\") " pod="openstack/nova-cell1-conductor-0" Dec 02 09:47:36 crc kubenswrapper[4781]: I1202 09:47:36.285732 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cde74fe6-799b-4da8-974d-3fefd2af69aa-config-data\") pod 
\"nova-cell1-conductor-0\" (UID: \"cde74fe6-799b-4da8-974d-3fefd2af69aa\") " pod="openstack/nova-cell1-conductor-0" Dec 02 09:47:36 crc kubenswrapper[4781]: I1202 09:47:36.292888 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhl4s\" (UniqueName: \"kubernetes.io/projected/cde74fe6-799b-4da8-974d-3fefd2af69aa-kube-api-access-mhl4s\") pod \"nova-cell1-conductor-0\" (UID: \"cde74fe6-799b-4da8-974d-3fefd2af69aa\") " pod="openstack/nova-cell1-conductor-0" Dec 02 09:47:36 crc kubenswrapper[4781]: I1202 09:47:36.502058 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 02 09:47:37 crc kubenswrapper[4781]: I1202 09:47:37.002772 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 02 09:47:37 crc kubenswrapper[4781]: I1202 09:47:37.005683 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d","Type":"ContainerStarted","Data":"478cd18c4c1a4759a237a68b2bac6ad5cd6591aa5d48537bf10f8ec5257aec34"} Dec 02 09:47:37 crc kubenswrapper[4781]: I1202 09:47:37.005729 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d","Type":"ContainerStarted","Data":"1e9c253db3d4499c4421a0be29f78f56f901c48e43e105abff413d540f00a655"} Dec 02 09:47:37 crc kubenswrapper[4781]: I1202 09:47:37.082652 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9t28j" Dec 02 09:47:37 crc kubenswrapper[4781]: I1202 09:47:37.103738 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.103713269 podStartE2EDuration="3.103713269s" podCreationTimestamp="2025-12-02 09:47:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:47:37.033408125 +0000 UTC m=+1619.857282004" watchObservedRunningTime="2025-12-02 09:47:37.103713269 +0000 UTC m=+1619.927587158" Dec 02 09:47:37 crc kubenswrapper[4781]: I1202 09:47:37.138110 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9t28j"] Dec 02 09:47:37 crc kubenswrapper[4781]: I1202 09:47:37.842466 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 02 09:47:38 crc kubenswrapper[4781]: I1202 09:47:38.015914 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"cde74fe6-799b-4da8-974d-3fefd2af69aa","Type":"ContainerStarted","Data":"8719b4cef8e137d7aadb69db4c1e1b2519cbc1bf9e1ced6ee5962621c36c6a24"} Dec 02 09:47:38 crc kubenswrapper[4781]: I1202 09:47:38.015971 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"cde74fe6-799b-4da8-974d-3fefd2af69aa","Type":"ContainerStarted","Data":"48cb87889775b59d29cc1921cac8cc918c6cc9c31ed4a70fcb814641fd78eeae"} Dec 02 09:47:38 crc kubenswrapper[4781]: I1202 09:47:38.042240 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.042219523 podStartE2EDuration="2.042219523s" podCreationTimestamp="2025-12-02 09:47:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-02 09:47:38.037465304 +0000 UTC m=+1620.861339183" watchObservedRunningTime="2025-12-02 09:47:38.042219523 +0000 UTC m=+1620.866093392" Dec 02 09:47:39 crc kubenswrapper[4781]: I1202 09:47:39.023388 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9t28j" podUID="bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1" containerName="registry-server" containerID="cri-o://8647453b2102fa649b05023dade2b39b33f6485b73e04331b7e2c73f96181914" gracePeriod=2 Dec 02 09:47:39 crc kubenswrapper[4781]: I1202 09:47:39.023694 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 02 09:47:39 crc kubenswrapper[4781]: I1202 09:47:39.297026 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 02 09:47:39 crc kubenswrapper[4781]: I1202 09:47:39.467488 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9t28j" Dec 02 09:47:39 crc kubenswrapper[4781]: I1202 09:47:39.637945 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1-utilities\") pod \"bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1\" (UID: \"bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1\") " Dec 02 09:47:39 crc kubenswrapper[4781]: I1202 09:47:39.638040 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gc5g8\" (UniqueName: \"kubernetes.io/projected/bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1-kube-api-access-gc5g8\") pod \"bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1\" (UID: \"bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1\") " Dec 02 09:47:39 crc kubenswrapper[4781]: I1202 09:47:39.638108 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1-catalog-content\") pod \"bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1\" (UID: \"bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1\") " Dec 02 09:47:39 crc kubenswrapper[4781]: I1202 09:47:39.638843 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1-utilities" (OuterVolumeSpecName: "utilities") pod "bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1" (UID: "bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:47:39 crc kubenswrapper[4781]: I1202 09:47:39.639114 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 09:47:39 crc kubenswrapper[4781]: I1202 09:47:39.653495 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1-kube-api-access-gc5g8" (OuterVolumeSpecName: "kube-api-access-gc5g8") pod "bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1" (UID: "bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1"). InnerVolumeSpecName "kube-api-access-gc5g8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:47:39 crc kubenswrapper[4781]: I1202 09:47:39.692088 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1" (UID: "bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:47:39 crc kubenswrapper[4781]: I1202 09:47:39.741474 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gc5g8\" (UniqueName: \"kubernetes.io/projected/bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1-kube-api-access-gc5g8\") on node \"crc\" DevicePath \"\"" Dec 02 09:47:39 crc kubenswrapper[4781]: I1202 09:47:39.741519 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 09:47:40 crc kubenswrapper[4781]: I1202 09:47:40.034004 4781 generic.go:334] "Generic (PLEG): container finished" podID="bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1" containerID="8647453b2102fa649b05023dade2b39b33f6485b73e04331b7e2c73f96181914" exitCode=0 Dec 02 09:47:40 crc kubenswrapper[4781]: I1202 09:47:40.034060 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9t28j" event={"ID":"bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1","Type":"ContainerDied","Data":"8647453b2102fa649b05023dade2b39b33f6485b73e04331b7e2c73f96181914"} Dec 02 09:47:40 crc kubenswrapper[4781]: I1202 09:47:40.034116 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9t28j" event={"ID":"bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1","Type":"ContainerDied","Data":"822b0f87ea8f6586ba7228d19c8e01cb7a360c2af60ba74b162f72578e2eb8fc"} Dec 02 09:47:40 crc kubenswrapper[4781]: I1202 09:47:40.034145 4781 scope.go:117] "RemoveContainer" containerID="8647453b2102fa649b05023dade2b39b33f6485b73e04331b7e2c73f96181914" Dec 02 09:47:40 crc kubenswrapper[4781]: I1202 09:47:40.034141 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9t28j" Dec 02 09:47:40 crc kubenswrapper[4781]: I1202 09:47:40.071437 4781 scope.go:117] "RemoveContainer" containerID="a07f2c8719be7b238972b19d0788250771aeb593492d7ee9dd6ea04e274dd11f" Dec 02 09:47:40 crc kubenswrapper[4781]: I1202 09:47:40.081310 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9t28j"] Dec 02 09:47:40 crc kubenswrapper[4781]: I1202 09:47:40.094348 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9t28j"] Dec 02 09:47:40 crc kubenswrapper[4781]: I1202 09:47:40.105170 4781 scope.go:117] "RemoveContainer" containerID="816dcd4f6e147cd7b6f262132e53efda737a302fc411362ac395e30030c70698" Dec 02 09:47:40 crc kubenswrapper[4781]: I1202 09:47:40.149618 4781 scope.go:117] "RemoveContainer" containerID="8647453b2102fa649b05023dade2b39b33f6485b73e04331b7e2c73f96181914" Dec 02 09:47:40 crc kubenswrapper[4781]: E1202 09:47:40.155972 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8647453b2102fa649b05023dade2b39b33f6485b73e04331b7e2c73f96181914\": container with ID starting with 8647453b2102fa649b05023dade2b39b33f6485b73e04331b7e2c73f96181914 not found: ID does not exist" containerID="8647453b2102fa649b05023dade2b39b33f6485b73e04331b7e2c73f96181914" Dec 02 09:47:40 crc kubenswrapper[4781]: I1202 09:47:40.156189 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8647453b2102fa649b05023dade2b39b33f6485b73e04331b7e2c73f96181914"} err="failed to get container status \"8647453b2102fa649b05023dade2b39b33f6485b73e04331b7e2c73f96181914\": rpc error: code = NotFound desc = could not find container \"8647453b2102fa649b05023dade2b39b33f6485b73e04331b7e2c73f96181914\": container with ID starting with 8647453b2102fa649b05023dade2b39b33f6485b73e04331b7e2c73f96181914 not found: ID does not exist" Dec 02 09:47:40 crc kubenswrapper[4781]: I1202 09:47:40.156337 4781 scope.go:117] "RemoveContainer" containerID="a07f2c8719be7b238972b19d0788250771aeb593492d7ee9dd6ea04e274dd11f" Dec 02 09:47:40 crc kubenswrapper[4781]: E1202 09:47:40.156833 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a07f2c8719be7b238972b19d0788250771aeb593492d7ee9dd6ea04e274dd11f\": container with ID starting with a07f2c8719be7b238972b19d0788250771aeb593492d7ee9dd6ea04e274dd11f not found: ID does not exist" containerID="a07f2c8719be7b238972b19d0788250771aeb593492d7ee9dd6ea04e274dd11f" Dec 02 09:47:40 crc kubenswrapper[4781]: I1202 09:47:40.156955 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a07f2c8719be7b238972b19d0788250771aeb593492d7ee9dd6ea04e274dd11f"} err="failed to get container status \"a07f2c8719be7b238972b19d0788250771aeb593492d7ee9dd6ea04e274dd11f\": rpc error: code = NotFound desc = could not find container \"a07f2c8719be7b238972b19d0788250771aeb593492d7ee9dd6ea04e274dd11f\": container with ID starting with a07f2c8719be7b238972b19d0788250771aeb593492d7ee9dd6ea04e274dd11f not found: ID does not exist" Dec 02 09:47:40 crc kubenswrapper[4781]: I1202 09:47:40.157047 4781 scope.go:117] "RemoveContainer" containerID="816dcd4f6e147cd7b6f262132e53efda737a302fc411362ac395e30030c70698" Dec 02 09:47:40 crc kubenswrapper[4781]: E1202 09:47:40.157415 4781 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"816dcd4f6e147cd7b6f262132e53efda737a302fc411362ac395e30030c70698\": container with ID starting with 816dcd4f6e147cd7b6f262132e53efda737a302fc411362ac395e30030c70698 not found: ID does not exist" containerID="816dcd4f6e147cd7b6f262132e53efda737a302fc411362ac395e30030c70698" Dec 02 09:47:40 crc kubenswrapper[4781]: I1202 09:47:40.157532 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"816dcd4f6e147cd7b6f262132e53efda737a302fc411362ac395e30030c70698"} err="failed to get container status \"816dcd4f6e147cd7b6f262132e53efda737a302fc411362ac395e30030c70698\": rpc error: code = NotFound desc = could not find container \"816dcd4f6e147cd7b6f262132e53efda737a302fc411362ac395e30030c70698\": container with ID starting with 816dcd4f6e147cd7b6f262132e53efda737a302fc411362ac395e30030c70698 not found: ID does not exist" Dec 02 09:47:41 crc kubenswrapper[4781]: I1202 09:47:41.510839 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1" path="/var/lib/kubelet/pods/bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1/volumes" Dec 02 09:47:41 crc kubenswrapper[4781]: I1202 09:47:41.851942 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 09:47:41 crc kubenswrapper[4781]: I1202 09:47:41.852181 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="284c6c26-76ca-4800-b40f-51528de0c015" containerName="kube-state-metrics" containerID="cri-o://a167a50db94270b190aadd67c5a28422c38e064f8ad600d65560cd199860b782" gracePeriod=30 Dec 02 09:47:42 crc kubenswrapper[4781]: I1202 09:47:42.472483 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 02 09:47:42 crc kubenswrapper[4781]: I1202 09:47:42.600224 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ld4pf\" (UniqueName: \"kubernetes.io/projected/284c6c26-76ca-4800-b40f-51528de0c015-kube-api-access-ld4pf\") pod \"284c6c26-76ca-4800-b40f-51528de0c015\" (UID: \"284c6c26-76ca-4800-b40f-51528de0c015\") " Dec 02 09:47:42 crc kubenswrapper[4781]: I1202 09:47:42.606994 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/284c6c26-76ca-4800-b40f-51528de0c015-kube-api-access-ld4pf" (OuterVolumeSpecName: "kube-api-access-ld4pf") pod "284c6c26-76ca-4800-b40f-51528de0c015" (UID: "284c6c26-76ca-4800-b40f-51528de0c015"). InnerVolumeSpecName "kube-api-access-ld4pf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:47:42 crc kubenswrapper[4781]: I1202 09:47:42.702713 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ld4pf\" (UniqueName: \"kubernetes.io/projected/284c6c26-76ca-4800-b40f-51528de0c015-kube-api-access-ld4pf\") on node \"crc\" DevicePath \"\"" Dec 02 09:47:43 crc kubenswrapper[4781]: I1202 09:47:43.062653 4781 generic.go:334] "Generic (PLEG): container finished" podID="284c6c26-76ca-4800-b40f-51528de0c015" containerID="a167a50db94270b190aadd67c5a28422c38e064f8ad600d65560cd199860b782" exitCode=2 Dec 02 09:47:43 crc kubenswrapper[4781]: I1202 09:47:43.062694 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 02 09:47:43 crc kubenswrapper[4781]: I1202 09:47:43.062698 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"284c6c26-76ca-4800-b40f-51528de0c015","Type":"ContainerDied","Data":"a167a50db94270b190aadd67c5a28422c38e064f8ad600d65560cd199860b782"} Dec 02 09:47:43 crc kubenswrapper[4781]: I1202 09:47:43.062725 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"284c6c26-76ca-4800-b40f-51528de0c015","Type":"ContainerDied","Data":"27c6e4ded22a7b7f93cac6065ac88dcb766b42ab9f275e7c2e564b1f0ef2b18d"} Dec 02 09:47:43 crc kubenswrapper[4781]: I1202 09:47:43.062745 4781 scope.go:117] "RemoveContainer" containerID="a167a50db94270b190aadd67c5a28422c38e064f8ad600d65560cd199860b782" Dec 02 09:47:43 crc kubenswrapper[4781]: I1202 09:47:43.088371 4781 scope.go:117] "RemoveContainer" containerID="a167a50db94270b190aadd67c5a28422c38e064f8ad600d65560cd199860b782" Dec 02 09:47:43 crc kubenswrapper[4781]: E1202 09:47:43.088967 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a167a50db94270b190aadd67c5a28422c38e064f8ad600d65560cd199860b782\": container with ID starting with a167a50db94270b190aadd67c5a28422c38e064f8ad600d65560cd199860b782 not found: ID does not exist" containerID="a167a50db94270b190aadd67c5a28422c38e064f8ad600d65560cd199860b782" Dec 02 09:47:43 crc kubenswrapper[4781]: I1202 09:47:43.089017 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a167a50db94270b190aadd67c5a28422c38e064f8ad600d65560cd199860b782"} err="failed to get container status \"a167a50db94270b190aadd67c5a28422c38e064f8ad600d65560cd199860b782\": rpc error: code = NotFound desc = could not find container \"a167a50db94270b190aadd67c5a28422c38e064f8ad600d65560cd199860b782\": container with ID starting with a167a50db94270b190aadd67c5a28422c38e064f8ad600d65560cd199860b782 not found: ID does not exist" Dec 02 09:47:43 crc kubenswrapper[4781]: I1202 09:47:43.096394 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 09:47:43 crc kubenswrapper[4781]: I1202 09:47:43.109258 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 09:47:43 crc kubenswrapper[4781]: I1202 09:47:43.119382 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 09:47:43 crc kubenswrapper[4781]: E1202 09:47:43.119888 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1" containerName="extract-content" Dec 02 09:47:43 crc kubenswrapper[4781]: I1202 09:47:43.119912 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1" containerName="extract-content" Dec 02 09:47:43 crc kubenswrapper[4781]: E1202 09:47:43.119944 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1" containerName="registry-server" Dec 02 09:47:43 crc kubenswrapper[4781]: I1202 09:47:43.119954 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1" containerName="registry-server" Dec 02 09:47:43 crc kubenswrapper[4781]: E1202 09:47:43.119984 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1" containerName="extract-utilities" 
Dec 02 09:47:43 crc kubenswrapper[4781]: I1202 09:47:43.119993 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1" containerName="extract-utilities" Dec 02 09:47:43 crc kubenswrapper[4781]: E1202 09:47:43.120010 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="284c6c26-76ca-4800-b40f-51528de0c015" containerName="kube-state-metrics" Dec 02 09:47:43 crc kubenswrapper[4781]: I1202 09:47:43.120018 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="284c6c26-76ca-4800-b40f-51528de0c015" containerName="kube-state-metrics" Dec 02 09:47:43 crc kubenswrapper[4781]: I1202 09:47:43.120236 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdfa91ac-8e30-4273-bff8-2c7d8a5ed9e1" containerName="registry-server" Dec 02 09:47:43 crc kubenswrapper[4781]: I1202 09:47:43.120268 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="284c6c26-76ca-4800-b40f-51528de0c015" containerName="kube-state-metrics" Dec 02 09:47:43 crc kubenswrapper[4781]: I1202 09:47:43.121065 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 02 09:47:43 crc kubenswrapper[4781]: I1202 09:47:43.130801 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 09:47:43 crc kubenswrapper[4781]: I1202 09:47:43.134625 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 02 09:47:43 crc kubenswrapper[4781]: I1202 09:47:43.134785 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 02 09:47:43 crc kubenswrapper[4781]: I1202 09:47:43.211852 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knjf6\" (UniqueName: \"kubernetes.io/projected/2546f353-d520-44ea-8040-c41223665f1f-kube-api-access-knjf6\") pod \"kube-state-metrics-0\" (UID: \"2546f353-d520-44ea-8040-c41223665f1f\") " pod="openstack/kube-state-metrics-0" Dec 02 09:47:43 crc kubenswrapper[4781]: I1202 09:47:43.211936 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2546f353-d520-44ea-8040-c41223665f1f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"2546f353-d520-44ea-8040-c41223665f1f\") " pod="openstack/kube-state-metrics-0" Dec 02 09:47:43 crc kubenswrapper[4781]: I1202 09:47:43.211980 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/2546f353-d520-44ea-8040-c41223665f1f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"2546f353-d520-44ea-8040-c41223665f1f\") " pod="openstack/kube-state-metrics-0" Dec 02 09:47:43 crc kubenswrapper[4781]: I1202 09:47:43.212003 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/2546f353-d520-44ea-8040-c41223665f1f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"2546f353-d520-44ea-8040-c41223665f1f\") " pod="openstack/kube-state-metrics-0" Dec 02 09:47:43 crc kubenswrapper[4781]: I1202 09:47:43.313729 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2546f353-d520-44ea-8040-c41223665f1f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"2546f353-d520-44ea-8040-c41223665f1f\") " pod="openstack/kube-state-metrics-0" Dec 02 09:47:43 crc kubenswrapper[4781]: I1202 09:47:43.314135 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/2546f353-d520-44ea-8040-c41223665f1f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"2546f353-d520-44ea-8040-c41223665f1f\") " pod="openstack/kube-state-metrics-0" Dec 02 09:47:43 crc kubenswrapper[4781]: I1202 09:47:43.314165 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/2546f353-d520-44ea-8040-c41223665f1f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"2546f353-d520-44ea-8040-c41223665f1f\") " pod="openstack/kube-state-metrics-0" Dec 02 09:47:43 crc kubenswrapper[4781]: I1202 09:47:43.314330 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knjf6\" (UniqueName: \"kubernetes.io/projected/2546f353-d520-44ea-8040-c41223665f1f-kube-api-access-knjf6\") pod \"kube-state-metrics-0\" (UID: \"2546f353-d520-44ea-8040-c41223665f1f\") " pod="openstack/kube-state-metrics-0" Dec 02 09:47:43 crc kubenswrapper[4781]: I1202 09:47:43.320716 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/2546f353-d520-44ea-8040-c41223665f1f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"2546f353-d520-44ea-8040-c41223665f1f\") " pod="openstack/kube-state-metrics-0" Dec 02 09:47:43 crc kubenswrapper[4781]: I1202 09:47:43.320803 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/2546f353-d520-44ea-8040-c41223665f1f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"2546f353-d520-44ea-8040-c41223665f1f\") " pod="openstack/kube-state-metrics-0" Dec 02 09:47:43 crc kubenswrapper[4781]: I1202 09:47:43.321357 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2546f353-d520-44ea-8040-c41223665f1f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"2546f353-d520-44ea-8040-c41223665f1f\") " pod="openstack/kube-state-metrics-0" Dec 02 09:47:43 crc kubenswrapper[4781]: I1202 09:47:43.345663 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knjf6\" (UniqueName: \"kubernetes.io/projected/2546f353-d520-44ea-8040-c41223665f1f-kube-api-access-knjf6\") pod \"kube-state-metrics-0\" (UID: \"2546f353-d520-44ea-8040-c41223665f1f\") " pod="openstack/kube-state-metrics-0" Dec 02 09:47:43 crc kubenswrapper[4781]: I1202 09:47:43.451047 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 02 09:47:43 crc kubenswrapper[4781]: I1202 09:47:43.517382 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="284c6c26-76ca-4800-b40f-51528de0c015" path="/var/lib/kubelet/pods/284c6c26-76ca-4800-b40f-51528de0c015/volumes" Dec 02 09:47:43 crc kubenswrapper[4781]: I1202 09:47:43.919778 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 09:47:43 crc kubenswrapper[4781]: I1202 09:47:43.941445 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 09:47:43 crc kubenswrapper[4781]: I1202 09:47:43.942406 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2f695df8-9072-4528-891a-78e212a41af6" containerName="proxy-httpd" containerID="cri-o://e83d966537f5027fcedea16bcd5203bfe0c7321ba4c7f0f64f9bbc8cff4124ef" gracePeriod=30 Dec 02 09:47:43 crc kubenswrapper[4781]: I1202 09:47:43.942588 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2f695df8-9072-4528-891a-78e212a41af6" containerName="sg-core" containerID="cri-o://0cb7c95978209aeffebb9731c513878b8ce90cc6f16d59fa6bfcd78ee73d4af4" gracePeriod=30 Dec 02 09:47:43 crc kubenswrapper[4781]: I1202 09:47:43.941912 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2f695df8-9072-4528-891a-78e212a41af6" containerName="ceilometer-central-agent" containerID="cri-o://58170e2dbf418d5cf29396500c9d93623007dc60a7a701597b03f9c102048e26" gracePeriod=30 Dec 02 09:47:43 crc kubenswrapper[4781]: I1202 09:47:43.942733 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2f695df8-9072-4528-891a-78e212a41af6" containerName="ceilometer-notification-agent" containerID="cri-o://2a54635b26f067dc174fe21eb06cef545561aa10ac6a76511f58cce7f9584c06" gracePeriod=30 Dec 02 09:47:44 crc kubenswrapper[4781]: I1202 09:47:44.079375 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2546f353-d520-44ea-8040-c41223665f1f","Type":"ContainerStarted","Data":"57fa478790aea57798f590458fbc56ffeca016856cd75936f07bc3c8b8d91860"} Dec 02 09:47:44 crc kubenswrapper[4781]: I1202 09:47:44.082111 4781 generic.go:334] "Generic (PLEG): container finished" podID="2f695df8-9072-4528-891a-78e212a41af6" containerID="0cb7c95978209aeffebb9731c513878b8ce90cc6f16d59fa6bfcd78ee73d4af4" exitCode=2 Dec 02 09:47:44 crc kubenswrapper[4781]: I1202 09:47:44.082259 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f695df8-9072-4528-891a-78e212a41af6","Type":"ContainerDied","Data":"0cb7c95978209aeffebb9731c513878b8ce90cc6f16d59fa6bfcd78ee73d4af4"} Dec 02 09:47:44 crc kubenswrapper[4781]: I1202 09:47:44.297017 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 02 09:47:44 crc kubenswrapper[4781]: I1202 09:47:44.336771 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 02 09:47:45 crc kubenswrapper[4781]: I1202 09:47:45.093085 4781 generic.go:334] "Generic (PLEG): container finished" podID="2f695df8-9072-4528-891a-78e212a41af6" containerID="e83d966537f5027fcedea16bcd5203bfe0c7321ba4c7f0f64f9bbc8cff4124ef" exitCode=0 Dec 02 09:47:45 crc kubenswrapper[4781]: I1202 09:47:45.093384 4781 
generic.go:334] "Generic (PLEG): container finished" podID="2f695df8-9072-4528-891a-78e212a41af6" containerID="58170e2dbf418d5cf29396500c9d93623007dc60a7a701597b03f9c102048e26" exitCode=0 Dec 02 09:47:45 crc kubenswrapper[4781]: I1202 09:47:45.093152 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f695df8-9072-4528-891a-78e212a41af6","Type":"ContainerDied","Data":"e83d966537f5027fcedea16bcd5203bfe0c7321ba4c7f0f64f9bbc8cff4124ef"} Dec 02 09:47:45 crc kubenswrapper[4781]: I1202 09:47:45.093454 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f695df8-9072-4528-891a-78e212a41af6","Type":"ContainerDied","Data":"58170e2dbf418d5cf29396500c9d93623007dc60a7a701597b03f9c102048e26"} Dec 02 09:47:45 crc kubenswrapper[4781]: I1202 09:47:45.095475 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2546f353-d520-44ea-8040-c41223665f1f","Type":"ContainerStarted","Data":"d0657d235d3c0932e31372dedc642c3632e3e1ea6538e2b360c2fa572466f1d1"} Dec 02 09:47:45 crc kubenswrapper[4781]: I1202 09:47:45.095664 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 02 09:47:45 crc kubenswrapper[4781]: I1202 09:47:45.120893 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.726637835 podStartE2EDuration="2.120869904s" podCreationTimestamp="2025-12-02 09:47:43 +0000 UTC" firstStartedPulling="2025-12-02 09:47:43.925116297 +0000 UTC m=+1626.748990176" lastFinishedPulling="2025-12-02 09:47:44.319348356 +0000 UTC m=+1627.143222245" observedRunningTime="2025-12-02 09:47:45.110253979 +0000 UTC m=+1627.934127868" watchObservedRunningTime="2025-12-02 09:47:45.120869904 +0000 UTC m=+1627.944743793" Dec 02 09:47:45 crc kubenswrapper[4781]: I1202 09:47:45.132581 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 02 09:47:45 crc kubenswrapper[4781]: I1202 09:47:45.348811 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 09:47:45 crc kubenswrapper[4781]: I1202 09:47:45.348863 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 09:47:46 crc kubenswrapper[4781]: I1202 09:47:46.431236 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.195:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 09:47:46 crc kubenswrapper[4781]: I1202 09:47:46.431747 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.195:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 09:47:46 crc kubenswrapper[4781]: I1202 09:47:46.589282 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 02 09:47:49 crc kubenswrapper[4781]: I1202 09:47:49.792509 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 09:47:49 crc kubenswrapper[4781]: I1202 09:47:49.945625 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f695df8-9072-4528-891a-78e212a41af6-log-httpd\") pod \"2f695df8-9072-4528-891a-78e212a41af6\" (UID: \"2f695df8-9072-4528-891a-78e212a41af6\") " Dec 02 09:47:49 crc kubenswrapper[4781]: I1202 09:47:49.945723 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f695df8-9072-4528-891a-78e212a41af6-scripts\") pod \"2f695df8-9072-4528-891a-78e212a41af6\" (UID: \"2f695df8-9072-4528-891a-78e212a41af6\") " Dec 02 09:47:49 crc kubenswrapper[4781]: I1202 09:47:49.945792 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj7c9\" (UniqueName: \"kubernetes.io/projected/2f695df8-9072-4528-891a-78e212a41af6-kube-api-access-pj7c9\") pod \"2f695df8-9072-4528-891a-78e212a41af6\" (UID: \"2f695df8-9072-4528-891a-78e212a41af6\") " Dec 02 09:47:49 crc kubenswrapper[4781]: I1202 09:47:49.945896 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2f695df8-9072-4528-891a-78e212a41af6-sg-core-conf-yaml\") pod \"2f695df8-9072-4528-891a-78e212a41af6\" (UID: \"2f695df8-9072-4528-891a-78e212a41af6\") " Dec 02 09:47:49 crc kubenswrapper[4781]: I1202 09:47:49.945954 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f695df8-9072-4528-891a-78e212a41af6-combined-ca-bundle\") pod \"2f695df8-9072-4528-891a-78e212a41af6\" (UID: \"2f695df8-9072-4528-891a-78e212a41af6\") " Dec 02 09:47:49 crc kubenswrapper[4781]: I1202 09:47:49.945988 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f695df8-9072-4528-891a-78e212a41af6-config-data\") pod \"2f695df8-9072-4528-891a-78e212a41af6\" (UID: \"2f695df8-9072-4528-891a-78e212a41af6\") " Dec 02 09:47:49 crc kubenswrapper[4781]: I1202 09:47:49.946045 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f695df8-9072-4528-891a-78e212a41af6-run-httpd\") pod \"2f695df8-9072-4528-891a-78e212a41af6\" (UID: \"2f695df8-9072-4528-891a-78e212a41af6\") " Dec 02 09:47:49 crc kubenswrapper[4781]: I1202 09:47:49.946895 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f695df8-9072-4528-891a-78e212a41af6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2f695df8-9072-4528-891a-78e212a41af6" (UID: "2f695df8-9072-4528-891a-78e212a41af6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:47:49 crc kubenswrapper[4781]: I1202 09:47:49.947178 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f695df8-9072-4528-891a-78e212a41af6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2f695df8-9072-4528-891a-78e212a41af6" (UID: "2f695df8-9072-4528-891a-78e212a41af6"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:47:49 crc kubenswrapper[4781]: I1202 09:47:49.954165 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f695df8-9072-4528-891a-78e212a41af6-scripts" (OuterVolumeSpecName: "scripts") pod "2f695df8-9072-4528-891a-78e212a41af6" (UID: "2f695df8-9072-4528-891a-78e212a41af6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:47:49 crc kubenswrapper[4781]: I1202 09:47:49.954188 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f695df8-9072-4528-891a-78e212a41af6-kube-api-access-pj7c9" (OuterVolumeSpecName: "kube-api-access-pj7c9") pod "2f695df8-9072-4528-891a-78e212a41af6" (UID: "2f695df8-9072-4528-891a-78e212a41af6"). InnerVolumeSpecName "kube-api-access-pj7c9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:47:49 crc kubenswrapper[4781]: I1202 09:47:49.978020 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f695df8-9072-4528-891a-78e212a41af6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2f695df8-9072-4528-891a-78e212a41af6" (UID: "2f695df8-9072-4528-891a-78e212a41af6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.018685 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f695df8-9072-4528-891a-78e212a41af6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f695df8-9072-4528-891a-78e212a41af6" (UID: "2f695df8-9072-4528-891a-78e212a41af6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.047860 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f695df8-9072-4528-891a-78e212a41af6-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.047896 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj7c9\" (UniqueName: \"kubernetes.io/projected/2f695df8-9072-4528-891a-78e212a41af6-kube-api-access-pj7c9\") on node \"crc\" DevicePath \"\"" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.047908 4781 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2f695df8-9072-4528-891a-78e212a41af6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.047932 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f695df8-9072-4528-891a-78e212a41af6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.047944 4781 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f695df8-9072-4528-891a-78e212a41af6-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.047954 4781 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f695df8-9072-4528-891a-78e212a41af6-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.050000 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/2f695df8-9072-4528-891a-78e212a41af6-config-data" (OuterVolumeSpecName: "config-data") pod "2f695df8-9072-4528-891a-78e212a41af6" (UID: "2f695df8-9072-4528-891a-78e212a41af6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.146227 4781 generic.go:334] "Generic (PLEG): container finished" podID="2f695df8-9072-4528-891a-78e212a41af6" containerID="2a54635b26f067dc174fe21eb06cef545561aa10ac6a76511f58cce7f9584c06" exitCode=0 Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.146275 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f695df8-9072-4528-891a-78e212a41af6","Type":"ContainerDied","Data":"2a54635b26f067dc174fe21eb06cef545561aa10ac6a76511f58cce7f9584c06"} Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.146311 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f695df8-9072-4528-891a-78e212a41af6","Type":"ContainerDied","Data":"4e28d630ed8d8741790a7f07941812017e9516bd93973592045a4d3216e873be"} Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.146336 4781 scope.go:117] "RemoveContainer" containerID="e83d966537f5027fcedea16bcd5203bfe0c7321ba4c7f0f64f9bbc8cff4124ef" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.146413 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.150140 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f695df8-9072-4528-891a-78e212a41af6-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.181513 4781 scope.go:117] "RemoveContainer" containerID="0cb7c95978209aeffebb9731c513878b8ce90cc6f16d59fa6bfcd78ee73d4af4" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.187188 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.208769 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.213464 4781 scope.go:117] "RemoveContainer" containerID="2a54635b26f067dc174fe21eb06cef545561aa10ac6a76511f58cce7f9584c06" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.222704 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 09:47:50 crc kubenswrapper[4781]: E1202 09:47:50.223235 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f695df8-9072-4528-891a-78e212a41af6" containerName="proxy-httpd" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.223256 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f695df8-9072-4528-891a-78e212a41af6" containerName="proxy-httpd" Dec 02 09:47:50 crc kubenswrapper[4781]: E1202 09:47:50.223288 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f695df8-9072-4528-891a-78e212a41af6" containerName="ceilometer-notification-agent" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.223295 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f695df8-9072-4528-891a-78e212a41af6" containerName="ceilometer-notification-agent" Dec 02 09:47:50 crc kubenswrapper[4781]: E1202 09:47:50.223310 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f695df8-9072-4528-891a-78e212a41af6" 
containerName="sg-core" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.223319 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f695df8-9072-4528-891a-78e212a41af6" containerName="sg-core" Dec 02 09:47:50 crc kubenswrapper[4781]: E1202 09:47:50.223338 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f695df8-9072-4528-891a-78e212a41af6" containerName="ceilometer-central-agent" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.223344 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f695df8-9072-4528-891a-78e212a41af6" containerName="ceilometer-central-agent" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.223558 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f695df8-9072-4528-891a-78e212a41af6" containerName="sg-core" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.223583 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f695df8-9072-4528-891a-78e212a41af6" containerName="ceilometer-central-agent" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.223595 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f695df8-9072-4528-891a-78e212a41af6" containerName="proxy-httpd" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.223607 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f695df8-9072-4528-891a-78e212a41af6" containerName="ceilometer-notification-agent" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.225580 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.228610 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.228832 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.232731 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.237417 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.255541 4781 scope.go:117] "RemoveContainer" containerID="58170e2dbf418d5cf29396500c9d93623007dc60a7a701597b03f9c102048e26" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.282634 4781 scope.go:117] "RemoveContainer" containerID="e83d966537f5027fcedea16bcd5203bfe0c7321ba4c7f0f64f9bbc8cff4124ef" Dec 02 09:47:50 crc kubenswrapper[4781]: E1202 09:47:50.283189 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e83d966537f5027fcedea16bcd5203bfe0c7321ba4c7f0f64f9bbc8cff4124ef\": container with ID starting with e83d966537f5027fcedea16bcd5203bfe0c7321ba4c7f0f64f9bbc8cff4124ef not found: ID does not exist" containerID="e83d966537f5027fcedea16bcd5203bfe0c7321ba4c7f0f64f9bbc8cff4124ef" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.283239 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e83d966537f5027fcedea16bcd5203bfe0c7321ba4c7f0f64f9bbc8cff4124ef"} err="failed to get container status \"e83d966537f5027fcedea16bcd5203bfe0c7321ba4c7f0f64f9bbc8cff4124ef\": rpc error: code = NotFound desc = could not find container \"e83d966537f5027fcedea16bcd5203bfe0c7321ba4c7f0f64f9bbc8cff4124ef\": 
container with ID starting with e83d966537f5027fcedea16bcd5203bfe0c7321ba4c7f0f64f9bbc8cff4124ef not found: ID does not exist" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.283276 4781 scope.go:117] "RemoveContainer" containerID="0cb7c95978209aeffebb9731c513878b8ce90cc6f16d59fa6bfcd78ee73d4af4" Dec 02 09:47:50 crc kubenswrapper[4781]: E1202 09:47:50.283750 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cb7c95978209aeffebb9731c513878b8ce90cc6f16d59fa6bfcd78ee73d4af4\": container with ID starting with 0cb7c95978209aeffebb9731c513878b8ce90cc6f16d59fa6bfcd78ee73d4af4 not found: ID does not exist" containerID="0cb7c95978209aeffebb9731c513878b8ce90cc6f16d59fa6bfcd78ee73d4af4" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.283817 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cb7c95978209aeffebb9731c513878b8ce90cc6f16d59fa6bfcd78ee73d4af4"} err="failed to get container status \"0cb7c95978209aeffebb9731c513878b8ce90cc6f16d59fa6bfcd78ee73d4af4\": rpc error: code = NotFound desc = could not find container \"0cb7c95978209aeffebb9731c513878b8ce90cc6f16d59fa6bfcd78ee73d4af4\": container with ID starting with 0cb7c95978209aeffebb9731c513878b8ce90cc6f16d59fa6bfcd78ee73d4af4 not found: ID does not exist" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.283837 4781 scope.go:117] "RemoveContainer" containerID="2a54635b26f067dc174fe21eb06cef545561aa10ac6a76511f58cce7f9584c06" Dec 02 09:47:50 crc kubenswrapper[4781]: E1202 09:47:50.284269 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a54635b26f067dc174fe21eb06cef545561aa10ac6a76511f58cce7f9584c06\": container with ID starting with 2a54635b26f067dc174fe21eb06cef545561aa10ac6a76511f58cce7f9584c06 not found: ID does not exist" containerID="2a54635b26f067dc174fe21eb06cef545561aa10ac6a76511f58cce7f9584c06" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.284297 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a54635b26f067dc174fe21eb06cef545561aa10ac6a76511f58cce7f9584c06"} err="failed to get container status \"2a54635b26f067dc174fe21eb06cef545561aa10ac6a76511f58cce7f9584c06\": rpc error: code = NotFound desc = could not find container \"2a54635b26f067dc174fe21eb06cef545561aa10ac6a76511f58cce7f9584c06\": container with ID starting with 2a54635b26f067dc174fe21eb06cef545561aa10ac6a76511f58cce7f9584c06 not found: ID does not exist" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.284315 4781 scope.go:117] "RemoveContainer" containerID="58170e2dbf418d5cf29396500c9d93623007dc60a7a701597b03f9c102048e26" Dec 02 09:47:50 crc kubenswrapper[4781]: E1202 09:47:50.284646 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58170e2dbf418d5cf29396500c9d93623007dc60a7a701597b03f9c102048e26\": container with ID starting with 58170e2dbf418d5cf29396500c9d93623007dc60a7a701597b03f9c102048e26 not found: ID does not exist" containerID="58170e2dbf418d5cf29396500c9d93623007dc60a7a701597b03f9c102048e26" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.284681 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58170e2dbf418d5cf29396500c9d93623007dc60a7a701597b03f9c102048e26"} err="failed to get container status 
\"58170e2dbf418d5cf29396500c9d93623007dc60a7a701597b03f9c102048e26\": rpc error: code = NotFound desc = could not find container \"58170e2dbf418d5cf29396500c9d93623007dc60a7a701597b03f9c102048e26\": container with ID starting with 58170e2dbf418d5cf29396500c9d93623007dc60a7a701597b03f9c102048e26 not found: ID does not exist" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.353774 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/472a52b0-7b96-407f-b36c-7e8c92c76bac-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"472a52b0-7b96-407f-b36c-7e8c92c76bac\") " pod="openstack/ceilometer-0" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.353842 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/472a52b0-7b96-407f-b36c-7e8c92c76bac-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"472a52b0-7b96-407f-b36c-7e8c92c76bac\") " pod="openstack/ceilometer-0" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.353868 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/472a52b0-7b96-407f-b36c-7e8c92c76bac-scripts\") pod \"ceilometer-0\" (UID: \"472a52b0-7b96-407f-b36c-7e8c92c76bac\") " pod="openstack/ceilometer-0" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.353899 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/472a52b0-7b96-407f-b36c-7e8c92c76bac-run-httpd\") pod \"ceilometer-0\" (UID: \"472a52b0-7b96-407f-b36c-7e8c92c76bac\") " pod="openstack/ceilometer-0" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.353945 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/472a52b0-7b96-407f-b36c-7e8c92c76bac-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"472a52b0-7b96-407f-b36c-7e8c92c76bac\") " pod="openstack/ceilometer-0" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.354000 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/472a52b0-7b96-407f-b36c-7e8c92c76bac-config-data\") pod \"ceilometer-0\" (UID: \"472a52b0-7b96-407f-b36c-7e8c92c76bac\") " pod="openstack/ceilometer-0" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.354049 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/472a52b0-7b96-407f-b36c-7e8c92c76bac-log-httpd\") pod \"ceilometer-0\" (UID: \"472a52b0-7b96-407f-b36c-7e8c92c76bac\") " pod="openstack/ceilometer-0" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.354068 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fssw\" (UniqueName: \"kubernetes.io/projected/472a52b0-7b96-407f-b36c-7e8c92c76bac-kube-api-access-7fssw\") pod \"ceilometer-0\" (UID: \"472a52b0-7b96-407f-b36c-7e8c92c76bac\") " pod="openstack/ceilometer-0" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.456446 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/472a52b0-7b96-407f-b36c-7e8c92c76bac-log-httpd\") pod \"ceilometer-0\" (UID: \"472a52b0-7b96-407f-b36c-7e8c92c76bac\") " pod="openstack/ceilometer-0" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.456511 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fssw\" (UniqueName: \"kubernetes.io/projected/472a52b0-7b96-407f-b36c-7e8c92c76bac-kube-api-access-7fssw\") pod \"ceilometer-0\" (UID: \"472a52b0-7b96-407f-b36c-7e8c92c76bac\") " pod="openstack/ceilometer-0" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.456609 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/472a52b0-7b96-407f-b36c-7e8c92c76bac-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"472a52b0-7b96-407f-b36c-7e8c92c76bac\") " pod="openstack/ceilometer-0" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.456664 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/472a52b0-7b96-407f-b36c-7e8c92c76bac-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"472a52b0-7b96-407f-b36c-7e8c92c76bac\") " pod="openstack/ceilometer-0" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.456711 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/472a52b0-7b96-407f-b36c-7e8c92c76bac-scripts\") pod \"ceilometer-0\" (UID: \"472a52b0-7b96-407f-b36c-7e8c92c76bac\") " pod="openstack/ceilometer-0" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.457359 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/472a52b0-7b96-407f-b36c-7e8c92c76bac-log-httpd\") pod \"ceilometer-0\" (UID: \"472a52b0-7b96-407f-b36c-7e8c92c76bac\") " pod="openstack/ceilometer-0" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.457404 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/472a52b0-7b96-407f-b36c-7e8c92c76bac-run-httpd\") pod \"ceilometer-0\" (UID: \"472a52b0-7b96-407f-b36c-7e8c92c76bac\") " pod="openstack/ceilometer-0" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.457637 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/472a52b0-7b96-407f-b36c-7e8c92c76bac-run-httpd\") pod \"ceilometer-0\" (UID: \"472a52b0-7b96-407f-b36c-7e8c92c76bac\") " pod="openstack/ceilometer-0" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.458213 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/472a52b0-7b96-407f-b36c-7e8c92c76bac-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"472a52b0-7b96-407f-b36c-7e8c92c76bac\") " pod="openstack/ceilometer-0" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.458320 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/472a52b0-7b96-407f-b36c-7e8c92c76bac-config-data\") pod \"ceilometer-0\" (UID: \"472a52b0-7b96-407f-b36c-7e8c92c76bac\") " pod="openstack/ceilometer-0" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.461485 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/472a52b0-7b96-407f-b36c-7e8c92c76bac-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"472a52b0-7b96-407f-b36c-7e8c92c76bac\") " pod="openstack/ceilometer-0" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.461571 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/472a52b0-7b96-407f-b36c-7e8c92c76bac-scripts\") pod \"ceilometer-0\" (UID: \"472a52b0-7b96-407f-b36c-7e8c92c76bac\") " pod="openstack/ceilometer-0" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.461881 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/472a52b0-7b96-407f-b36c-7e8c92c76bac-config-data\") pod \"ceilometer-0\" (UID: \"472a52b0-7b96-407f-b36c-7e8c92c76bac\") " pod="openstack/ceilometer-0" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.462305 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/472a52b0-7b96-407f-b36c-7e8c92c76bac-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"472a52b0-7b96-407f-b36c-7e8c92c76bac\") " pod="openstack/ceilometer-0" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.463720 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/472a52b0-7b96-407f-b36c-7e8c92c76bac-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"472a52b0-7b96-407f-b36c-7e8c92c76bac\") " pod="openstack/ceilometer-0" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.477070 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fssw\" (UniqueName: \"kubernetes.io/projected/472a52b0-7b96-407f-b36c-7e8c92c76bac-kube-api-access-7fssw\") pod \"ceilometer-0\" (UID: \"472a52b0-7b96-407f-b36c-7e8c92c76bac\") " pod="openstack/ceilometer-0" Dec 02 09:47:50 crc kubenswrapper[4781]: I1202 09:47:50.550811 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 09:47:51 crc kubenswrapper[4781]: W1202 09:47:50.999803 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod472a52b0_7b96_407f_b36c_7e8c92c76bac.slice/crio-70827a8fd73b8ae5439bcd78126055c66aa1fa802bc7a938adf6e3d5346427fa WatchSource:0}: Error finding container 70827a8fd73b8ae5439bcd78126055c66aa1fa802bc7a938adf6e3d5346427fa: Status 404 returned error can't find the container with id 70827a8fd73b8ae5439bcd78126055c66aa1fa802bc7a938adf6e3d5346427fa Dec 02 09:47:51 crc kubenswrapper[4781]: I1202 09:47:51.011000 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 09:47:51 crc kubenswrapper[4781]: I1202 09:47:51.155539 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"472a52b0-7b96-407f-b36c-7e8c92c76bac","Type":"ContainerStarted","Data":"70827a8fd73b8ae5439bcd78126055c66aa1fa802bc7a938adf6e3d5346427fa"} Dec 02 09:47:51 crc kubenswrapper[4781]: I1202 09:47:51.517556 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f695df8-9072-4528-891a-78e212a41af6" path="/var/lib/kubelet/pods/2f695df8-9072-4528-891a-78e212a41af6/volumes" Dec 02 09:47:52 crc kubenswrapper[4781]: I1202 09:47:52.179595 4781 generic.go:334] "Generic (PLEG): container finished" podID="16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a" containerID="97b9ed60bb35c6c89587bac81f51c63e7a90c5f69f28dd0c0e327305537db1be" exitCode=137 Dec 02 09:47:52 crc kubenswrapper[4781]: I1202 09:47:52.179676 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a","Type":"ContainerDied","Data":"97b9ed60bb35c6c89587bac81f51c63e7a90c5f69f28dd0c0e327305537db1be"} Dec 02 09:47:52 crc kubenswrapper[4781]: I1202 09:47:52.180086 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a","Type":"ContainerDied","Data":"fcf2d1e2eee66f7ba52ada4d1aca41296fa4d5cb12bff4e270dd806aabe38a57"} Dec 02 09:47:52 crc kubenswrapper[4781]: I1202 09:47:52.180103 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcf2d1e2eee66f7ba52ada4d1aca41296fa4d5cb12bff4e270dd806aabe38a57" Dec 02 09:47:52 crc kubenswrapper[4781]: I1202 09:47:52.181607 4781 generic.go:334] "Generic (PLEG): container finished" podID="c59b1260-94b3-4f66-942d-d215e1c7715c" containerID="810b05d2e8b4be8ecf90799f992b8146399c1c08ece6e23e5d12240846e828f2" exitCode=137 Dec 02 09:47:52 crc kubenswrapper[4781]: I1202 09:47:52.181633 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"c59b1260-94b3-4f66-942d-d215e1c7715c","Type":"ContainerDied","Data":"810b05d2e8b4be8ecf90799f992b8146399c1c08ece6e23e5d12240846e828f2"} Dec 02 09:47:52 crc kubenswrapper[4781]: I1202 09:47:52.222726 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 09:47:52 crc kubenswrapper[4781]: I1202 09:47:52.250713 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 02 09:47:52 crc kubenswrapper[4781]: I1202 09:47:52.392420 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c59b1260-94b3-4f66-942d-d215e1c7715c-config-data\") pod \"c59b1260-94b3-4f66-942d-d215e1c7715c\" (UID: \"c59b1260-94b3-4f66-942d-d215e1c7715c\") " Dec 02 09:47:52 crc kubenswrapper[4781]: I1202 09:47:52.392489 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a-config-data\") pod \"16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a\" (UID: \"16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a\") " Dec 02 09:47:52 crc kubenswrapper[4781]: I1202 09:47:52.392508 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a-combined-ca-bundle\") pod \"16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a\" (UID: \"16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a\") " Dec 02 09:47:52 crc kubenswrapper[4781]: I1202 09:47:52.392601 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hg8k\" (UniqueName: \"kubernetes.io/projected/c59b1260-94b3-4f66-942d-d215e1c7715c-kube-api-access-6hg8k\") pod \"c59b1260-94b3-4f66-942d-d215e1c7715c\" (UID: \"c59b1260-94b3-4f66-942d-d215e1c7715c\") " Dec 02 09:47:52 crc kubenswrapper[4781]: I1202 09:47:52.392630 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c59b1260-94b3-4f66-942d-d215e1c7715c-combined-ca-bundle\") pod \"c59b1260-94b3-4f66-942d-d215e1c7715c\" (UID: \"c59b1260-94b3-4f66-942d-d215e1c7715c\") " Dec 02 09:47:52 crc kubenswrapper[4781]: I1202 09:47:52.392729 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a-logs\") pod \"16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a\" (UID: \"16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a\") " Dec 02 09:47:52 crc kubenswrapper[4781]: I1202 09:47:52.392790 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfr8j\" (UniqueName: \"kubernetes.io/projected/16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a-kube-api-access-qfr8j\") pod \"16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a\" (UID: \"16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a\") " Dec 02 09:47:52 crc kubenswrapper[4781]: I1202 09:47:52.393340 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a-logs" (OuterVolumeSpecName: "logs") pod "16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a" (UID: "16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:47:52 crc kubenswrapper[4781]: I1202 09:47:52.398149 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a-kube-api-access-qfr8j" (OuterVolumeSpecName: "kube-api-access-qfr8j") pod "16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a" (UID: "16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a"). InnerVolumeSpecName "kube-api-access-qfr8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:47:52 crc kubenswrapper[4781]: I1202 09:47:52.398695 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c59b1260-94b3-4f66-942d-d215e1c7715c-kube-api-access-6hg8k" (OuterVolumeSpecName: "kube-api-access-6hg8k") pod "c59b1260-94b3-4f66-942d-d215e1c7715c" (UID: "c59b1260-94b3-4f66-942d-d215e1c7715c"). InnerVolumeSpecName "kube-api-access-6hg8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:47:52 crc kubenswrapper[4781]: I1202 09:47:52.420635 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a" (UID: "16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:47:52 crc kubenswrapper[4781]: I1202 09:47:52.421990 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c59b1260-94b3-4f66-942d-d215e1c7715c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c59b1260-94b3-4f66-942d-d215e1c7715c" (UID: "c59b1260-94b3-4f66-942d-d215e1c7715c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:47:52 crc kubenswrapper[4781]: I1202 09:47:52.427057 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c59b1260-94b3-4f66-942d-d215e1c7715c-config-data" (OuterVolumeSpecName: "config-data") pod "c59b1260-94b3-4f66-942d-d215e1c7715c" (UID: "c59b1260-94b3-4f66-942d-d215e1c7715c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:47:52 crc kubenswrapper[4781]: I1202 09:47:52.431094 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a-config-data" (OuterVolumeSpecName: "config-data") pod "16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a" (UID: "16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:47:52 crc kubenswrapper[4781]: I1202 09:47:52.495134 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a-logs\") on node \"crc\" DevicePath \"\"" Dec 02 09:47:52 crc kubenswrapper[4781]: I1202 09:47:52.495167 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfr8j\" (UniqueName: \"kubernetes.io/projected/16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a-kube-api-access-qfr8j\") on node \"crc\" DevicePath \"\"" Dec 02 09:47:52 crc kubenswrapper[4781]: I1202 09:47:52.495179 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c59b1260-94b3-4f66-942d-d215e1c7715c-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:47:52 crc kubenswrapper[4781]: I1202 09:47:52.495190 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:47:52 crc kubenswrapper[4781]: I1202 09:47:52.495200 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:47:52 crc kubenswrapper[4781]: I1202 09:47:52.495208 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hg8k\" (UniqueName: \"kubernetes.io/projected/c59b1260-94b3-4f66-942d-d215e1c7715c-kube-api-access-6hg8k\") on node \"crc\" DevicePath \"\"" Dec 02 09:47:52 crc kubenswrapper[4781]: I1202 09:47:52.495217 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c59b1260-94b3-4f66-942d-d215e1c7715c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.190949 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"472a52b0-7b96-407f-b36c-7e8c92c76bac","Type":"ContainerStarted","Data":"024bbf683545eb8333abe0203ba1dca85c77f123c8c9df3ac4a2f09f36d3681a"} Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.192804 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"c59b1260-94b3-4f66-942d-d215e1c7715c","Type":"ContainerDied","Data":"cedbbcc1bffdb90b713b804adbf7844254a09d6c42b1ab8b9ae30b98df36285f"} Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.192840 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.192856 4781 scope.go:117] "RemoveContainer" containerID="810b05d2e8b4be8ecf90799f992b8146399c1c08ece6e23e5d12240846e828f2" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.192826 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.237635 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.250340 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.267576 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.279844 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 09:47:53 crc kubenswrapper[4781]: E1202 09:47:53.280354 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a" containerName="nova-metadata-log" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.280380 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a" containerName="nova-metadata-log" Dec 02 09:47:53 crc kubenswrapper[4781]: E1202 09:47:53.280423 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c59b1260-94b3-4f66-942d-d215e1c7715c" containerName="nova-cell1-novncproxy-novncproxy" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.280433 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c59b1260-94b3-4f66-942d-d215e1c7715c" containerName="nova-cell1-novncproxy-novncproxy" Dec 02 09:47:53 crc kubenswrapper[4781]: E1202 09:47:53.280458 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a" containerName="nova-metadata-metadata" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.280468 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a" containerName="nova-metadata-metadata" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.280692 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="c59b1260-94b3-4f66-942d-d215e1c7715c" containerName="nova-cell1-novncproxy-novncproxy" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.280718 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a" containerName="nova-metadata-log" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.280737 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a" containerName="nova-metadata-metadata" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.281474 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.289165 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.290014 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.290302 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.290537 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.299995 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.307588 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.309840 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.315969 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.316475 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.341977 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.413823 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9f84c1d-0c33-4c5a-8e53-a5cf635e4a68-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e9f84c1d-0c33-4c5a-8e53-a5cf635e4a68\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.413871 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a400f59-49d8-4e6d-8a84-24b680b052cf-logs\") pod \"nova-metadata-0\" (UID: \"3a400f59-49d8-4e6d-8a84-24b680b052cf\") " pod="openstack/nova-metadata-0" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.413894 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9f84c1d-0c33-4c5a-8e53-a5cf635e4a68-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e9f84c1d-0c33-4c5a-8e53-a5cf635e4a68\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.413942 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9f84c1d-0c33-4c5a-8e53-a5cf635e4a68-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e9f84c1d-0c33-4c5a-8e53-a5cf635e4a68\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.413973 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3a400f59-49d8-4e6d-8a84-24b680b052cf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3a400f59-49d8-4e6d-8a84-24b680b052cf\") " pod="openstack/nova-metadata-0" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.414051 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4mhw\" (UniqueName: \"kubernetes.io/projected/3a400f59-49d8-4e6d-8a84-24b680b052cf-kube-api-access-x4mhw\") pod \"nova-metadata-0\" (UID: \"3a400f59-49d8-4e6d-8a84-24b680b052cf\") " pod="openstack/nova-metadata-0" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.414071 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9f84c1d-0c33-4c5a-8e53-a5cf635e4a68-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e9f84c1d-0c33-4c5a-8e53-a5cf635e4a68\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.414088 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blpmn\" (UniqueName: \"kubernetes.io/projected/e9f84c1d-0c33-4c5a-8e53-a5cf635e4a68-kube-api-access-blpmn\") pod \"nova-cell1-novncproxy-0\" (UID: \"e9f84c1d-0c33-4c5a-8e53-a5cf635e4a68\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.414174 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a400f59-49d8-4e6d-8a84-24b680b052cf-config-data\") pod \"nova-metadata-0\" (UID: \"3a400f59-49d8-4e6d-8a84-24b680b052cf\") " pod="openstack/nova-metadata-0" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.414255 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a400f59-49d8-4e6d-8a84-24b680b052cf-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3a400f59-49d8-4e6d-8a84-24b680b052cf\") " pod="openstack/nova-metadata-0" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.517519 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9f84c1d-0c33-4c5a-8e53-a5cf635e4a68-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e9f84c1d-0c33-4c5a-8e53-a5cf635e4a68\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.517585 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a400f59-49d8-4e6d-8a84-24b680b052cf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3a400f59-49d8-4e6d-8a84-24b680b052cf\") " pod="openstack/nova-metadata-0" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.517638 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4mhw\" (UniqueName: \"kubernetes.io/projected/3a400f59-49d8-4e6d-8a84-24b680b052cf-kube-api-access-x4mhw\") pod \"nova-metadata-0\" (UID: \"3a400f59-49d8-4e6d-8a84-24b680b052cf\") " pod="openstack/nova-metadata-0" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.517658 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e9f84c1d-0c33-4c5a-8e53-a5cf635e4a68-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e9f84c1d-0c33-4c5a-8e53-a5cf635e4a68\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.517675 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blpmn\" (UniqueName: \"kubernetes.io/projected/e9f84c1d-0c33-4c5a-8e53-a5cf635e4a68-kube-api-access-blpmn\") pod \"nova-cell1-novncproxy-0\" (UID: \"e9f84c1d-0c33-4c5a-8e53-a5cf635e4a68\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.517755 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a400f59-49d8-4e6d-8a84-24b680b052cf-config-data\") pod \"nova-metadata-0\" (UID: \"3a400f59-49d8-4e6d-8a84-24b680b052cf\") " pod="openstack/nova-metadata-0" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.517793 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a400f59-49d8-4e6d-8a84-24b680b052cf-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3a400f59-49d8-4e6d-8a84-24b680b052cf\") " pod="openstack/nova-metadata-0" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.517819 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9f84c1d-0c33-4c5a-8e53-a5cf635e4a68-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e9f84c1d-0c33-4c5a-8e53-a5cf635e4a68\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.517876 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a400f59-49d8-4e6d-8a84-24b680b052cf-logs\") pod \"nova-metadata-0\" (UID: \"3a400f59-49d8-4e6d-8a84-24b680b052cf\") " pod="openstack/nova-metadata-0" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.517898 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9f84c1d-0c33-4c5a-8e53-a5cf635e4a68-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e9f84c1d-0c33-4c5a-8e53-a5cf635e4a68\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.519285 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a400f59-49d8-4e6d-8a84-24b680b052cf-logs\") pod \"nova-metadata-0\" (UID: \"3a400f59-49d8-4e6d-8a84-24b680b052cf\") " pod="openstack/nova-metadata-0" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.523241 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a400f59-49d8-4e6d-8a84-24b680b052cf-config-data\") pod \"nova-metadata-0\" (UID: \"3a400f59-49d8-4e6d-8a84-24b680b052cf\") " pod="openstack/nova-metadata-0" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.523885 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a400f59-49d8-4e6d-8a84-24b680b052cf-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3a400f59-49d8-4e6d-8a84-24b680b052cf\") " pod="openstack/nova-metadata-0" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.527572 4781 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9f84c1d-0c33-4c5a-8e53-a5cf635e4a68-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e9f84c1d-0c33-4c5a-8e53-a5cf635e4a68\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.529506 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9f84c1d-0c33-4c5a-8e53-a5cf635e4a68-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e9f84c1d-0c33-4c5a-8e53-a5cf635e4a68\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.531554 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a" path="/var/lib/kubelet/pods/16a93b44-e8bd-4b5f-bfd9-9c46179c3b8a/volumes" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.532148 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c59b1260-94b3-4f66-942d-d215e1c7715c" path="/var/lib/kubelet/pods/c59b1260-94b3-4f66-942d-d215e1c7715c/volumes" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.533872 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a400f59-49d8-4e6d-8a84-24b680b052cf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3a400f59-49d8-4e6d-8a84-24b680b052cf\") " pod="openstack/nova-metadata-0" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.538070 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9f84c1d-0c33-4c5a-8e53-a5cf635e4a68-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e9f84c1d-0c33-4c5a-8e53-a5cf635e4a68\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.538965 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blpmn\" (UniqueName: \"kubernetes.io/projected/e9f84c1d-0c33-4c5a-8e53-a5cf635e4a68-kube-api-access-blpmn\") pod \"nova-cell1-novncproxy-0\" (UID: \"e9f84c1d-0c33-4c5a-8e53-a5cf635e4a68\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.542462 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9f84c1d-0c33-4c5a-8e53-a5cf635e4a68-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e9f84c1d-0c33-4c5a-8e53-a5cf635e4a68\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.543046 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.549601 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4mhw\" (UniqueName: \"kubernetes.io/projected/3a400f59-49d8-4e6d-8a84-24b680b052cf-kube-api-access-x4mhw\") pod \"nova-metadata-0\" (UID: \"3a400f59-49d8-4e6d-8a84-24b680b052cf\") " pod="openstack/nova-metadata-0" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.610487 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 02 09:47:53 crc kubenswrapper[4781]: I1202 09:47:53.640504 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 09:47:54 crc kubenswrapper[4781]: W1202 09:47:54.110095 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9f84c1d_0c33_4c5a_8e53_a5cf635e4a68.slice/crio-852c62a5c32d47d1c031cc022b15640ab58f6f7ee1ddd935afa1b0f683deb4cf WatchSource:0}: Error finding container 852c62a5c32d47d1c031cc022b15640ab58f6f7ee1ddd935afa1b0f683deb4cf: Status 404 returned error can't find the container with id 852c62a5c32d47d1c031cc022b15640ab58f6f7ee1ddd935afa1b0f683deb4cf Dec 02 09:47:54 crc kubenswrapper[4781]: I1202 09:47:54.110493 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 09:47:54 crc kubenswrapper[4781]: I1202 09:47:54.232106 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 09:47:54 crc kubenswrapper[4781]: I1202 09:47:54.251818 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e9f84c1d-0c33-4c5a-8e53-a5cf635e4a68","Type":"ContainerStarted","Data":"852c62a5c32d47d1c031cc022b15640ab58f6f7ee1ddd935afa1b0f683deb4cf"} Dec 02 09:47:54 crc kubenswrapper[4781]: I1202 09:47:54.254326 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"472a52b0-7b96-407f-b36c-7e8c92c76bac","Type":"ContainerStarted","Data":"0f450970f1446b083f937448b4e2606ec998b88e83260d607b5b09300e66b68e"} Dec 02 09:47:55 crc kubenswrapper[4781]: I1202 09:47:55.266668 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"472a52b0-7b96-407f-b36c-7e8c92c76bac","Type":"ContainerStarted","Data":"0ad8d61bd436aa2e62a5cc83f773ff2b3fcb23ed8cb91bd0d7458158dfa770e5"} Dec 02 09:47:55 crc kubenswrapper[4781]: I1202 09:47:55.269279 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e9f84c1d-0c33-4c5a-8e53-a5cf635e4a68","Type":"ContainerStarted","Data":"1086dce247a72fbae7e51848ee140148cca1fa5048a920eef7c9ac27293dbce4"} Dec 02 09:47:55 crc kubenswrapper[4781]: I1202 09:47:55.271972 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3a400f59-49d8-4e6d-8a84-24b680b052cf","Type":"ContainerStarted","Data":"08a35e39a052bff33bcbe0911370ab32856466d1eb578f0adc35360e7e231e4d"} Dec 02 09:47:55 crc kubenswrapper[4781]: I1202 09:47:55.272037 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3a400f59-49d8-4e6d-8a84-24b680b052cf","Type":"ContainerStarted","Data":"bd092bc04703af445a96d0dc6ffd7109012effad2faa8cdf4308be22814ad920"} Dec 02 09:47:55 crc kubenswrapper[4781]: I1202 09:47:55.272050 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3a400f59-49d8-4e6d-8a84-24b680b052cf","Type":"ContainerStarted","Data":"d6f7e4a728b6bcfde8954710b1b328267e2f8e761fcef8253ab3fff50256b581"} Dec 02 09:47:55 crc kubenswrapper[4781]: I1202 09:47:55.292360 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.292338034 podStartE2EDuration="2.292338034s" podCreationTimestamp="2025-12-02 09:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:47:55.284687397 +0000 UTC m=+1638.108561276" watchObservedRunningTime="2025-12-02 
09:47:55.292338034 +0000 UTC m=+1638.116211903" Dec 02 09:47:55 crc kubenswrapper[4781]: I1202 09:47:55.310766 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.31074596 podStartE2EDuration="2.31074596s" podCreationTimestamp="2025-12-02 09:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:47:55.303022021 +0000 UTC m=+1638.126895900" watchObservedRunningTime="2025-12-02 09:47:55.31074596 +0000 UTC m=+1638.134619839" Dec 02 09:47:55 crc kubenswrapper[4781]: I1202 09:47:55.364834 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 02 09:47:55 crc kubenswrapper[4781]: I1202 09:47:55.365451 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 02 09:47:55 crc kubenswrapper[4781]: I1202 09:47:55.366459 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 02 09:47:55 crc kubenswrapper[4781]: I1202 09:47:55.370284 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 02 09:47:56 crc kubenswrapper[4781]: I1202 09:47:56.281387 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 02 09:47:56 crc kubenswrapper[4781]: I1202 09:47:56.287083 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 02 09:47:56 crc kubenswrapper[4781]: I1202 09:47:56.559485 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-vhxb5"] Dec 02 09:47:56 crc kubenswrapper[4781]: I1202 09:47:56.561362 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-vhxb5" Dec 02 09:47:56 crc kubenswrapper[4781]: I1202 09:47:56.567253 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-vhxb5"] Dec 02 09:47:56 crc kubenswrapper[4781]: I1202 09:47:56.702547 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27151ccc-51de-40b5-8145-6ca697fcf788-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-vhxb5\" (UID: \"27151ccc-51de-40b5-8145-6ca697fcf788\") " pod="openstack/dnsmasq-dns-89c5cd4d5-vhxb5" Dec 02 09:47:56 crc kubenswrapper[4781]: I1202 09:47:56.702630 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27151ccc-51de-40b5-8145-6ca697fcf788-config\") pod \"dnsmasq-dns-89c5cd4d5-vhxb5\" (UID: \"27151ccc-51de-40b5-8145-6ca697fcf788\") " pod="openstack/dnsmasq-dns-89c5cd4d5-vhxb5" Dec 02 09:47:56 crc kubenswrapper[4781]: I1202 09:47:56.702666 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/27151ccc-51de-40b5-8145-6ca697fcf788-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-vhxb5\" (UID: \"27151ccc-51de-40b5-8145-6ca697fcf788\") " pod="openstack/dnsmasq-dns-89c5cd4d5-vhxb5" Dec 02 09:47:56 crc kubenswrapper[4781]: I1202 09:47:56.702713 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27151ccc-51de-40b5-8145-6ca697fcf788-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-vhxb5\" (UID: \"27151ccc-51de-40b5-8145-6ca697fcf788\") " pod="openstack/dnsmasq-dns-89c5cd4d5-vhxb5" Dec 02 09:47:56 crc kubenswrapper[4781]: I1202 09:47:56.702772 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4wsr\" (UniqueName: \"kubernetes.io/projected/27151ccc-51de-40b5-8145-6ca697fcf788-kube-api-access-r4wsr\") pod \"dnsmasq-dns-89c5cd4d5-vhxb5\" (UID: \"27151ccc-51de-40b5-8145-6ca697fcf788\") " pod="openstack/dnsmasq-dns-89c5cd4d5-vhxb5" Dec 02 09:47:56 crc kubenswrapper[4781]: I1202 09:47:56.702841 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/27151ccc-51de-40b5-8145-6ca697fcf788-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-vhxb5\" (UID: \"27151ccc-51de-40b5-8145-6ca697fcf788\") " pod="openstack/dnsmasq-dns-89c5cd4d5-vhxb5" Dec 02 09:47:56 crc kubenswrapper[4781]: I1202 09:47:56.804984 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27151ccc-51de-40b5-8145-6ca697fcf788-config\") pod \"dnsmasq-dns-89c5cd4d5-vhxb5\" (UID: \"27151ccc-51de-40b5-8145-6ca697fcf788\") " pod="openstack/dnsmasq-dns-89c5cd4d5-vhxb5" Dec 02 09:47:56 crc kubenswrapper[4781]: I1202 09:47:56.805042 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/27151ccc-51de-40b5-8145-6ca697fcf788-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-vhxb5\" (UID: \"27151ccc-51de-40b5-8145-6ca697fcf788\") " pod="openstack/dnsmasq-dns-89c5cd4d5-vhxb5" Dec 02 09:47:56 crc kubenswrapper[4781]: I1202 09:47:56.805135 4781 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27151ccc-51de-40b5-8145-6ca697fcf788-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-vhxb5\" (UID: \"27151ccc-51de-40b5-8145-6ca697fcf788\") " pod="openstack/dnsmasq-dns-89c5cd4d5-vhxb5" Dec 02 09:47:56 crc kubenswrapper[4781]: I1202 09:47:56.806551 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27151ccc-51de-40b5-8145-6ca697fcf788-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-vhxb5\" (UID: \"27151ccc-51de-40b5-8145-6ca697fcf788\") " pod="openstack/dnsmasq-dns-89c5cd4d5-vhxb5" Dec 02 09:47:56 crc kubenswrapper[4781]: I1202 09:47:56.806561 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/27151ccc-51de-40b5-8145-6ca697fcf788-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-vhxb5\" (UID: \"27151ccc-51de-40b5-8145-6ca697fcf788\") " pod="openstack/dnsmasq-dns-89c5cd4d5-vhxb5" Dec 02 09:47:56 crc kubenswrapper[4781]: I1202 09:47:56.806583 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27151ccc-51de-40b5-8145-6ca697fcf788-config\") pod \"dnsmasq-dns-89c5cd4d5-vhxb5\" (UID: \"27151ccc-51de-40b5-8145-6ca697fcf788\") " pod="openstack/dnsmasq-dns-89c5cd4d5-vhxb5" Dec 02 09:47:56 crc kubenswrapper[4781]: I1202 09:47:56.805166 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4wsr\" (UniqueName: \"kubernetes.io/projected/27151ccc-51de-40b5-8145-6ca697fcf788-kube-api-access-r4wsr\") pod \"dnsmasq-dns-89c5cd4d5-vhxb5\" (UID: \"27151ccc-51de-40b5-8145-6ca697fcf788\") " pod="openstack/dnsmasq-dns-89c5cd4d5-vhxb5" Dec 02 09:47:56 crc kubenswrapper[4781]: I1202 09:47:56.806819 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/27151ccc-51de-40b5-8145-6ca697fcf788-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-vhxb5\" (UID: \"27151ccc-51de-40b5-8145-6ca697fcf788\") " pod="openstack/dnsmasq-dns-89c5cd4d5-vhxb5" Dec 02 09:47:56 crc kubenswrapper[4781]: I1202 09:47:56.807137 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27151ccc-51de-40b5-8145-6ca697fcf788-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-vhxb5\" (UID: \"27151ccc-51de-40b5-8145-6ca697fcf788\") " pod="openstack/dnsmasq-dns-89c5cd4d5-vhxb5" Dec 02 09:47:56 crc kubenswrapper[4781]: I1202 09:47:56.807818 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27151ccc-51de-40b5-8145-6ca697fcf788-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-vhxb5\" (UID: \"27151ccc-51de-40b5-8145-6ca697fcf788\") " pod="openstack/dnsmasq-dns-89c5cd4d5-vhxb5" Dec 02 09:47:56 crc kubenswrapper[4781]: I1202 09:47:56.808315 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/27151ccc-51de-40b5-8145-6ca697fcf788-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-vhxb5\" (UID: \"27151ccc-51de-40b5-8145-6ca697fcf788\") " pod="openstack/dnsmasq-dns-89c5cd4d5-vhxb5" Dec 02 09:47:56 crc kubenswrapper[4781]: I1202 09:47:56.823069 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4wsr\" (UniqueName: 
\"kubernetes.io/projected/27151ccc-51de-40b5-8145-6ca697fcf788-kube-api-access-r4wsr\") pod \"dnsmasq-dns-89c5cd4d5-vhxb5\" (UID: \"27151ccc-51de-40b5-8145-6ca697fcf788\") " pod="openstack/dnsmasq-dns-89c5cd4d5-vhxb5" Dec 02 09:47:56 crc kubenswrapper[4781]: I1202 09:47:56.882471 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-vhxb5" Dec 02 09:47:57 crc kubenswrapper[4781]: I1202 09:47:57.294912 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"472a52b0-7b96-407f-b36c-7e8c92c76bac","Type":"ContainerStarted","Data":"b0539f61148e24b8a62ccf4da71662962c570e3fe2c906134428f098660f2192"} Dec 02 09:47:57 crc kubenswrapper[4781]: I1202 09:47:57.331422 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.037300349 podStartE2EDuration="7.331402366s" podCreationTimestamp="2025-12-02 09:47:50 +0000 UTC" firstStartedPulling="2025-12-02 09:47:51.002144732 +0000 UTC m=+1633.826018611" lastFinishedPulling="2025-12-02 09:47:56.296246749 +0000 UTC m=+1639.120120628" observedRunningTime="2025-12-02 09:47:57.31488934 +0000 UTC m=+1640.138763219" watchObservedRunningTime="2025-12-02 09:47:57.331402366 +0000 UTC m=+1640.155276255" Dec 02 09:47:57 crc kubenswrapper[4781]: I1202 09:47:57.439473 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-vhxb5"] Dec 02 09:47:58 crc kubenswrapper[4781]: I1202 09:47:58.305035 4781 generic.go:334] "Generic (PLEG): container finished" podID="27151ccc-51de-40b5-8145-6ca697fcf788" containerID="2af5994edb2020cfc3a4dceb1e1abe78aef8c6985c1a0c038a0a3253ba6aff49" exitCode=0 Dec 02 09:47:58 crc kubenswrapper[4781]: I1202 09:47:58.305163 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-vhxb5" event={"ID":"27151ccc-51de-40b5-8145-6ca697fcf788","Type":"ContainerDied","Data":"2af5994edb2020cfc3a4dceb1e1abe78aef8c6985c1a0c038a0a3253ba6aff49"} Dec 02 09:47:58 crc kubenswrapper[4781]: I1202 09:47:58.305966 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-vhxb5" event={"ID":"27151ccc-51de-40b5-8145-6ca697fcf788","Type":"ContainerStarted","Data":"87000a59626b413375965c3170d0fa7556ef16d2fc4fa0aee0f8faab9af53c8e"} Dec 02 09:47:58 crc kubenswrapper[4781]: I1202 09:47:58.306451 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 09:47:58 crc kubenswrapper[4781]: I1202 09:47:58.611787 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 02 09:47:58 crc kubenswrapper[4781]: I1202 09:47:58.641407 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 09:47:58 crc kubenswrapper[4781]: I1202 09:47:58.641455 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 09:47:59 crc kubenswrapper[4781]: I1202 09:47:59.038378 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 09:47:59 crc kubenswrapper[4781]: I1202 09:47:59.322181 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-vhxb5" event={"ID":"27151ccc-51de-40b5-8145-6ca697fcf788","Type":"ContainerStarted","Data":"66336063b7e5a589db076d665175e4e0e57d3b3e1913275c7823af85b847be70"} Dec 02 09:47:59 crc kubenswrapper[4781]: I1202 09:47:59.322278 4781 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-89c5cd4d5-vhxb5" Dec 02 09:47:59 crc kubenswrapper[4781]: I1202 09:47:59.322385 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d" containerName="nova-api-log" containerID="cri-o://1e9c253db3d4499c4421a0be29f78f56f901c48e43e105abff413d540f00a655" gracePeriod=30 Dec 02 09:47:59 crc kubenswrapper[4781]: I1202 09:47:59.322722 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d" containerName="nova-api-api" containerID="cri-o://478cd18c4c1a4759a237a68b2bac6ad5cd6591aa5d48537bf10f8ec5257aec34" gracePeriod=30 Dec 02 09:47:59 crc kubenswrapper[4781]: I1202 09:47:59.345251 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-89c5cd4d5-vhxb5" podStartSLOduration=3.345226489 podStartE2EDuration="3.345226489s" podCreationTimestamp="2025-12-02 09:47:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:47:59.343985226 +0000 UTC m=+1642.167859115" watchObservedRunningTime="2025-12-02 09:47:59.345226489 +0000 UTC m=+1642.169100368" Dec 02 09:48:00 crc kubenswrapper[4781]: I1202 09:48:00.333630 4781 generic.go:334] "Generic (PLEG): container finished" podID="e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d" containerID="1e9c253db3d4499c4421a0be29f78f56f901c48e43e105abff413d540f00a655" exitCode=143 Dec 02 09:48:00 crc kubenswrapper[4781]: I1202 09:48:00.333728 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d","Type":"ContainerDied","Data":"1e9c253db3d4499c4421a0be29f78f56f901c48e43e105abff413d540f00a655"} Dec 02 09:48:00 crc kubenswrapper[4781]: I1202 09:48:00.412255 4781 patch_prober.go:28] interesting pod/machine-config-daemon-pzntm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 09:48:00 crc kubenswrapper[4781]: I1202 09:48:00.412317 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 09:48:00 crc kubenswrapper[4781]: I1202 09:48:00.412367 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" Dec 02 09:48:00 crc kubenswrapper[4781]: I1202 09:48:00.413193 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"311057b514d104f4f4ba1bad8b86e944a9708a86065bf40bd45c3f422ad024ed"} pod="openshift-machine-config-operator/machine-config-daemon-pzntm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 09:48:00 crc kubenswrapper[4781]: I1202 09:48:00.413257 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" 
containerName="machine-config-daemon" containerID="cri-o://311057b514d104f4f4ba1bad8b86e944a9708a86065bf40bd45c3f422ad024ed" gracePeriod=600 Dec 02 09:48:00 crc kubenswrapper[4781]: I1202 09:48:00.873821 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 09:48:00 crc kubenswrapper[4781]: I1202 09:48:00.874165 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="472a52b0-7b96-407f-b36c-7e8c92c76bac" containerName="proxy-httpd" containerID="cri-o://b0539f61148e24b8a62ccf4da71662962c570e3fe2c906134428f098660f2192" gracePeriod=30 Dec 02 09:48:00 crc kubenswrapper[4781]: I1202 09:48:00.874207 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="472a52b0-7b96-407f-b36c-7e8c92c76bac" containerName="ceilometer-notification-agent" containerID="cri-o://0f450970f1446b083f937448b4e2606ec998b88e83260d607b5b09300e66b68e" gracePeriod=30 Dec 02 09:48:00 crc kubenswrapper[4781]: I1202 09:48:00.874174 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="472a52b0-7b96-407f-b36c-7e8c92c76bac" containerName="sg-core" containerID="cri-o://0ad8d61bd436aa2e62a5cc83f773ff2b3fcb23ed8cb91bd0d7458158dfa770e5" gracePeriod=30 Dec 02 09:48:00 crc kubenswrapper[4781]: I1202 09:48:00.874147 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="472a52b0-7b96-407f-b36c-7e8c92c76bac" containerName="ceilometer-central-agent" containerID="cri-o://024bbf683545eb8333abe0203ba1dca85c77f123c8c9df3ac4a2f09f36d3681a" gracePeriod=30 Dec 02 09:48:01 crc kubenswrapper[4781]: E1202 09:48:01.052567 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 09:48:01 crc kubenswrapper[4781]: I1202 09:48:01.352770 4781 generic.go:334] "Generic (PLEG): container finished" podID="472a52b0-7b96-407f-b36c-7e8c92c76bac" containerID="b0539f61148e24b8a62ccf4da71662962c570e3fe2c906134428f098660f2192" exitCode=0 Dec 02 09:48:01 crc kubenswrapper[4781]: I1202 09:48:01.352821 4781 generic.go:334] "Generic (PLEG): container finished" podID="472a52b0-7b96-407f-b36c-7e8c92c76bac" containerID="0ad8d61bd436aa2e62a5cc83f773ff2b3fcb23ed8cb91bd0d7458158dfa770e5" exitCode=2 Dec 02 09:48:01 crc kubenswrapper[4781]: I1202 09:48:01.352831 4781 generic.go:334] "Generic (PLEG): container finished" podID="472a52b0-7b96-407f-b36c-7e8c92c76bac" containerID="0f450970f1446b083f937448b4e2606ec998b88e83260d607b5b09300e66b68e" exitCode=0 Dec 02 09:48:01 crc kubenswrapper[4781]: I1202 09:48:01.352838 4781 generic.go:334] "Generic (PLEG): container finished" podID="472a52b0-7b96-407f-b36c-7e8c92c76bac" containerID="024bbf683545eb8333abe0203ba1dca85c77f123c8c9df3ac4a2f09f36d3681a" exitCode=0 Dec 02 09:48:01 crc kubenswrapper[4781]: I1202 09:48:01.352897 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"472a52b0-7b96-407f-b36c-7e8c92c76bac","Type":"ContainerDied","Data":"b0539f61148e24b8a62ccf4da71662962c570e3fe2c906134428f098660f2192"} Dec 02 09:48:01 crc kubenswrapper[4781]: I1202 
09:48:01.353040 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"472a52b0-7b96-407f-b36c-7e8c92c76bac","Type":"ContainerDied","Data":"0ad8d61bd436aa2e62a5cc83f773ff2b3fcb23ed8cb91bd0d7458158dfa770e5"} Dec 02 09:48:01 crc kubenswrapper[4781]: I1202 09:48:01.353058 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"472a52b0-7b96-407f-b36c-7e8c92c76bac","Type":"ContainerDied","Data":"0f450970f1446b083f937448b4e2606ec998b88e83260d607b5b09300e66b68e"} Dec 02 09:48:01 crc kubenswrapper[4781]: I1202 09:48:01.353074 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"472a52b0-7b96-407f-b36c-7e8c92c76bac","Type":"ContainerDied","Data":"024bbf683545eb8333abe0203ba1dca85c77f123c8c9df3ac4a2f09f36d3681a"} Dec 02 09:48:01 crc kubenswrapper[4781]: I1202 09:48:01.358194 4781 generic.go:334] "Generic (PLEG): container finished" podID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerID="311057b514d104f4f4ba1bad8b86e944a9708a86065bf40bd45c3f422ad024ed" exitCode=0 Dec 02 09:48:01 crc kubenswrapper[4781]: I1202 09:48:01.358270 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" event={"ID":"e10258da-dad3-4df8-82c2-9d9438493a3d","Type":"ContainerDied","Data":"311057b514d104f4f4ba1bad8b86e944a9708a86065bf40bd45c3f422ad024ed"} Dec 02 09:48:01 crc kubenswrapper[4781]: I1202 09:48:01.358342 4781 scope.go:117] "RemoveContainer" containerID="da048919704efdd3fa5ef85ad71184cb8e182089a953151e1c406db7fb59ab6e" Dec 02 09:48:01 crc kubenswrapper[4781]: I1202 09:48:01.359131 4781 scope.go:117] "RemoveContainer" containerID="311057b514d104f4f4ba1bad8b86e944a9708a86065bf40bd45c3f422ad024ed" Dec 02 09:48:01 crc kubenswrapper[4781]: E1202 09:48:01.359468 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 09:48:01 crc kubenswrapper[4781]: I1202 09:48:01.809834 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 09:48:01 crc kubenswrapper[4781]: I1202 09:48:01.953425 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/472a52b0-7b96-407f-b36c-7e8c92c76bac-log-httpd\") pod \"472a52b0-7b96-407f-b36c-7e8c92c76bac\" (UID: \"472a52b0-7b96-407f-b36c-7e8c92c76bac\") " Dec 02 09:48:01 crc kubenswrapper[4781]: I1202 09:48:01.953590 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/472a52b0-7b96-407f-b36c-7e8c92c76bac-config-data\") pod \"472a52b0-7b96-407f-b36c-7e8c92c76bac\" (UID: \"472a52b0-7b96-407f-b36c-7e8c92c76bac\") " Dec 02 09:48:01 crc kubenswrapper[4781]: I1202 09:48:01.953622 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fssw\" (UniqueName: \"kubernetes.io/projected/472a52b0-7b96-407f-b36c-7e8c92c76bac-kube-api-access-7fssw\") pod \"472a52b0-7b96-407f-b36c-7e8c92c76bac\" (UID: \"472a52b0-7b96-407f-b36c-7e8c92c76bac\") " Dec 02 09:48:01 crc kubenswrapper[4781]: I1202 09:48:01.953664 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/472a52b0-7b96-407f-b36c-7e8c92c76bac-scripts\") pod \"472a52b0-7b96-407f-b36c-7e8c92c76bac\" (UID: \"472a52b0-7b96-407f-b36c-7e8c92c76bac\") " Dec 02 09:48:01 crc kubenswrapper[4781]: I1202 09:48:01.953703 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/472a52b0-7b96-407f-b36c-7e8c92c76bac-ceilometer-tls-certs\") pod \"472a52b0-7b96-407f-b36c-7e8c92c76bac\" (UID: \"472a52b0-7b96-407f-b36c-7e8c92c76bac\") " Dec 02 09:48:01 crc kubenswrapper[4781]: I1202 09:48:01.953779 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/472a52b0-7b96-407f-b36c-7e8c92c76bac-combined-ca-bundle\") pod \"472a52b0-7b96-407f-b36c-7e8c92c76bac\" (UID: \"472a52b0-7b96-407f-b36c-7e8c92c76bac\") " Dec 02 09:48:01 crc kubenswrapper[4781]: I1202 09:48:01.953846 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/472a52b0-7b96-407f-b36c-7e8c92c76bac-sg-core-conf-yaml\") pod \"472a52b0-7b96-407f-b36c-7e8c92c76bac\" (UID: \"472a52b0-7b96-407f-b36c-7e8c92c76bac\") " Dec 02 09:48:01 crc kubenswrapper[4781]: I1202 09:48:01.953865 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/472a52b0-7b96-407f-b36c-7e8c92c76bac-run-httpd\") pod \"472a52b0-7b96-407f-b36c-7e8c92c76bac\" (UID: \"472a52b0-7b96-407f-b36c-7e8c92c76bac\") " Dec 02 09:48:01 crc kubenswrapper[4781]: I1202 09:48:01.954018 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/472a52b0-7b96-407f-b36c-7e8c92c76bac-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "472a52b0-7b96-407f-b36c-7e8c92c76bac" (UID: "472a52b0-7b96-407f-b36c-7e8c92c76bac"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:48:01 crc kubenswrapper[4781]: I1202 09:48:01.954422 4781 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/472a52b0-7b96-407f-b36c-7e8c92c76bac-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 09:48:01 crc kubenswrapper[4781]: I1202 09:48:01.954947 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/472a52b0-7b96-407f-b36c-7e8c92c76bac-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "472a52b0-7b96-407f-b36c-7e8c92c76bac" (UID: "472a52b0-7b96-407f-b36c-7e8c92c76bac"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:48:01 crc kubenswrapper[4781]: I1202 09:48:01.959622 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/472a52b0-7b96-407f-b36c-7e8c92c76bac-scripts" (OuterVolumeSpecName: "scripts") pod "472a52b0-7b96-407f-b36c-7e8c92c76bac" (UID: "472a52b0-7b96-407f-b36c-7e8c92c76bac"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:48:01 crc kubenswrapper[4781]: I1202 09:48:01.962207 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/472a52b0-7b96-407f-b36c-7e8c92c76bac-kube-api-access-7fssw" (OuterVolumeSpecName: "kube-api-access-7fssw") pod "472a52b0-7b96-407f-b36c-7e8c92c76bac" (UID: "472a52b0-7b96-407f-b36c-7e8c92c76bac"). InnerVolumeSpecName "kube-api-access-7fssw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:48:01 crc kubenswrapper[4781]: I1202 09:48:01.985011 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/472a52b0-7b96-407f-b36c-7e8c92c76bac-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "472a52b0-7b96-407f-b36c-7e8c92c76bac" (UID: "472a52b0-7b96-407f-b36c-7e8c92c76bac"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.019240 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/472a52b0-7b96-407f-b36c-7e8c92c76bac-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "472a52b0-7b96-407f-b36c-7e8c92c76bac" (UID: "472a52b0-7b96-407f-b36c-7e8c92c76bac"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.034089 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/472a52b0-7b96-407f-b36c-7e8c92c76bac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "472a52b0-7b96-407f-b36c-7e8c92c76bac" (UID: "472a52b0-7b96-407f-b36c-7e8c92c76bac"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.055670 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fssw\" (UniqueName: \"kubernetes.io/projected/472a52b0-7b96-407f-b36c-7e8c92c76bac-kube-api-access-7fssw\") on node \"crc\" DevicePath \"\"" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.055698 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/472a52b0-7b96-407f-b36c-7e8c92c76bac-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.055707 4781 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/472a52b0-7b96-407f-b36c-7e8c92c76bac-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.055716 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/472a52b0-7b96-407f-b36c-7e8c92c76bac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.055727 4781 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/472a52b0-7b96-407f-b36c-7e8c92c76bac-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.055734 4781 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/472a52b0-7b96-407f-b36c-7e8c92c76bac-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.059667 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/472a52b0-7b96-407f-b36c-7e8c92c76bac-config-data" (OuterVolumeSpecName: "config-data") pod "472a52b0-7b96-407f-b36c-7e8c92c76bac" (UID: "472a52b0-7b96-407f-b36c-7e8c92c76bac"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.156903 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/472a52b0-7b96-407f-b36c-7e8c92c76bac-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.371690 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"472a52b0-7b96-407f-b36c-7e8c92c76bac","Type":"ContainerDied","Data":"70827a8fd73b8ae5439bcd78126055c66aa1fa802bc7a938adf6e3d5346427fa"} Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.371759 4781 scope.go:117] "RemoveContainer" containerID="b0539f61148e24b8a62ccf4da71662962c570e3fe2c906134428f098660f2192" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.371888 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.392344 4781 scope.go:117] "RemoveContainer" containerID="0ad8d61bd436aa2e62a5cc83f773ff2b3fcb23ed8cb91bd0d7458158dfa770e5" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.408256 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.412139 4781 scope.go:117] "RemoveContainer" containerID="0f450970f1446b083f937448b4e2606ec998b88e83260d607b5b09300e66b68e" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.419237 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.434694 4781 scope.go:117] "RemoveContainer" containerID="024bbf683545eb8333abe0203ba1dca85c77f123c8c9df3ac4a2f09f36d3681a" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.439037 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 09:48:02 crc kubenswrapper[4781]: E1202 09:48:02.439455 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="472a52b0-7b96-407f-b36c-7e8c92c76bac" containerName="ceilometer-notification-agent" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.439474 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="472a52b0-7b96-407f-b36c-7e8c92c76bac" containerName="ceilometer-notification-agent" Dec 02 09:48:02 crc kubenswrapper[4781]: E1202 09:48:02.439490 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="472a52b0-7b96-407f-b36c-7e8c92c76bac" containerName="proxy-httpd" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.439497 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="472a52b0-7b96-407f-b36c-7e8c92c76bac" containerName="proxy-httpd" Dec 02 09:48:02 crc kubenswrapper[4781]: E1202 09:48:02.439507 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="472a52b0-7b96-407f-b36c-7e8c92c76bac" containerName="sg-core" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.439515 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="472a52b0-7b96-407f-b36c-7e8c92c76bac" containerName="sg-core" Dec 02 09:48:02 crc kubenswrapper[4781]: E1202 09:48:02.439557 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="472a52b0-7b96-407f-b36c-7e8c92c76bac" containerName="ceilometer-central-agent" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.439564 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="472a52b0-7b96-407f-b36c-7e8c92c76bac" containerName="ceilometer-central-agent" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.439753 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="472a52b0-7b96-407f-b36c-7e8c92c76bac" containerName="proxy-httpd" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.439772 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="472a52b0-7b96-407f-b36c-7e8c92c76bac" containerName="sg-core" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.439790 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="472a52b0-7b96-407f-b36c-7e8c92c76bac" containerName="ceilometer-notification-agent" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.439802 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="472a52b0-7b96-407f-b36c-7e8c92c76bac" containerName="ceilometer-central-agent" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.442141 4781 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.445130 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.445950 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.445970 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.455093 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.564841 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5df87147-fc67-47b3-9551-1143bf142dc3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5df87147-fc67-47b3-9551-1143bf142dc3\") " pod="openstack/ceilometer-0" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.565511 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5df87147-fc67-47b3-9551-1143bf142dc3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5df87147-fc67-47b3-9551-1143bf142dc3\") " pod="openstack/ceilometer-0" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.565681 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5df87147-fc67-47b3-9551-1143bf142dc3-log-httpd\") pod \"ceilometer-0\" (UID: \"5df87147-fc67-47b3-9551-1143bf142dc3\") " pod="openstack/ceilometer-0" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.565838 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5df87147-fc67-47b3-9551-1143bf142dc3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5df87147-fc67-47b3-9551-1143bf142dc3\") " pod="openstack/ceilometer-0" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.566085 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r67dc\" (UniqueName: \"kubernetes.io/projected/5df87147-fc67-47b3-9551-1143bf142dc3-kube-api-access-r67dc\") pod \"ceilometer-0\" (UID: \"5df87147-fc67-47b3-9551-1143bf142dc3\") " pod="openstack/ceilometer-0" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.566267 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5df87147-fc67-47b3-9551-1143bf142dc3-run-httpd\") pod \"ceilometer-0\" (UID: \"5df87147-fc67-47b3-9551-1143bf142dc3\") " pod="openstack/ceilometer-0" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.566407 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5df87147-fc67-47b3-9551-1143bf142dc3-config-data\") pod \"ceilometer-0\" (UID: \"5df87147-fc67-47b3-9551-1143bf142dc3\") " pod="openstack/ceilometer-0" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.566457 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/5df87147-fc67-47b3-9551-1143bf142dc3-scripts\") pod \"ceilometer-0\" (UID: \"5df87147-fc67-47b3-9551-1143bf142dc3\") " pod="openstack/ceilometer-0" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.668260 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5df87147-fc67-47b3-9551-1143bf142dc3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5df87147-fc67-47b3-9551-1143bf142dc3\") " pod="openstack/ceilometer-0" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.668329 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r67dc\" (UniqueName: \"kubernetes.io/projected/5df87147-fc67-47b3-9551-1143bf142dc3-kube-api-access-r67dc\") pod \"ceilometer-0\" (UID: \"5df87147-fc67-47b3-9551-1143bf142dc3\") " pod="openstack/ceilometer-0" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.668365 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5df87147-fc67-47b3-9551-1143bf142dc3-run-httpd\") pod \"ceilometer-0\" (UID: \"5df87147-fc67-47b3-9551-1143bf142dc3\") " pod="openstack/ceilometer-0" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.668429 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5df87147-fc67-47b3-9551-1143bf142dc3-config-data\") pod \"ceilometer-0\" (UID: \"5df87147-fc67-47b3-9551-1143bf142dc3\") " pod="openstack/ceilometer-0" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.668867 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5df87147-fc67-47b3-9551-1143bf142dc3-run-httpd\") pod \"ceilometer-0\" (UID: \"5df87147-fc67-47b3-9551-1143bf142dc3\") " pod="openstack/ceilometer-0" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.669138 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5df87147-fc67-47b3-9551-1143bf142dc3-scripts\") pod \"ceilometer-0\" (UID: \"5df87147-fc67-47b3-9551-1143bf142dc3\") " pod="openstack/ceilometer-0" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.669271 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5df87147-fc67-47b3-9551-1143bf142dc3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5df87147-fc67-47b3-9551-1143bf142dc3\") " pod="openstack/ceilometer-0" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.669297 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5df87147-fc67-47b3-9551-1143bf142dc3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5df87147-fc67-47b3-9551-1143bf142dc3\") " pod="openstack/ceilometer-0" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.669316 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5df87147-fc67-47b3-9551-1143bf142dc3-log-httpd\") pod \"ceilometer-0\" (UID: \"5df87147-fc67-47b3-9551-1143bf142dc3\") " pod="openstack/ceilometer-0" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.669586 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/5df87147-fc67-47b3-9551-1143bf142dc3-log-httpd\") pod \"ceilometer-0\" (UID: \"5df87147-fc67-47b3-9551-1143bf142dc3\") " pod="openstack/ceilometer-0" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.673236 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5df87147-fc67-47b3-9551-1143bf142dc3-scripts\") pod \"ceilometer-0\" (UID: \"5df87147-fc67-47b3-9551-1143bf142dc3\") " pod="openstack/ceilometer-0" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.673438 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5df87147-fc67-47b3-9551-1143bf142dc3-config-data\") pod \"ceilometer-0\" (UID: \"5df87147-fc67-47b3-9551-1143bf142dc3\") " pod="openstack/ceilometer-0" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.673544 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5df87147-fc67-47b3-9551-1143bf142dc3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5df87147-fc67-47b3-9551-1143bf142dc3\") " pod="openstack/ceilometer-0" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.674133 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5df87147-fc67-47b3-9551-1143bf142dc3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5df87147-fc67-47b3-9551-1143bf142dc3\") " pod="openstack/ceilometer-0" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.674418 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5df87147-fc67-47b3-9551-1143bf142dc3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5df87147-fc67-47b3-9551-1143bf142dc3\") " pod="openstack/ceilometer-0" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.686420 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r67dc\" (UniqueName: \"kubernetes.io/projected/5df87147-fc67-47b3-9551-1143bf142dc3-kube-api-access-r67dc\") pod \"ceilometer-0\" (UID: \"5df87147-fc67-47b3-9551-1143bf142dc3\") " pod="openstack/ceilometer-0" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.760432 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 09:48:02 crc kubenswrapper[4781]: I1202 09:48:02.813607 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 09:48:03 crc kubenswrapper[4781]: I1202 09:48:03.218456 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 09:48:03 crc kubenswrapper[4781]: I1202 09:48:03.385091 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5df87147-fc67-47b3-9551-1143bf142dc3","Type":"ContainerStarted","Data":"f1ac892e94b805ce3a283e658fed0e5f1b1e24857e37208611ee280cab367580"} Dec 02 09:48:03 crc kubenswrapper[4781]: I1202 09:48:03.520709 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="472a52b0-7b96-407f-b36c-7e8c92c76bac" path="/var/lib/kubelet/pods/472a52b0-7b96-407f-b36c-7e8c92c76bac/volumes" Dec 02 09:48:03 crc kubenswrapper[4781]: I1202 09:48:03.611684 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 02 09:48:03 crc kubenswrapper[4781]: I1202 09:48:03.629877 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 02 09:48:03 crc kubenswrapper[4781]: I1202 09:48:03.641565 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 02 09:48:03 crc kubenswrapper[4781]: I1202 09:48:03.641625 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 02 09:48:04 crc kubenswrapper[4781]: I1202 09:48:04.414651 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 02 09:48:04 crc kubenswrapper[4781]: I1202 09:48:04.661422 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3a400f59-49d8-4e6d-8a84-24b680b052cf" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 09:48:04 crc kubenswrapper[4781]: I1202 09:48:04.661482 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3a400f59-49d8-4e6d-8a84-24b680b052cf" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 09:48:04 crc kubenswrapper[4781]: I1202 09:48:04.704440 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-mhsth"] Dec 02 09:48:04 crc kubenswrapper[4781]: I1202 09:48:04.707311 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mhsth" Dec 02 09:48:04 crc kubenswrapper[4781]: I1202 09:48:04.711670 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 02 09:48:04 crc kubenswrapper[4781]: I1202 09:48:04.711869 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 02 09:48:04 crc kubenswrapper[4781]: I1202 09:48:04.720040 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-mhsth"] Dec 02 09:48:04 crc kubenswrapper[4781]: I1202 09:48:04.813310 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e7e36fe-b5d1-4dfa-827a-0bea1b098c07-config-data\") pod \"nova-cell1-cell-mapping-mhsth\" (UID: \"7e7e36fe-b5d1-4dfa-827a-0bea1b098c07\") " pod="openstack/nova-cell1-cell-mapping-mhsth" Dec 02 09:48:04 crc kubenswrapper[4781]: I1202 09:48:04.813643 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lx4p\" (UniqueName: \"kubernetes.io/projected/7e7e36fe-b5d1-4dfa-827a-0bea1b098c07-kube-api-access-2lx4p\") pod \"nova-cell1-cell-mapping-mhsth\" (UID: \"7e7e36fe-b5d1-4dfa-827a-0bea1b098c07\") " pod="openstack/nova-cell1-cell-mapping-mhsth" Dec 02 09:48:04 crc kubenswrapper[4781]: I1202 09:48:04.813763 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e7e36fe-b5d1-4dfa-827a-0bea1b098c07-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mhsth\" (UID: \"7e7e36fe-b5d1-4dfa-827a-0bea1b098c07\") " pod="openstack/nova-cell1-cell-mapping-mhsth" Dec 02 09:48:04 crc kubenswrapper[4781]: I1202 09:48:04.814033 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e7e36fe-b5d1-4dfa-827a-0bea1b098c07-scripts\") pod \"nova-cell1-cell-mapping-mhsth\" (UID: \"7e7e36fe-b5d1-4dfa-827a-0bea1b098c07\") " pod="openstack/nova-cell1-cell-mapping-mhsth" Dec 02 09:48:04 crc kubenswrapper[4781]: I1202 09:48:04.915409 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e7e36fe-b5d1-4dfa-827a-0bea1b098c07-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mhsth\" (UID: \"7e7e36fe-b5d1-4dfa-827a-0bea1b098c07\") " pod="openstack/nova-cell1-cell-mapping-mhsth" Dec 02 09:48:04 crc kubenswrapper[4781]: I1202 09:48:04.915538 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e7e36fe-b5d1-4dfa-827a-0bea1b098c07-scripts\") pod \"nova-cell1-cell-mapping-mhsth\" (UID: \"7e7e36fe-b5d1-4dfa-827a-0bea1b098c07\") " pod="openstack/nova-cell1-cell-mapping-mhsth" Dec 02 09:48:04 crc kubenswrapper[4781]: I1202 09:48:04.915633 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e7e36fe-b5d1-4dfa-827a-0bea1b098c07-config-data\") pod \"nova-cell1-cell-mapping-mhsth\" (UID: \"7e7e36fe-b5d1-4dfa-827a-0bea1b098c07\") " pod="openstack/nova-cell1-cell-mapping-mhsth" Dec 02 09:48:04 crc kubenswrapper[4781]: I1202 09:48:04.915684 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lx4p\" (UniqueName: 
\"kubernetes.io/projected/7e7e36fe-b5d1-4dfa-827a-0bea1b098c07-kube-api-access-2lx4p\") pod \"nova-cell1-cell-mapping-mhsth\" (UID: \"7e7e36fe-b5d1-4dfa-827a-0bea1b098c07\") " pod="openstack/nova-cell1-cell-mapping-mhsth" Dec 02 09:48:04 crc kubenswrapper[4781]: I1202 09:48:04.923488 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e7e36fe-b5d1-4dfa-827a-0bea1b098c07-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mhsth\" (UID: \"7e7e36fe-b5d1-4dfa-827a-0bea1b098c07\") " pod="openstack/nova-cell1-cell-mapping-mhsth" Dec 02 09:48:04 crc kubenswrapper[4781]: I1202 09:48:04.923944 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e7e36fe-b5d1-4dfa-827a-0bea1b098c07-scripts\") pod \"nova-cell1-cell-mapping-mhsth\" (UID: \"7e7e36fe-b5d1-4dfa-827a-0bea1b098c07\") " pod="openstack/nova-cell1-cell-mapping-mhsth" Dec 02 09:48:04 crc kubenswrapper[4781]: I1202 09:48:04.929606 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e7e36fe-b5d1-4dfa-827a-0bea1b098c07-config-data\") pod \"nova-cell1-cell-mapping-mhsth\" (UID: \"7e7e36fe-b5d1-4dfa-827a-0bea1b098c07\") " pod="openstack/nova-cell1-cell-mapping-mhsth" Dec 02 09:48:04 crc kubenswrapper[4781]: I1202 09:48:04.934324 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lx4p\" (UniqueName: \"kubernetes.io/projected/7e7e36fe-b5d1-4dfa-827a-0bea1b098c07-kube-api-access-2lx4p\") pod \"nova-cell1-cell-mapping-mhsth\" (UID: \"7e7e36fe-b5d1-4dfa-827a-0bea1b098c07\") " pod="openstack/nova-cell1-cell-mapping-mhsth" Dec 02 09:48:05 crc kubenswrapper[4781]: I1202 09:48:05.028575 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mhsth" Dec 02 09:48:05 crc kubenswrapper[4781]: I1202 09:48:05.348767 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.195:8774/\": dial tcp 10.217.0.195:8774: connect: connection refused" Dec 02 09:48:05 crc kubenswrapper[4781]: I1202 09:48:05.348784 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.195:8774/\": dial tcp 10.217.0.195:8774: connect: connection refused" Dec 02 09:48:05 crc kubenswrapper[4781]: I1202 09:48:05.412908 4781 generic.go:334] "Generic (PLEG): container finished" podID="e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d" containerID="478cd18c4c1a4759a237a68b2bac6ad5cd6591aa5d48537bf10f8ec5257aec34" exitCode=0 Dec 02 09:48:05 crc kubenswrapper[4781]: I1202 09:48:05.413888 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d","Type":"ContainerDied","Data":"478cd18c4c1a4759a237a68b2bac6ad5cd6591aa5d48537bf10f8ec5257aec34"} Dec 02 09:48:05 crc kubenswrapper[4781]: I1202 09:48:05.587178 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-mhsth"] Dec 02 09:48:05 crc kubenswrapper[4781]: I1202 09:48:05.767090 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 09:48:05 crc kubenswrapper[4781]: I1202 09:48:05.835903 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d-combined-ca-bundle\") pod \"e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d\" (UID: \"e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d\") " Dec 02 09:48:05 crc kubenswrapper[4781]: I1202 09:48:05.836100 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8kxz\" (UniqueName: \"kubernetes.io/projected/e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d-kube-api-access-r8kxz\") pod \"e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d\" (UID: \"e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d\") " Dec 02 09:48:05 crc kubenswrapper[4781]: I1202 09:48:05.836155 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d-logs\") pod \"e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d\" (UID: \"e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d\") " Dec 02 09:48:05 crc kubenswrapper[4781]: I1202 09:48:05.836187 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d-config-data\") pod \"e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d\" (UID: \"e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d\") " Dec 02 09:48:05 crc kubenswrapper[4781]: I1202 09:48:05.837542 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d-logs" (OuterVolumeSpecName: "logs") pod "e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d" (UID: "e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:48:05 crc kubenswrapper[4781]: I1202 09:48:05.848308 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d-kube-api-access-r8kxz" (OuterVolumeSpecName: "kube-api-access-r8kxz") pod "e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d" (UID: "e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d"). InnerVolumeSpecName "kube-api-access-r8kxz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:48:05 crc kubenswrapper[4781]: I1202 09:48:05.879266 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d" (UID: "e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:48:05 crc kubenswrapper[4781]: I1202 09:48:05.885760 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d-config-data" (OuterVolumeSpecName: "config-data") pod "e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d" (UID: "e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:48:05 crc kubenswrapper[4781]: I1202 09:48:05.942456 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:48:05 crc kubenswrapper[4781]: I1202 09:48:05.942654 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8kxz\" (UniqueName: \"kubernetes.io/projected/e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d-kube-api-access-r8kxz\") on node \"crc\" DevicePath \"\"" Dec 02 09:48:05 crc kubenswrapper[4781]: I1202 09:48:05.942665 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d-logs\") on node \"crc\" DevicePath \"\"" Dec 02 09:48:05 crc kubenswrapper[4781]: I1202 09:48:05.942673 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:48:06 crc kubenswrapper[4781]: I1202 09:48:06.427019 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5df87147-fc67-47b3-9551-1143bf142dc3","Type":"ContainerStarted","Data":"75bb95875f158308acd19829d2254a511e24dd067d4195488cff12efd993368f"} Dec 02 09:48:06 crc kubenswrapper[4781]: I1202 09:48:06.429682 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d","Type":"ContainerDied","Data":"3c4eaaf1abd9b3b9c92cefacc1774cc2530c9e2a2d2dbf59ab7af452f054d713"} Dec 02 09:48:06 crc kubenswrapper[4781]: I1202 09:48:06.429743 4781 scope.go:117] "RemoveContainer" containerID="478cd18c4c1a4759a237a68b2bac6ad5cd6591aa5d48537bf10f8ec5257aec34" Dec 02 09:48:06 crc kubenswrapper[4781]: I1202 09:48:06.429909 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 09:48:06 crc kubenswrapper[4781]: I1202 09:48:06.442740 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mhsth" event={"ID":"7e7e36fe-b5d1-4dfa-827a-0bea1b098c07","Type":"ContainerStarted","Data":"38c79d0e42445913d308d278577a2e36da560fc134a0ce61a7bacd91900b4887"} Dec 02 09:48:06 crc kubenswrapper[4781]: I1202 09:48:06.443058 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mhsth" event={"ID":"7e7e36fe-b5d1-4dfa-827a-0bea1b098c07","Type":"ContainerStarted","Data":"4dba8a03c1fbddd638761ad863d75ce7134e7b96e523abd7966582545ecc20e8"} Dec 02 09:48:06 crc kubenswrapper[4781]: I1202 09:48:06.463025 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-mhsth" podStartSLOduration=2.463006642 podStartE2EDuration="2.463006642s" podCreationTimestamp="2025-12-02 09:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:48:06.461226524 +0000 UTC m=+1649.285100403" watchObservedRunningTime="2025-12-02 09:48:06.463006642 +0000 UTC m=+1649.286880511" Dec 02 09:48:06 crc kubenswrapper[4781]: I1202 09:48:06.464796 4781 scope.go:117] "RemoveContainer" containerID="1e9c253db3d4499c4421a0be29f78f56f901c48e43e105abff413d540f00a655" Dec 02 09:48:06 crc kubenswrapper[4781]: I1202 09:48:06.486311 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 09:48:06 crc kubenswrapper[4781]: I1202 09:48:06.497960 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 02 09:48:06 crc kubenswrapper[4781]: I1202 09:48:06.510159 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 02 09:48:06 crc kubenswrapper[4781]: E1202 09:48:06.510768 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d" containerName="nova-api-log" Dec 02 09:48:06 crc kubenswrapper[4781]: I1202 09:48:06.510784 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d" containerName="nova-api-log" Dec 02 09:48:06 crc kubenswrapper[4781]: E1202 09:48:06.510807 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d" containerName="nova-api-api" Dec 02 09:48:06 crc kubenswrapper[4781]: I1202 09:48:06.510814 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d" containerName="nova-api-api" Dec 02 09:48:06 crc kubenswrapper[4781]: I1202 09:48:06.511124 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d" containerName="nova-api-api" Dec 02 09:48:06 crc kubenswrapper[4781]: I1202 09:48:06.511138 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d" containerName="nova-api-log" Dec 02 09:48:06 crc kubenswrapper[4781]: I1202 09:48:06.513142 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 09:48:06 crc kubenswrapper[4781]: I1202 09:48:06.523941 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 02 09:48:06 crc kubenswrapper[4781]: I1202 09:48:06.524836 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 02 09:48:06 crc kubenswrapper[4781]: I1202 09:48:06.525043 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 02 09:48:06 crc kubenswrapper[4781]: I1202 09:48:06.542146 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 09:48:06 crc kubenswrapper[4781]: I1202 09:48:06.668798 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f54cfd5-d3f9-407a-bb1a-c19784895a9a-public-tls-certs\") pod \"nova-api-0\" (UID: \"2f54cfd5-d3f9-407a-bb1a-c19784895a9a\") " pod="openstack/nova-api-0" Dec 02 09:48:06 crc kubenswrapper[4781]: I1202 09:48:06.669084 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f54cfd5-d3f9-407a-bb1a-c19784895a9a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2f54cfd5-d3f9-407a-bb1a-c19784895a9a\") " pod="openstack/nova-api-0" Dec 02 09:48:06 crc kubenswrapper[4781]: I1202 09:48:06.669126 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f54cfd5-d3f9-407a-bb1a-c19784895a9a-logs\") pod \"nova-api-0\" (UID: \"2f54cfd5-d3f9-407a-bb1a-c19784895a9a\") " pod="openstack/nova-api-0" Dec 02 09:48:06 crc kubenswrapper[4781]: I1202 09:48:06.669302 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f54cfd5-d3f9-407a-bb1a-c19784895a9a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2f54cfd5-d3f9-407a-bb1a-c19784895a9a\") " pod="openstack/nova-api-0" Dec 02 09:48:06 crc kubenswrapper[4781]: I1202 09:48:06.669440 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f54cfd5-d3f9-407a-bb1a-c19784895a9a-config-data\") pod \"nova-api-0\" (UID: \"2f54cfd5-d3f9-407a-bb1a-c19784895a9a\") " pod="openstack/nova-api-0" Dec 02 09:48:06 crc kubenswrapper[4781]: I1202 09:48:06.669527 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjmhd\" (UniqueName: \"kubernetes.io/projected/2f54cfd5-d3f9-407a-bb1a-c19784895a9a-kube-api-access-tjmhd\") pod \"nova-api-0\" (UID: \"2f54cfd5-d3f9-407a-bb1a-c19784895a9a\") " pod="openstack/nova-api-0" Dec 02 09:48:06 crc kubenswrapper[4781]: I1202 09:48:06.771237 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjmhd\" (UniqueName: \"kubernetes.io/projected/2f54cfd5-d3f9-407a-bb1a-c19784895a9a-kube-api-access-tjmhd\") pod \"nova-api-0\" (UID: \"2f54cfd5-d3f9-407a-bb1a-c19784895a9a\") " pod="openstack/nova-api-0" Dec 02 09:48:06 crc kubenswrapper[4781]: I1202 09:48:06.771619 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f54cfd5-d3f9-407a-bb1a-c19784895a9a-public-tls-certs\") pod 
\"nova-api-0\" (UID: \"2f54cfd5-d3f9-407a-bb1a-c19784895a9a\") " pod="openstack/nova-api-0" Dec 02 09:48:06 crc kubenswrapper[4781]: I1202 09:48:06.771818 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f54cfd5-d3f9-407a-bb1a-c19784895a9a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2f54cfd5-d3f9-407a-bb1a-c19784895a9a\") " pod="openstack/nova-api-0" Dec 02 09:48:06 crc kubenswrapper[4781]: I1202 09:48:06.771918 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f54cfd5-d3f9-407a-bb1a-c19784895a9a-logs\") pod \"nova-api-0\" (UID: \"2f54cfd5-d3f9-407a-bb1a-c19784895a9a\") " pod="openstack/nova-api-0" Dec 02 09:48:06 crc kubenswrapper[4781]: I1202 09:48:06.772100 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f54cfd5-d3f9-407a-bb1a-c19784895a9a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2f54cfd5-d3f9-407a-bb1a-c19784895a9a\") " pod="openstack/nova-api-0" Dec 02 09:48:06 crc kubenswrapper[4781]: I1202 09:48:06.772205 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f54cfd5-d3f9-407a-bb1a-c19784895a9a-config-data\") pod \"nova-api-0\" (UID: \"2f54cfd5-d3f9-407a-bb1a-c19784895a9a\") " pod="openstack/nova-api-0" Dec 02 09:48:06 crc kubenswrapper[4781]: I1202 09:48:06.772448 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f54cfd5-d3f9-407a-bb1a-c19784895a9a-logs\") pod \"nova-api-0\" (UID: \"2f54cfd5-d3f9-407a-bb1a-c19784895a9a\") " pod="openstack/nova-api-0" Dec 02 09:48:06 crc kubenswrapper[4781]: I1202 09:48:06.776638 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f54cfd5-d3f9-407a-bb1a-c19784895a9a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2f54cfd5-d3f9-407a-bb1a-c19784895a9a\") " pod="openstack/nova-api-0" Dec 02 09:48:06 crc kubenswrapper[4781]: I1202 09:48:06.785403 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f54cfd5-d3f9-407a-bb1a-c19784895a9a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2f54cfd5-d3f9-407a-bb1a-c19784895a9a\") " pod="openstack/nova-api-0" Dec 02 09:48:06 crc kubenswrapper[4781]: I1202 09:48:06.786441 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f54cfd5-d3f9-407a-bb1a-c19784895a9a-public-tls-certs\") pod \"nova-api-0\" (UID: \"2f54cfd5-d3f9-407a-bb1a-c19784895a9a\") " pod="openstack/nova-api-0" Dec 02 09:48:06 crc kubenswrapper[4781]: I1202 09:48:06.788898 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjmhd\" (UniqueName: \"kubernetes.io/projected/2f54cfd5-d3f9-407a-bb1a-c19784895a9a-kube-api-access-tjmhd\") pod \"nova-api-0\" (UID: \"2f54cfd5-d3f9-407a-bb1a-c19784895a9a\") " pod="openstack/nova-api-0" Dec 02 09:48:06 crc kubenswrapper[4781]: I1202 09:48:06.794686 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f54cfd5-d3f9-407a-bb1a-c19784895a9a-config-data\") pod \"nova-api-0\" (UID: \"2f54cfd5-d3f9-407a-bb1a-c19784895a9a\") " pod="openstack/nova-api-0" Dec 
02 09:48:06 crc kubenswrapper[4781]: I1202 09:48:06.875281 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 09:48:06 crc kubenswrapper[4781]: I1202 09:48:06.887150 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-89c5cd4d5-vhxb5" Dec 02 09:48:07 crc kubenswrapper[4781]: I1202 09:48:07.060938 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-pxpg7"] Dec 02 09:48:07 crc kubenswrapper[4781]: I1202 09:48:07.061434 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757b4f8459-pxpg7" podUID="9045104f-d244-43d4-850a-68e15766bd31" containerName="dnsmasq-dns" containerID="cri-o://5006a11108b361c8fefabb1ec4b44136301695beac4953b4a8b8747be8007e00" gracePeriod=10 Dec 02 09:48:07 crc kubenswrapper[4781]: I1202 09:48:07.459148 4781 generic.go:334] "Generic (PLEG): container finished" podID="9045104f-d244-43d4-850a-68e15766bd31" containerID="5006a11108b361c8fefabb1ec4b44136301695beac4953b4a8b8747be8007e00" exitCode=0 Dec 02 09:48:07 crc kubenswrapper[4781]: I1202 09:48:07.459649 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-pxpg7" event={"ID":"9045104f-d244-43d4-850a-68e15766bd31","Type":"ContainerDied","Data":"5006a11108b361c8fefabb1ec4b44136301695beac4953b4a8b8747be8007e00"} Dec 02 09:48:07 crc kubenswrapper[4781]: I1202 09:48:07.461988 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5df87147-fc67-47b3-9551-1143bf142dc3","Type":"ContainerStarted","Data":"fa52bfea8831183e7bdd882347dc7f8d3ebb6067afe53bd358a1aaa902b59b2d"} Dec 02 09:48:07 crc kubenswrapper[4781]: I1202 09:48:07.538619 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d" path="/var/lib/kubelet/pods/e23b5ff7-d6e3-4cab-a02f-3ba82677cc0d/volumes" Dec 02 09:48:07 crc kubenswrapper[4781]: I1202 09:48:07.576604 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 09:48:07 crc kubenswrapper[4781]: I1202 09:48:07.683505 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-pxpg7" Dec 02 09:48:07 crc kubenswrapper[4781]: I1202 09:48:07.805727 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9045104f-d244-43d4-850a-68e15766bd31-dns-swift-storage-0\") pod \"9045104f-d244-43d4-850a-68e15766bd31\" (UID: \"9045104f-d244-43d4-850a-68e15766bd31\") " Dec 02 09:48:07 crc kubenswrapper[4781]: I1202 09:48:07.805790 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfvfk\" (UniqueName: \"kubernetes.io/projected/9045104f-d244-43d4-850a-68e15766bd31-kube-api-access-jfvfk\") pod \"9045104f-d244-43d4-850a-68e15766bd31\" (UID: \"9045104f-d244-43d4-850a-68e15766bd31\") " Dec 02 09:48:07 crc kubenswrapper[4781]: I1202 09:48:07.805904 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9045104f-d244-43d4-850a-68e15766bd31-config\") pod \"9045104f-d244-43d4-850a-68e15766bd31\" (UID: \"9045104f-d244-43d4-850a-68e15766bd31\") " Dec 02 09:48:07 crc kubenswrapper[4781]: I1202 09:48:07.805951 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9045104f-d244-43d4-850a-68e15766bd31-ovsdbserver-sb\") pod \"9045104f-d244-43d4-850a-68e15766bd31\" (UID: \"9045104f-d244-43d4-850a-68e15766bd31\") " Dec 02 09:48:07 crc kubenswrapper[4781]: I1202 09:48:07.806015 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9045104f-d244-43d4-850a-68e15766bd31-ovsdbserver-nb\") pod \"9045104f-d244-43d4-850a-68e15766bd31\" (UID: \"9045104f-d244-43d4-850a-68e15766bd31\") " Dec 02 09:48:07 crc kubenswrapper[4781]: I1202 09:48:07.806117 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9045104f-d244-43d4-850a-68e15766bd31-dns-svc\") pod \"9045104f-d244-43d4-850a-68e15766bd31\" (UID: \"9045104f-d244-43d4-850a-68e15766bd31\") " Dec 02 09:48:07 crc kubenswrapper[4781]: I1202 09:48:07.825210 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9045104f-d244-43d4-850a-68e15766bd31-kube-api-access-jfvfk" (OuterVolumeSpecName: "kube-api-access-jfvfk") pod "9045104f-d244-43d4-850a-68e15766bd31" (UID: "9045104f-d244-43d4-850a-68e15766bd31"). InnerVolumeSpecName "kube-api-access-jfvfk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:48:07 crc kubenswrapper[4781]: I1202 09:48:07.915138 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfvfk\" (UniqueName: \"kubernetes.io/projected/9045104f-d244-43d4-850a-68e15766bd31-kube-api-access-jfvfk\") on node \"crc\" DevicePath \"\"" Dec 02 09:48:08 crc kubenswrapper[4781]: I1202 09:48:08.015681 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9045104f-d244-43d4-850a-68e15766bd31-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9045104f-d244-43d4-850a-68e15766bd31" (UID: "9045104f-d244-43d4-850a-68e15766bd31"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:48:08 crc kubenswrapper[4781]: I1202 09:48:08.016390 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9045104f-d244-43d4-850a-68e15766bd31-ovsdbserver-sb\") pod \"9045104f-d244-43d4-850a-68e15766bd31\" (UID: \"9045104f-d244-43d4-850a-68e15766bd31\") " Dec 02 09:48:08 crc kubenswrapper[4781]: W1202 09:48:08.016507 4781 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/9045104f-d244-43d4-850a-68e15766bd31/volumes/kubernetes.io~configmap/ovsdbserver-sb Dec 02 09:48:08 crc kubenswrapper[4781]: I1202 09:48:08.016541 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9045104f-d244-43d4-850a-68e15766bd31-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9045104f-d244-43d4-850a-68e15766bd31" (UID: "9045104f-d244-43d4-850a-68e15766bd31"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:48:08 crc kubenswrapper[4781]: I1202 09:48:08.016856 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9045104f-d244-43d4-850a-68e15766bd31-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 09:48:08 crc kubenswrapper[4781]: I1202 09:48:08.031960 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9045104f-d244-43d4-850a-68e15766bd31-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9045104f-d244-43d4-850a-68e15766bd31" (UID: "9045104f-d244-43d4-850a-68e15766bd31"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:48:08 crc kubenswrapper[4781]: I1202 09:48:08.037325 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9045104f-d244-43d4-850a-68e15766bd31-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9045104f-d244-43d4-850a-68e15766bd31" (UID: "9045104f-d244-43d4-850a-68e15766bd31"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:48:08 crc kubenswrapper[4781]: I1202 09:48:08.043628 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9045104f-d244-43d4-850a-68e15766bd31-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9045104f-d244-43d4-850a-68e15766bd31" (UID: "9045104f-d244-43d4-850a-68e15766bd31"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:48:08 crc kubenswrapper[4781]: I1202 09:48:08.045405 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9045104f-d244-43d4-850a-68e15766bd31-config" (OuterVolumeSpecName: "config") pod "9045104f-d244-43d4-850a-68e15766bd31" (UID: "9045104f-d244-43d4-850a-68e15766bd31"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:48:08 crc kubenswrapper[4781]: I1202 09:48:08.118808 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9045104f-d244-43d4-850a-68e15766bd31-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 09:48:08 crc kubenswrapper[4781]: I1202 09:48:08.118842 4781 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9045104f-d244-43d4-850a-68e15766bd31-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 09:48:08 crc kubenswrapper[4781]: I1202 09:48:08.118853 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9045104f-d244-43d4-850a-68e15766bd31-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:48:08 crc kubenswrapper[4781]: I1202 09:48:08.118863 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9045104f-d244-43d4-850a-68e15766bd31-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 09:48:08 crc kubenswrapper[4781]: I1202 09:48:08.475459 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2f54cfd5-d3f9-407a-bb1a-c19784895a9a","Type":"ContainerStarted","Data":"b8f3085597c2483289a8675eb8ebeeac6e524bc8509ceed2eac1394fe812ad47"} Dec 02 09:48:08 crc kubenswrapper[4781]: I1202 09:48:08.479291 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5df87147-fc67-47b3-9551-1143bf142dc3","Type":"ContainerStarted","Data":"4d1d623069f33279a56019a48d59b305cfb51bf3e4cff4dffe15edef039c45b0"} Dec 02 09:48:08 crc kubenswrapper[4781]: I1202 09:48:08.489863 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-pxpg7" event={"ID":"9045104f-d244-43d4-850a-68e15766bd31","Type":"ContainerDied","Data":"b8e52ceac0961ddb152a960f8e02e797eb35a80fef11d57d6dafa4b4a933be16"} Dec 02 09:48:08 crc kubenswrapper[4781]: I1202 09:48:08.489954 4781 scope.go:117] "RemoveContainer" containerID="5006a11108b361c8fefabb1ec4b44136301695beac4953b4a8b8747be8007e00" Dec 02 09:48:08 crc kubenswrapper[4781]: I1202 09:48:08.489962 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-pxpg7" Dec 02 09:48:08 crc kubenswrapper[4781]: I1202 09:48:08.517703 4781 scope.go:117] "RemoveContainer" containerID="0d95ae09fa7e13e098dea5878b660c0f6cb0dc9aa2d9615fd43a6be972073afc" Dec 02 09:48:08 crc kubenswrapper[4781]: I1202 09:48:08.543989 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-pxpg7"] Dec 02 09:48:08 crc kubenswrapper[4781]: I1202 09:48:08.553324 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-pxpg7"] Dec 02 09:48:09 crc kubenswrapper[4781]: I1202 09:48:09.512216 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9045104f-d244-43d4-850a-68e15766bd31" path="/var/lib/kubelet/pods/9045104f-d244-43d4-850a-68e15766bd31/volumes" Dec 02 09:48:11 crc kubenswrapper[4781]: I1202 09:48:11.522966 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2f54cfd5-d3f9-407a-bb1a-c19784895a9a","Type":"ContainerStarted","Data":"7c612b6e98f2423ae615988e56d25addfb9f6c0f43289b41e4a56b87917d8f15"} Dec 02 09:48:11 crc kubenswrapper[4781]: I1202 09:48:11.524737 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2f54cfd5-d3f9-407a-bb1a-c19784895a9a","Type":"ContainerStarted","Data":"95b95d389863f180fdbd399f2de49a399ec57c79586da37da6dc772f4285b1ca"} Dec 02 09:48:11 crc kubenswrapper[4781]: I1202 09:48:11.525228 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5df87147-fc67-47b3-9551-1143bf142dc3","Type":"ContainerStarted","Data":"82b1c66d818f90784da20ce7484de63fc2f1ea5dadda4ad352e844d2737bd608"} Dec 02 09:48:11 crc kubenswrapper[4781]: I1202 09:48:11.525390 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5df87147-fc67-47b3-9551-1143bf142dc3" containerName="ceilometer-central-agent" containerID="cri-o://75bb95875f158308acd19829d2254a511e24dd067d4195488cff12efd993368f" gracePeriod=30 Dec 02 09:48:11 crc kubenswrapper[4781]: I1202 09:48:11.525586 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 09:48:11 crc kubenswrapper[4781]: I1202 09:48:11.525650 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5df87147-fc67-47b3-9551-1143bf142dc3" containerName="proxy-httpd" containerID="cri-o://82b1c66d818f90784da20ce7484de63fc2f1ea5dadda4ad352e844d2737bd608" gracePeriod=30 Dec 02 09:48:11 crc kubenswrapper[4781]: I1202 09:48:11.525695 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5df87147-fc67-47b3-9551-1143bf142dc3" containerName="sg-core" containerID="cri-o://4d1d623069f33279a56019a48d59b305cfb51bf3e4cff4dffe15edef039c45b0" gracePeriod=30 Dec 02 09:48:11 crc kubenswrapper[4781]: I1202 09:48:11.525732 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5df87147-fc67-47b3-9551-1143bf142dc3" containerName="ceilometer-notification-agent" containerID="cri-o://fa52bfea8831183e7bdd882347dc7f8d3ebb6067afe53bd358a1aaa902b59b2d" gracePeriod=30 Dec 02 09:48:11 crc kubenswrapper[4781]: I1202 09:48:11.546128 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=5.54611012 podStartE2EDuration="5.54611012s" podCreationTimestamp="2025-12-02 09:48:06 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:48:11.54162005 +0000 UTC m=+1654.365493939" watchObservedRunningTime="2025-12-02 09:48:11.54611012 +0000 UTC m=+1654.369983999" Dec 02 09:48:11 crc kubenswrapper[4781]: I1202 09:48:11.580373 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.7289037710000001 podStartE2EDuration="9.580350074s" podCreationTimestamp="2025-12-02 09:48:02 +0000 UTC" firstStartedPulling="2025-12-02 09:48:03.221903653 +0000 UTC m=+1646.045777532" lastFinishedPulling="2025-12-02 09:48:11.073349956 +0000 UTC m=+1653.897223835" observedRunningTime="2025-12-02 09:48:11.569857331 +0000 UTC m=+1654.393731210" watchObservedRunningTime="2025-12-02 09:48:11.580350074 +0000 UTC m=+1654.404223953" Dec 02 09:48:12 crc kubenswrapper[4781]: I1202 09:48:12.538856 4781 generic.go:334] "Generic (PLEG): container finished" podID="7e7e36fe-b5d1-4dfa-827a-0bea1b098c07" containerID="38c79d0e42445913d308d278577a2e36da560fc134a0ce61a7bacd91900b4887" exitCode=0 Dec 02 09:48:12 crc kubenswrapper[4781]: I1202 09:48:12.538986 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mhsth" event={"ID":"7e7e36fe-b5d1-4dfa-827a-0bea1b098c07","Type":"ContainerDied","Data":"38c79d0e42445913d308d278577a2e36da560fc134a0ce61a7bacd91900b4887"} Dec 02 09:48:12 crc kubenswrapper[4781]: I1202 09:48:12.543606 4781 generic.go:334] "Generic (PLEG): container finished" podID="5df87147-fc67-47b3-9551-1143bf142dc3" containerID="82b1c66d818f90784da20ce7484de63fc2f1ea5dadda4ad352e844d2737bd608" exitCode=0 Dec 02 09:48:12 crc kubenswrapper[4781]: I1202 09:48:12.543637 4781 generic.go:334] "Generic (PLEG): container finished" podID="5df87147-fc67-47b3-9551-1143bf142dc3" containerID="4d1d623069f33279a56019a48d59b305cfb51bf3e4cff4dffe15edef039c45b0" exitCode=2 Dec 02 09:48:12 crc kubenswrapper[4781]: I1202 09:48:12.543648 4781 generic.go:334] "Generic (PLEG): container finished" podID="5df87147-fc67-47b3-9551-1143bf142dc3" containerID="fa52bfea8831183e7bdd882347dc7f8d3ebb6067afe53bd358a1aaa902b59b2d" exitCode=0 Dec 02 09:48:12 crc kubenswrapper[4781]: I1202 09:48:12.544172 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5df87147-fc67-47b3-9551-1143bf142dc3","Type":"ContainerDied","Data":"82b1c66d818f90784da20ce7484de63fc2f1ea5dadda4ad352e844d2737bd608"} Dec 02 09:48:12 crc kubenswrapper[4781]: I1202 09:48:12.544224 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5df87147-fc67-47b3-9551-1143bf142dc3","Type":"ContainerDied","Data":"4d1d623069f33279a56019a48d59b305cfb51bf3e4cff4dffe15edef039c45b0"} Dec 02 09:48:12 crc kubenswrapper[4781]: I1202 09:48:12.544235 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5df87147-fc67-47b3-9551-1143bf142dc3","Type":"ContainerDied","Data":"fa52bfea8831183e7bdd882347dc7f8d3ebb6067afe53bd358a1aaa902b59b2d"} Dec 02 09:48:13 crc kubenswrapper[4781]: I1202 09:48:13.554456 4781 generic.go:334] "Generic (PLEG): container finished" podID="5df87147-fc67-47b3-9551-1143bf142dc3" containerID="75bb95875f158308acd19829d2254a511e24dd067d4195488cff12efd993368f" exitCode=0 Dec 02 09:48:13 crc kubenswrapper[4781]: I1202 09:48:13.554521 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"5df87147-fc67-47b3-9551-1143bf142dc3","Type":"ContainerDied","Data":"75bb95875f158308acd19829d2254a511e24dd067d4195488cff12efd993368f"} Dec 02 09:48:13 crc kubenswrapper[4781]: I1202 09:48:13.646953 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 02 09:48:13 crc kubenswrapper[4781]: I1202 09:48:13.662041 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 02 09:48:13 crc kubenswrapper[4781]: I1202 09:48:13.662337 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 02 09:48:13 crc kubenswrapper[4781]: I1202 09:48:13.998767 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.116206 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mhsth" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.138180 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5df87147-fc67-47b3-9551-1143bf142dc3-config-data\") pod \"5df87147-fc67-47b3-9551-1143bf142dc3\" (UID: \"5df87147-fc67-47b3-9551-1143bf142dc3\") " Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.138311 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5df87147-fc67-47b3-9551-1143bf142dc3-run-httpd\") pod \"5df87147-fc67-47b3-9551-1143bf142dc3\" (UID: \"5df87147-fc67-47b3-9551-1143bf142dc3\") " Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.138393 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5df87147-fc67-47b3-9551-1143bf142dc3-ceilometer-tls-certs\") pod \"5df87147-fc67-47b3-9551-1143bf142dc3\" (UID: \"5df87147-fc67-47b3-9551-1143bf142dc3\") " Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.139575 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5df87147-fc67-47b3-9551-1143bf142dc3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5df87147-fc67-47b3-9551-1143bf142dc3" (UID: "5df87147-fc67-47b3-9551-1143bf142dc3"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.140072 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5df87147-fc67-47b3-9551-1143bf142dc3-sg-core-conf-yaml\") pod \"5df87147-fc67-47b3-9551-1143bf142dc3\" (UID: \"5df87147-fc67-47b3-9551-1143bf142dc3\") " Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.140462 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5df87147-fc67-47b3-9551-1143bf142dc3-combined-ca-bundle\") pod \"5df87147-fc67-47b3-9551-1143bf142dc3\" (UID: \"5df87147-fc67-47b3-9551-1143bf142dc3\") " Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.140531 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r67dc\" (UniqueName: \"kubernetes.io/projected/5df87147-fc67-47b3-9551-1143bf142dc3-kube-api-access-r67dc\") pod \"5df87147-fc67-47b3-9551-1143bf142dc3\" (UID: \"5df87147-fc67-47b3-9551-1143bf142dc3\") " Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.140652 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5df87147-fc67-47b3-9551-1143bf142dc3-scripts\") pod \"5df87147-fc67-47b3-9551-1143bf142dc3\" (UID: \"5df87147-fc67-47b3-9551-1143bf142dc3\") " Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.140724 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5df87147-fc67-47b3-9551-1143bf142dc3-log-httpd\") pod \"5df87147-fc67-47b3-9551-1143bf142dc3\" (UID: \"5df87147-fc67-47b3-9551-1143bf142dc3\") " Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.141472 4781 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5df87147-fc67-47b3-9551-1143bf142dc3-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.141784 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5df87147-fc67-47b3-9551-1143bf142dc3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5df87147-fc67-47b3-9551-1143bf142dc3" (UID: "5df87147-fc67-47b3-9551-1143bf142dc3"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.146289 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5df87147-fc67-47b3-9551-1143bf142dc3-kube-api-access-r67dc" (OuterVolumeSpecName: "kube-api-access-r67dc") pod "5df87147-fc67-47b3-9551-1143bf142dc3" (UID: "5df87147-fc67-47b3-9551-1143bf142dc3"). InnerVolumeSpecName "kube-api-access-r67dc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.152468 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5df87147-fc67-47b3-9551-1143bf142dc3-scripts" (OuterVolumeSpecName: "scripts") pod "5df87147-fc67-47b3-9551-1143bf142dc3" (UID: "5df87147-fc67-47b3-9551-1143bf142dc3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.176285 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5df87147-fc67-47b3-9551-1143bf142dc3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5df87147-fc67-47b3-9551-1143bf142dc3" (UID: "5df87147-fc67-47b3-9551-1143bf142dc3"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.190966 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5df87147-fc67-47b3-9551-1143bf142dc3-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "5df87147-fc67-47b3-9551-1143bf142dc3" (UID: "5df87147-fc67-47b3-9551-1143bf142dc3"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.224999 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5df87147-fc67-47b3-9551-1143bf142dc3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5df87147-fc67-47b3-9551-1143bf142dc3" (UID: "5df87147-fc67-47b3-9551-1143bf142dc3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.242331 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e7e36fe-b5d1-4dfa-827a-0bea1b098c07-config-data\") pod \"7e7e36fe-b5d1-4dfa-827a-0bea1b098c07\" (UID: \"7e7e36fe-b5d1-4dfa-827a-0bea1b098c07\") " Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.242407 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lx4p\" (UniqueName: \"kubernetes.io/projected/7e7e36fe-b5d1-4dfa-827a-0bea1b098c07-kube-api-access-2lx4p\") pod \"7e7e36fe-b5d1-4dfa-827a-0bea1b098c07\" (UID: \"7e7e36fe-b5d1-4dfa-827a-0bea1b098c07\") " Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.242560 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e7e36fe-b5d1-4dfa-827a-0bea1b098c07-scripts\") pod \"7e7e36fe-b5d1-4dfa-827a-0bea1b098c07\" (UID: \"7e7e36fe-b5d1-4dfa-827a-0bea1b098c07\") " Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.242601 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e7e36fe-b5d1-4dfa-827a-0bea1b098c07-combined-ca-bundle\") pod \"7e7e36fe-b5d1-4dfa-827a-0bea1b098c07\" (UID: \"7e7e36fe-b5d1-4dfa-827a-0bea1b098c07\") " Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.243136 4781 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5df87147-fc67-47b3-9551-1143bf142dc3-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.243154 4781 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5df87147-fc67-47b3-9551-1143bf142dc3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.243163 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5df87147-fc67-47b3-9551-1143bf142dc3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.243172 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r67dc\" (UniqueName: \"kubernetes.io/projected/5df87147-fc67-47b3-9551-1143bf142dc3-kube-api-access-r67dc\") on node \"crc\" DevicePath \"\"" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.243203 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5df87147-fc67-47b3-9551-1143bf142dc3-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.243211 4781 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5df87147-fc67-47b3-9551-1143bf142dc3-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.245518 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e7e36fe-b5d1-4dfa-827a-0bea1b098c07-kube-api-access-2lx4p" (OuterVolumeSpecName: "kube-api-access-2lx4p") pod "7e7e36fe-b5d1-4dfa-827a-0bea1b098c07" (UID: "7e7e36fe-b5d1-4dfa-827a-0bea1b098c07"). InnerVolumeSpecName "kube-api-access-2lx4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.246147 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e7e36fe-b5d1-4dfa-827a-0bea1b098c07-scripts" (OuterVolumeSpecName: "scripts") pod "7e7e36fe-b5d1-4dfa-827a-0bea1b098c07" (UID: "7e7e36fe-b5d1-4dfa-827a-0bea1b098c07"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.257172 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5df87147-fc67-47b3-9551-1143bf142dc3-config-data" (OuterVolumeSpecName: "config-data") pod "5df87147-fc67-47b3-9551-1143bf142dc3" (UID: "5df87147-fc67-47b3-9551-1143bf142dc3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.270368 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e7e36fe-b5d1-4dfa-827a-0bea1b098c07-config-data" (OuterVolumeSpecName: "config-data") pod "7e7e36fe-b5d1-4dfa-827a-0bea1b098c07" (UID: "7e7e36fe-b5d1-4dfa-827a-0bea1b098c07"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.276118 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e7e36fe-b5d1-4dfa-827a-0bea1b098c07-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e7e36fe-b5d1-4dfa-827a-0bea1b098c07" (UID: "7e7e36fe-b5d1-4dfa-827a-0bea1b098c07"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.345019 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5df87147-fc67-47b3-9551-1143bf142dc3-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.345075 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e7e36fe-b5d1-4dfa-827a-0bea1b098c07-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.345089 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e7e36fe-b5d1-4dfa-827a-0bea1b098c07-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.345100 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e7e36fe-b5d1-4dfa-827a-0bea1b098c07-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.345112 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lx4p\" (UniqueName: \"kubernetes.io/projected/7e7e36fe-b5d1-4dfa-827a-0bea1b098c07-kube-api-access-2lx4p\") on node \"crc\" DevicePath \"\"" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.500345 4781 scope.go:117] "RemoveContainer" containerID="311057b514d104f4f4ba1bad8b86e944a9708a86065bf40bd45c3f422ad024ed" Dec 02 09:48:14 crc kubenswrapper[4781]: E1202 09:48:14.500743 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.567039 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5df87147-fc67-47b3-9551-1143bf142dc3","Type":"ContainerDied","Data":"f1ac892e94b805ce3a283e658fed0e5f1b1e24857e37208611ee280cab367580"} Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.567075 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.567096 4781 scope.go:117] "RemoveContainer" containerID="82b1c66d818f90784da20ce7484de63fc2f1ea5dadda4ad352e844d2737bd608" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.570049 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mhsth" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.575526 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mhsth" event={"ID":"7e7e36fe-b5d1-4dfa-827a-0bea1b098c07","Type":"ContainerDied","Data":"4dba8a03c1fbddd638761ad863d75ce7134e7b96e523abd7966582545ecc20e8"} Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.575586 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4dba8a03c1fbddd638761ad863d75ce7134e7b96e523abd7966582545ecc20e8" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.586026 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.668763 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.677190 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.689436 4781 scope.go:117] "RemoveContainer" containerID="4d1d623069f33279a56019a48d59b305cfb51bf3e4cff4dffe15edef039c45b0" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.709819 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 09:48:14 crc kubenswrapper[4781]: E1202 09:48:14.710250 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5df87147-fc67-47b3-9551-1143bf142dc3" containerName="proxy-httpd" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.710275 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="5df87147-fc67-47b3-9551-1143bf142dc3" containerName="proxy-httpd" Dec 02 09:48:14 crc kubenswrapper[4781]: E1202 09:48:14.710293 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e7e36fe-b5d1-4dfa-827a-0bea1b098c07" containerName="nova-manage" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.710303 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e7e36fe-b5d1-4dfa-827a-0bea1b098c07" containerName="nova-manage" Dec 02 09:48:14 crc kubenswrapper[4781]: E1202 09:48:14.710315 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5df87147-fc67-47b3-9551-1143bf142dc3" containerName="ceilometer-notification-agent" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.710323 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="5df87147-fc67-47b3-9551-1143bf142dc3" containerName="ceilometer-notification-agent" Dec 02 09:48:14 crc kubenswrapper[4781]: E1202 09:48:14.710350 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9045104f-d244-43d4-850a-68e15766bd31" containerName="dnsmasq-dns" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.710356 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="9045104f-d244-43d4-850a-68e15766bd31" containerName="dnsmasq-dns" Dec 02 09:48:14 crc kubenswrapper[4781]: E1202 09:48:14.710366 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5df87147-fc67-47b3-9551-1143bf142dc3" containerName="sg-core" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.710371 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="5df87147-fc67-47b3-9551-1143bf142dc3" containerName="sg-core" Dec 02 09:48:14 crc kubenswrapper[4781]: E1202 09:48:14.710386 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5df87147-fc67-47b3-9551-1143bf142dc3" 
containerName="ceilometer-central-agent" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.710392 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="5df87147-fc67-47b3-9551-1143bf142dc3" containerName="ceilometer-central-agent" Dec 02 09:48:14 crc kubenswrapper[4781]: E1202 09:48:14.710400 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9045104f-d244-43d4-850a-68e15766bd31" containerName="init" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.710405 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="9045104f-d244-43d4-850a-68e15766bd31" containerName="init" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.710595 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="5df87147-fc67-47b3-9551-1143bf142dc3" containerName="sg-core" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.710611 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="9045104f-d244-43d4-850a-68e15766bd31" containerName="dnsmasq-dns" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.710617 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e7e36fe-b5d1-4dfa-827a-0bea1b098c07" containerName="nova-manage" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.710628 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="5df87147-fc67-47b3-9551-1143bf142dc3" containerName="ceilometer-central-agent" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.710637 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="5df87147-fc67-47b3-9551-1143bf142dc3" containerName="proxy-httpd" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.710648 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="5df87147-fc67-47b3-9551-1143bf142dc3" containerName="ceilometer-notification-agent" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.712369 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.716507 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.716592 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.716817 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.736624 4781 scope.go:117] "RemoveContainer" containerID="fa52bfea8831183e7bdd882347dc7f8d3ebb6067afe53bd358a1aaa902b59b2d" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.741111 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.765745 4781 scope.go:117] "RemoveContainer" containerID="75bb95875f158308acd19829d2254a511e24dd067d4195488cff12efd993368f" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.849848 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.850611 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2f54cfd5-d3f9-407a-bb1a-c19784895a9a" containerName="nova-api-log" containerID="cri-o://95b95d389863f180fdbd399f2de49a399ec57c79586da37da6dc772f4285b1ca" gracePeriod=30 Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.850742 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2f54cfd5-d3f9-407a-bb1a-c19784895a9a" containerName="nova-api-api" containerID="cri-o://7c612b6e98f2423ae615988e56d25addfb9f6c0f43289b41e4a56b87917d8f15" gracePeriod=30 Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.853279 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bbf6f862-1ee5-4bc3-83e5-71c1f72d526c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bbf6f862-1ee5-4bc3-83e5-71c1f72d526c\") " pod="openstack/ceilometer-0" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.853353 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbf6f862-1ee5-4bc3-83e5-71c1f72d526c-log-httpd\") pod \"ceilometer-0\" (UID: \"bbf6f862-1ee5-4bc3-83e5-71c1f72d526c\") " pod="openstack/ceilometer-0" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.853404 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbf6f862-1ee5-4bc3-83e5-71c1f72d526c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bbf6f862-1ee5-4bc3-83e5-71c1f72d526c\") " pod="openstack/ceilometer-0" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.853469 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbf6f862-1ee5-4bc3-83e5-71c1f72d526c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bbf6f862-1ee5-4bc3-83e5-71c1f72d526c\") " pod="openstack/ceilometer-0" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.853538 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbf6f862-1ee5-4bc3-83e5-71c1f72d526c-config-data\") pod \"ceilometer-0\" (UID: \"bbf6f862-1ee5-4bc3-83e5-71c1f72d526c\") " pod="openstack/ceilometer-0" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.853561 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbf6f862-1ee5-4bc3-83e5-71c1f72d526c-scripts\") pod \"ceilometer-0\" (UID: \"bbf6f862-1ee5-4bc3-83e5-71c1f72d526c\") " pod="openstack/ceilometer-0" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.853641 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbf6f862-1ee5-4bc3-83e5-71c1f72d526c-run-httpd\") pod \"ceilometer-0\" (UID: \"bbf6f862-1ee5-4bc3-83e5-71c1f72d526c\") " pod="openstack/ceilometer-0" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.853700 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88jbr\" (UniqueName: \"kubernetes.io/projected/bbf6f862-1ee5-4bc3-83e5-71c1f72d526c-kube-api-access-88jbr\") pod \"ceilometer-0\" (UID: \"bbf6f862-1ee5-4bc3-83e5-71c1f72d526c\") " pod="openstack/ceilometer-0" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.873988 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.874397 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="b6f5e1b7-0c56-447b-b241-56cc0fc9e7af" containerName="nova-scheduler-scheduler" containerID="cri-o://9c5e32171b18f60b0a0543666ce99fdfae3aad608a03d90c9b2bf411e50e7169" gracePeriod=30 Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.922976 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.954902 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bbf6f862-1ee5-4bc3-83e5-71c1f72d526c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bbf6f862-1ee5-4bc3-83e5-71c1f72d526c\") " pod="openstack/ceilometer-0" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.954974 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbf6f862-1ee5-4bc3-83e5-71c1f72d526c-log-httpd\") pod \"ceilometer-0\" (UID: \"bbf6f862-1ee5-4bc3-83e5-71c1f72d526c\") " pod="openstack/ceilometer-0" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.955006 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbf6f862-1ee5-4bc3-83e5-71c1f72d526c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bbf6f862-1ee5-4bc3-83e5-71c1f72d526c\") " pod="openstack/ceilometer-0" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.955048 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbf6f862-1ee5-4bc3-83e5-71c1f72d526c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bbf6f862-1ee5-4bc3-83e5-71c1f72d526c\") " pod="openstack/ceilometer-0" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.955082 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbf6f862-1ee5-4bc3-83e5-71c1f72d526c-scripts\") pod \"ceilometer-0\" (UID: \"bbf6f862-1ee5-4bc3-83e5-71c1f72d526c\") " pod="openstack/ceilometer-0" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.955096 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbf6f862-1ee5-4bc3-83e5-71c1f72d526c-config-data\") pod \"ceilometer-0\" (UID: \"bbf6f862-1ee5-4bc3-83e5-71c1f72d526c\") " pod="openstack/ceilometer-0" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.955165 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbf6f862-1ee5-4bc3-83e5-71c1f72d526c-run-httpd\") pod \"ceilometer-0\" (UID: \"bbf6f862-1ee5-4bc3-83e5-71c1f72d526c\") " pod="openstack/ceilometer-0" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.955198 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88jbr\" (UniqueName: \"kubernetes.io/projected/bbf6f862-1ee5-4bc3-83e5-71c1f72d526c-kube-api-access-88jbr\") pod \"ceilometer-0\" (UID: \"bbf6f862-1ee5-4bc3-83e5-71c1f72d526c\") " pod="openstack/ceilometer-0" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.955589 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbf6f862-1ee5-4bc3-83e5-71c1f72d526c-log-httpd\") pod \"ceilometer-0\" (UID: \"bbf6f862-1ee5-4bc3-83e5-71c1f72d526c\") " pod="openstack/ceilometer-0" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.956053 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbf6f862-1ee5-4bc3-83e5-71c1f72d526c-run-httpd\") pod \"ceilometer-0\" (UID: \"bbf6f862-1ee5-4bc3-83e5-71c1f72d526c\") " pod="openstack/ceilometer-0" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.961596 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbf6f862-1ee5-4bc3-83e5-71c1f72d526c-config-data\") pod \"ceilometer-0\" (UID: \"bbf6f862-1ee5-4bc3-83e5-71c1f72d526c\") " pod="openstack/ceilometer-0" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.961993 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bbf6f862-1ee5-4bc3-83e5-71c1f72d526c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bbf6f862-1ee5-4bc3-83e5-71c1f72d526c\") " pod="openstack/ceilometer-0" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.963624 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbf6f862-1ee5-4bc3-83e5-71c1f72d526c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bbf6f862-1ee5-4bc3-83e5-71c1f72d526c\") " pod="openstack/ceilometer-0" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.968595 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbf6f862-1ee5-4bc3-83e5-71c1f72d526c-scripts\") pod \"ceilometer-0\" (UID: \"bbf6f862-1ee5-4bc3-83e5-71c1f72d526c\") " pod="openstack/ceilometer-0" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.974567 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bbf6f862-1ee5-4bc3-83e5-71c1f72d526c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bbf6f862-1ee5-4bc3-83e5-71c1f72d526c\") " pod="openstack/ceilometer-0" Dec 02 09:48:14 crc kubenswrapper[4781]: I1202 09:48:14.979988 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88jbr\" (UniqueName: \"kubernetes.io/projected/bbf6f862-1ee5-4bc3-83e5-71c1f72d526c-kube-api-access-88jbr\") pod \"ceilometer-0\" (UID: \"bbf6f862-1ee5-4bc3-83e5-71c1f72d526c\") " pod="openstack/ceilometer-0" Dec 02 09:48:15 crc kubenswrapper[4781]: I1202 09:48:15.036305 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 09:48:15 crc kubenswrapper[4781]: I1202 09:48:15.492297 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 09:48:15 crc kubenswrapper[4781]: I1202 09:48:15.513243 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5df87147-fc67-47b3-9551-1143bf142dc3" path="/var/lib/kubelet/pods/5df87147-fc67-47b3-9551-1143bf142dc3/volumes" Dec 02 09:48:15 crc kubenswrapper[4781]: I1202 09:48:15.572304 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f54cfd5-d3f9-407a-bb1a-c19784895a9a-public-tls-certs\") pod \"2f54cfd5-d3f9-407a-bb1a-c19784895a9a\" (UID: \"2f54cfd5-d3f9-407a-bb1a-c19784895a9a\") " Dec 02 09:48:15 crc kubenswrapper[4781]: I1202 09:48:15.572369 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f54cfd5-d3f9-407a-bb1a-c19784895a9a-combined-ca-bundle\") pod \"2f54cfd5-d3f9-407a-bb1a-c19784895a9a\" (UID: \"2f54cfd5-d3f9-407a-bb1a-c19784895a9a\") " Dec 02 09:48:15 crc kubenswrapper[4781]: I1202 09:48:15.572441 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f54cfd5-d3f9-407a-bb1a-c19784895a9a-config-data\") pod \"2f54cfd5-d3f9-407a-bb1a-c19784895a9a\" (UID: \"2f54cfd5-d3f9-407a-bb1a-c19784895a9a\") " Dec 02 09:48:15 crc kubenswrapper[4781]: I1202 09:48:15.572609 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f54cfd5-d3f9-407a-bb1a-c19784895a9a-internal-tls-certs\") pod \"2f54cfd5-d3f9-407a-bb1a-c19784895a9a\" (UID: \"2f54cfd5-d3f9-407a-bb1a-c19784895a9a\") " Dec 02 09:48:15 crc kubenswrapper[4781]: I1202 09:48:15.572655 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f54cfd5-d3f9-407a-bb1a-c19784895a9a-logs\") pod \"2f54cfd5-d3f9-407a-bb1a-c19784895a9a\" (UID: \"2f54cfd5-d3f9-407a-bb1a-c19784895a9a\") " Dec 02 09:48:15 crc kubenswrapper[4781]: I1202 09:48:15.572680 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjmhd\" (UniqueName: \"kubernetes.io/projected/2f54cfd5-d3f9-407a-bb1a-c19784895a9a-kube-api-access-tjmhd\") pod \"2f54cfd5-d3f9-407a-bb1a-c19784895a9a\" (UID: \"2f54cfd5-d3f9-407a-bb1a-c19784895a9a\") " Dec 02 09:48:15 crc kubenswrapper[4781]: I1202 09:48:15.574029 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f54cfd5-d3f9-407a-bb1a-c19784895a9a-logs" (OuterVolumeSpecName: "logs") pod "2f54cfd5-d3f9-407a-bb1a-c19784895a9a" (UID: 
"2f54cfd5-d3f9-407a-bb1a-c19784895a9a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:48:15 crc kubenswrapper[4781]: I1202 09:48:15.578027 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f54cfd5-d3f9-407a-bb1a-c19784895a9a-kube-api-access-tjmhd" (OuterVolumeSpecName: "kube-api-access-tjmhd") pod "2f54cfd5-d3f9-407a-bb1a-c19784895a9a" (UID: "2f54cfd5-d3f9-407a-bb1a-c19784895a9a"). InnerVolumeSpecName "kube-api-access-tjmhd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:48:15 crc kubenswrapper[4781]: I1202 09:48:15.594822 4781 generic.go:334] "Generic (PLEG): container finished" podID="2f54cfd5-d3f9-407a-bb1a-c19784895a9a" containerID="7c612b6e98f2423ae615988e56d25addfb9f6c0f43289b41e4a56b87917d8f15" exitCode=0 Dec 02 09:48:15 crc kubenswrapper[4781]: I1202 09:48:15.594868 4781 generic.go:334] "Generic (PLEG): container finished" podID="2f54cfd5-d3f9-407a-bb1a-c19784895a9a" containerID="95b95d389863f180fdbd399f2de49a399ec57c79586da37da6dc772f4285b1ca" exitCode=143 Dec 02 09:48:15 crc kubenswrapper[4781]: I1202 09:48:15.595138 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 09:48:15 crc kubenswrapper[4781]: I1202 09:48:15.595203 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2f54cfd5-d3f9-407a-bb1a-c19784895a9a","Type":"ContainerDied","Data":"7c612b6e98f2423ae615988e56d25addfb9f6c0f43289b41e4a56b87917d8f15"} Dec 02 09:48:15 crc kubenswrapper[4781]: I1202 09:48:15.595278 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2f54cfd5-d3f9-407a-bb1a-c19784895a9a","Type":"ContainerDied","Data":"95b95d389863f180fdbd399f2de49a399ec57c79586da37da6dc772f4285b1ca"} Dec 02 09:48:15 crc kubenswrapper[4781]: I1202 09:48:15.595292 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2f54cfd5-d3f9-407a-bb1a-c19784895a9a","Type":"ContainerDied","Data":"b8f3085597c2483289a8675eb8ebeeac6e524bc8509ceed2eac1394fe812ad47"} Dec 02 09:48:15 crc kubenswrapper[4781]: I1202 09:48:15.595315 4781 scope.go:117] "RemoveContainer" containerID="7c612b6e98f2423ae615988e56d25addfb9f6c0f43289b41e4a56b87917d8f15" Dec 02 09:48:15 crc kubenswrapper[4781]: I1202 09:48:15.607607 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f54cfd5-d3f9-407a-bb1a-c19784895a9a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f54cfd5-d3f9-407a-bb1a-c19784895a9a" (UID: "2f54cfd5-d3f9-407a-bb1a-c19784895a9a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:48:15 crc kubenswrapper[4781]: I1202 09:48:15.613015 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f54cfd5-d3f9-407a-bb1a-c19784895a9a-config-data" (OuterVolumeSpecName: "config-data") pod "2f54cfd5-d3f9-407a-bb1a-c19784895a9a" (UID: "2f54cfd5-d3f9-407a-bb1a-c19784895a9a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:48:15 crc kubenswrapper[4781]: I1202 09:48:15.638096 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f54cfd5-d3f9-407a-bb1a-c19784895a9a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2f54cfd5-d3f9-407a-bb1a-c19784895a9a" (UID: "2f54cfd5-d3f9-407a-bb1a-c19784895a9a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:48:15 crc kubenswrapper[4781]: I1202 09:48:15.666648 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 09:48:15 crc kubenswrapper[4781]: I1202 09:48:15.666813 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f54cfd5-d3f9-407a-bb1a-c19784895a9a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2f54cfd5-d3f9-407a-bb1a-c19784895a9a" (UID: "2f54cfd5-d3f9-407a-bb1a-c19784895a9a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:48:15 crc kubenswrapper[4781]: I1202 09:48:15.674479 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f54cfd5-d3f9-407a-bb1a-c19784895a9a-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:48:15 crc kubenswrapper[4781]: I1202 09:48:15.674510 4781 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f54cfd5-d3f9-407a-bb1a-c19784895a9a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 09:48:15 crc kubenswrapper[4781]: I1202 09:48:15.674523 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f54cfd5-d3f9-407a-bb1a-c19784895a9a-logs\") on node \"crc\" DevicePath \"\"" Dec 02 09:48:15 crc kubenswrapper[4781]: I1202 09:48:15.674532 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjmhd\" (UniqueName: \"kubernetes.io/projected/2f54cfd5-d3f9-407a-bb1a-c19784895a9a-kube-api-access-tjmhd\") on node \"crc\" DevicePath \"\"" Dec 02 09:48:15 crc kubenswrapper[4781]: I1202 09:48:15.674542 4781 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f54cfd5-d3f9-407a-bb1a-c19784895a9a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 09:48:15 crc kubenswrapper[4781]: I1202 09:48:15.674554 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f54cfd5-d3f9-407a-bb1a-c19784895a9a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:48:15 crc kubenswrapper[4781]: I1202 09:48:15.761600 4781 scope.go:117] "RemoveContainer" containerID="95b95d389863f180fdbd399f2de49a399ec57c79586da37da6dc772f4285b1ca" Dec 02 09:48:15 crc kubenswrapper[4781]: I1202 09:48:15.779645 4781 scope.go:117] "RemoveContainer" containerID="7c612b6e98f2423ae615988e56d25addfb9f6c0f43289b41e4a56b87917d8f15" Dec 02 09:48:15 crc kubenswrapper[4781]: E1202 09:48:15.780207 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c612b6e98f2423ae615988e56d25addfb9f6c0f43289b41e4a56b87917d8f15\": container with ID starting with 7c612b6e98f2423ae615988e56d25addfb9f6c0f43289b41e4a56b87917d8f15 not found: ID does not exist" containerID="7c612b6e98f2423ae615988e56d25addfb9f6c0f43289b41e4a56b87917d8f15" Dec 02 09:48:15 crc 
kubenswrapper[4781]: I1202 09:48:15.780272 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c612b6e98f2423ae615988e56d25addfb9f6c0f43289b41e4a56b87917d8f15"} err="failed to get container status \"7c612b6e98f2423ae615988e56d25addfb9f6c0f43289b41e4a56b87917d8f15\": rpc error: code = NotFound desc = could not find container \"7c612b6e98f2423ae615988e56d25addfb9f6c0f43289b41e4a56b87917d8f15\": container with ID starting with 7c612b6e98f2423ae615988e56d25addfb9f6c0f43289b41e4a56b87917d8f15 not found: ID does not exist" Dec 02 09:48:15 crc kubenswrapper[4781]: I1202 09:48:15.780306 4781 scope.go:117] "RemoveContainer" containerID="95b95d389863f180fdbd399f2de49a399ec57c79586da37da6dc772f4285b1ca" Dec 02 09:48:15 crc kubenswrapper[4781]: E1202 09:48:15.780616 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95b95d389863f180fdbd399f2de49a399ec57c79586da37da6dc772f4285b1ca\": container with ID starting with 95b95d389863f180fdbd399f2de49a399ec57c79586da37da6dc772f4285b1ca not found: ID does not exist" containerID="95b95d389863f180fdbd399f2de49a399ec57c79586da37da6dc772f4285b1ca" Dec 02 09:48:15 crc kubenswrapper[4781]: I1202 09:48:15.780653 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95b95d389863f180fdbd399f2de49a399ec57c79586da37da6dc772f4285b1ca"} err="failed to get container status \"95b95d389863f180fdbd399f2de49a399ec57c79586da37da6dc772f4285b1ca\": rpc error: code = NotFound desc = could not find container \"95b95d389863f180fdbd399f2de49a399ec57c79586da37da6dc772f4285b1ca\": container with ID starting with 95b95d389863f180fdbd399f2de49a399ec57c79586da37da6dc772f4285b1ca not found: ID does not exist" Dec 02 09:48:15 crc kubenswrapper[4781]: I1202 09:48:15.780680 4781 scope.go:117] "RemoveContainer" containerID="7c612b6e98f2423ae615988e56d25addfb9f6c0f43289b41e4a56b87917d8f15" Dec 02 09:48:15 crc kubenswrapper[4781]: I1202 09:48:15.780968 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c612b6e98f2423ae615988e56d25addfb9f6c0f43289b41e4a56b87917d8f15"} err="failed to get container status \"7c612b6e98f2423ae615988e56d25addfb9f6c0f43289b41e4a56b87917d8f15\": rpc error: code = NotFound desc = could not find container \"7c612b6e98f2423ae615988e56d25addfb9f6c0f43289b41e4a56b87917d8f15\": container with ID starting with 7c612b6e98f2423ae615988e56d25addfb9f6c0f43289b41e4a56b87917d8f15 not found: ID does not exist" Dec 02 09:48:15 crc kubenswrapper[4781]: I1202 09:48:15.781002 4781 scope.go:117] "RemoveContainer" containerID="95b95d389863f180fdbd399f2de49a399ec57c79586da37da6dc772f4285b1ca" Dec 02 09:48:15 crc kubenswrapper[4781]: I1202 09:48:15.781322 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95b95d389863f180fdbd399f2de49a399ec57c79586da37da6dc772f4285b1ca"} err="failed to get container status \"95b95d389863f180fdbd399f2de49a399ec57c79586da37da6dc772f4285b1ca\": rpc error: code = NotFound desc = could not find container \"95b95d389863f180fdbd399f2de49a399ec57c79586da37da6dc772f4285b1ca\": container with ID starting with 95b95d389863f180fdbd399f2de49a399ec57c79586da37da6dc772f4285b1ca not found: ID does not exist" Dec 02 09:48:15 crc kubenswrapper[4781]: I1202 09:48:15.926293 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 09:48:15 crc kubenswrapper[4781]: I1202 
09:48:15.936112 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 02 09:48:15 crc kubenswrapper[4781]: I1202 09:48:15.970890 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 02 09:48:15 crc kubenswrapper[4781]: E1202 09:48:15.971279 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f54cfd5-d3f9-407a-bb1a-c19784895a9a" containerName="nova-api-api" Dec 02 09:48:15 crc kubenswrapper[4781]: I1202 09:48:15.971295 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f54cfd5-d3f9-407a-bb1a-c19784895a9a" containerName="nova-api-api" Dec 02 09:48:15 crc kubenswrapper[4781]: E1202 09:48:15.971316 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f54cfd5-d3f9-407a-bb1a-c19784895a9a" containerName="nova-api-log" Dec 02 09:48:15 crc kubenswrapper[4781]: I1202 09:48:15.971322 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f54cfd5-d3f9-407a-bb1a-c19784895a9a" containerName="nova-api-log" Dec 02 09:48:15 crc kubenswrapper[4781]: I1202 09:48:15.971489 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f54cfd5-d3f9-407a-bb1a-c19784895a9a" containerName="nova-api-api" Dec 02 09:48:15 crc kubenswrapper[4781]: I1202 09:48:15.971521 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f54cfd5-d3f9-407a-bb1a-c19784895a9a" containerName="nova-api-log" Dec 02 09:48:15 crc kubenswrapper[4781]: I1202 09:48:15.972520 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 09:48:15 crc kubenswrapper[4781]: I1202 09:48:15.975063 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 02 09:48:15 crc kubenswrapper[4781]: I1202 09:48:15.976172 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 02 09:48:15 crc kubenswrapper[4781]: I1202 09:48:15.982066 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 02 09:48:16 crc kubenswrapper[4781]: I1202 09:48:16.000442 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 09:48:16 crc kubenswrapper[4781]: I1202 09:48:16.081465 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgkxw\" (UniqueName: \"kubernetes.io/projected/18fa078a-6d45-40cf-a39e-139d84f86f76-kube-api-access-cgkxw\") pod \"nova-api-0\" (UID: \"18fa078a-6d45-40cf-a39e-139d84f86f76\") " pod="openstack/nova-api-0" Dec 02 09:48:16 crc kubenswrapper[4781]: I1202 09:48:16.081570 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18fa078a-6d45-40cf-a39e-139d84f86f76-config-data\") pod \"nova-api-0\" (UID: \"18fa078a-6d45-40cf-a39e-139d84f86f76\") " pod="openstack/nova-api-0" Dec 02 09:48:16 crc kubenswrapper[4781]: I1202 09:48:16.081804 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/18fa078a-6d45-40cf-a39e-139d84f86f76-public-tls-certs\") pod \"nova-api-0\" (UID: \"18fa078a-6d45-40cf-a39e-139d84f86f76\") " pod="openstack/nova-api-0" Dec 02 09:48:16 crc kubenswrapper[4781]: I1202 09:48:16.081878 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/18fa078a-6d45-40cf-a39e-139d84f86f76-logs\") pod \"nova-api-0\" (UID: \"18fa078a-6d45-40cf-a39e-139d84f86f76\") " pod="openstack/nova-api-0" Dec 02 09:48:16 crc kubenswrapper[4781]: I1202 09:48:16.081913 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18fa078a-6d45-40cf-a39e-139d84f86f76-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"18fa078a-6d45-40cf-a39e-139d84f86f76\") " pod="openstack/nova-api-0" Dec 02 09:48:16 crc kubenswrapper[4781]: I1202 09:48:16.082023 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/18fa078a-6d45-40cf-a39e-139d84f86f76-internal-tls-certs\") pod \"nova-api-0\" (UID: \"18fa078a-6d45-40cf-a39e-139d84f86f76\") " pod="openstack/nova-api-0" Dec 02 09:48:16 crc kubenswrapper[4781]: I1202 09:48:16.184345 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18fa078a-6d45-40cf-a39e-139d84f86f76-config-data\") pod \"nova-api-0\" (UID: \"18fa078a-6d45-40cf-a39e-139d84f86f76\") " pod="openstack/nova-api-0" Dec 02 09:48:16 crc kubenswrapper[4781]: I1202 09:48:16.184463 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/18fa078a-6d45-40cf-a39e-139d84f86f76-public-tls-certs\") pod \"nova-api-0\" (UID: \"18fa078a-6d45-40cf-a39e-139d84f86f76\") " pod="openstack/nova-api-0" Dec 02 09:48:16 crc kubenswrapper[4781]: I1202 09:48:16.184499 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18fa078a-6d45-40cf-a39e-139d84f86f76-logs\") pod \"nova-api-0\" (UID: \"18fa078a-6d45-40cf-a39e-139d84f86f76\") " pod="openstack/nova-api-0" Dec 02 09:48:16 crc kubenswrapper[4781]: I1202 09:48:16.184518 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18fa078a-6d45-40cf-a39e-139d84f86f76-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"18fa078a-6d45-40cf-a39e-139d84f86f76\") " pod="openstack/nova-api-0" Dec 02 09:48:16 crc kubenswrapper[4781]: I1202 09:48:16.184538 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/18fa078a-6d45-40cf-a39e-139d84f86f76-internal-tls-certs\") pod \"nova-api-0\" (UID: \"18fa078a-6d45-40cf-a39e-139d84f86f76\") " pod="openstack/nova-api-0" Dec 02 09:48:16 crc kubenswrapper[4781]: I1202 09:48:16.184657 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgkxw\" (UniqueName: \"kubernetes.io/projected/18fa078a-6d45-40cf-a39e-139d84f86f76-kube-api-access-cgkxw\") pod \"nova-api-0\" (UID: \"18fa078a-6d45-40cf-a39e-139d84f86f76\") " pod="openstack/nova-api-0" Dec 02 09:48:16 crc kubenswrapper[4781]: I1202 09:48:16.185077 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18fa078a-6d45-40cf-a39e-139d84f86f76-logs\") pod \"nova-api-0\" (UID: \"18fa078a-6d45-40cf-a39e-139d84f86f76\") " pod="openstack/nova-api-0" Dec 02 09:48:16 crc kubenswrapper[4781]: I1202 09:48:16.192598 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/18fa078a-6d45-40cf-a39e-139d84f86f76-internal-tls-certs\") pod \"nova-api-0\" (UID: \"18fa078a-6d45-40cf-a39e-139d84f86f76\") " pod="openstack/nova-api-0" Dec 02 09:48:16 crc kubenswrapper[4781]: I1202 09:48:16.209542 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/18fa078a-6d45-40cf-a39e-139d84f86f76-public-tls-certs\") pod \"nova-api-0\" (UID: \"18fa078a-6d45-40cf-a39e-139d84f86f76\") " pod="openstack/nova-api-0" Dec 02 09:48:16 crc kubenswrapper[4781]: I1202 09:48:16.217568 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18fa078a-6d45-40cf-a39e-139d84f86f76-config-data\") pod \"nova-api-0\" (UID: \"18fa078a-6d45-40cf-a39e-139d84f86f76\") " pod="openstack/nova-api-0" Dec 02 09:48:16 crc kubenswrapper[4781]: I1202 09:48:16.219681 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgkxw\" (UniqueName: \"kubernetes.io/projected/18fa078a-6d45-40cf-a39e-139d84f86f76-kube-api-access-cgkxw\") pod \"nova-api-0\" (UID: \"18fa078a-6d45-40cf-a39e-139d84f86f76\") " pod="openstack/nova-api-0" Dec 02 09:48:16 crc kubenswrapper[4781]: I1202 09:48:16.224117 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18fa078a-6d45-40cf-a39e-139d84f86f76-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"18fa078a-6d45-40cf-a39e-139d84f86f76\") " pod="openstack/nova-api-0" Dec 02 09:48:16 crc kubenswrapper[4781]: I1202 09:48:16.292544 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 09:48:16 crc kubenswrapper[4781]: I1202 09:48:16.618384 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bbf6f862-1ee5-4bc3-83e5-71c1f72d526c","Type":"ContainerStarted","Data":"a13861687d483da22acc9a41d80a6ecab90d978b85837fce17e09d8640dbdd5b"} Dec 02 09:48:16 crc kubenswrapper[4781]: I1202 09:48:16.624497 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3a400f59-49d8-4e6d-8a84-24b680b052cf" containerName="nova-metadata-log" containerID="cri-o://bd092bc04703af445a96d0dc6ffd7109012effad2faa8cdf4308be22814ad920" gracePeriod=30 Dec 02 09:48:16 crc kubenswrapper[4781]: I1202 09:48:16.624778 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3a400f59-49d8-4e6d-8a84-24b680b052cf" containerName="nova-metadata-metadata" containerID="cri-o://08a35e39a052bff33bcbe0911370ab32856466d1eb578f0adc35360e7e231e4d" gracePeriod=30 Dec 02 09:48:16 crc kubenswrapper[4781]: I1202 09:48:16.799870 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 09:48:16 crc kubenswrapper[4781]: W1202 09:48:16.807211 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18fa078a_6d45_40cf_a39e_139d84f86f76.slice/crio-b698a042b36ba5bb35da2b5d64afcc39258090719ce03485b29d81da528b03d1 WatchSource:0}: Error finding container b698a042b36ba5bb35da2b5d64afcc39258090719ce03485b29d81da528b03d1: Status 404 returned error can't find the container with id b698a042b36ba5bb35da2b5d64afcc39258090719ce03485b29d81da528b03d1 Dec 02 09:48:17 crc kubenswrapper[4781]: I1202 09:48:17.520396 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="2f54cfd5-d3f9-407a-bb1a-c19784895a9a" path="/var/lib/kubelet/pods/2f54cfd5-d3f9-407a-bb1a-c19784895a9a/volumes" Dec 02 09:48:17 crc kubenswrapper[4781]: I1202 09:48:17.637583 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"18fa078a-6d45-40cf-a39e-139d84f86f76","Type":"ContainerStarted","Data":"f90db60bf5a9389aee0cb1eedda165b4dd62b919badbabf1532ebcb85fd19ecb"} Dec 02 09:48:17 crc kubenswrapper[4781]: I1202 09:48:17.637893 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"18fa078a-6d45-40cf-a39e-139d84f86f76","Type":"ContainerStarted","Data":"1ada40ecc9059b003a32bd8ccf42d09491b6011dbd758d8986991a22ec16b4a4"} Dec 02 09:48:17 crc kubenswrapper[4781]: I1202 09:48:17.637908 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"18fa078a-6d45-40cf-a39e-139d84f86f76","Type":"ContainerStarted","Data":"b698a042b36ba5bb35da2b5d64afcc39258090719ce03485b29d81da528b03d1"} Dec 02 09:48:17 crc kubenswrapper[4781]: I1202 09:48:17.640866 4781 generic.go:334] "Generic (PLEG): container finished" podID="3a400f59-49d8-4e6d-8a84-24b680b052cf" containerID="bd092bc04703af445a96d0dc6ffd7109012effad2faa8cdf4308be22814ad920" exitCode=143 Dec 02 09:48:17 crc kubenswrapper[4781]: I1202 09:48:17.640942 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3a400f59-49d8-4e6d-8a84-24b680b052cf","Type":"ContainerDied","Data":"bd092bc04703af445a96d0dc6ffd7109012effad2faa8cdf4308be22814ad920"} Dec 02 09:48:17 crc kubenswrapper[4781]: I1202 09:48:17.646020 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bbf6f862-1ee5-4bc3-83e5-71c1f72d526c","Type":"ContainerStarted","Data":"aca3000b3434cd0670192094b2c139a364871c20033f842a04c2868967439a61"} Dec 02 09:48:17 crc kubenswrapper[4781]: I1202 09:48:17.646053 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bbf6f862-1ee5-4bc3-83e5-71c1f72d526c","Type":"ContainerStarted","Data":"488cf80b14511e46a199921ef1912f6dc50e613bbfd8cefc3f37fc08cff20ac3"} Dec 02 09:48:18 crc kubenswrapper[4781]: I1202 09:48:18.659688 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bbf6f862-1ee5-4bc3-83e5-71c1f72d526c","Type":"ContainerStarted","Data":"271b02e7ce0a59f9850fead76781bc3cb820bd4c2a087f5758bd70ac6188f976"} Dec 02 09:48:19 crc kubenswrapper[4781]: E1202 09:48:19.298941 4781 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9c5e32171b18f60b0a0543666ce99fdfae3aad608a03d90c9b2bf411e50e7169 is running failed: container process not found" containerID="9c5e32171b18f60b0a0543666ce99fdfae3aad608a03d90c9b2bf411e50e7169" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 02 09:48:19 crc kubenswrapper[4781]: E1202 09:48:19.299686 4781 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9c5e32171b18f60b0a0543666ce99fdfae3aad608a03d90c9b2bf411e50e7169 is running failed: container process not found" containerID="9c5e32171b18f60b0a0543666ce99fdfae3aad608a03d90c9b2bf411e50e7169" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 02 09:48:19 crc kubenswrapper[4781]: E1202 09:48:19.300144 4781 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = 
container is not created or running: checking if PID of 9c5e32171b18f60b0a0543666ce99fdfae3aad608a03d90c9b2bf411e50e7169 is running failed: container process not found" containerID="9c5e32171b18f60b0a0543666ce99fdfae3aad608a03d90c9b2bf411e50e7169" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 02 09:48:19 crc kubenswrapper[4781]: E1202 09:48:19.300191 4781 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9c5e32171b18f60b0a0543666ce99fdfae3aad608a03d90c9b2bf411e50e7169 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="b6f5e1b7-0c56-447b-b241-56cc0fc9e7af" containerName="nova-scheduler-scheduler" Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:19.678794 4781 generic.go:334] "Generic (PLEG): container finished" podID="b6f5e1b7-0c56-447b-b241-56cc0fc9e7af" containerID="9c5e32171b18f60b0a0543666ce99fdfae3aad608a03d90c9b2bf411e50e7169" exitCode=0 Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:19.678823 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b6f5e1b7-0c56-447b-b241-56cc0fc9e7af","Type":"ContainerDied","Data":"9c5e32171b18f60b0a0543666ce99fdfae3aad608a03d90c9b2bf411e50e7169"} Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:19.678902 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b6f5e1b7-0c56-447b-b241-56cc0fc9e7af","Type":"ContainerDied","Data":"e6d99c1d80c8ff13e675cebc9f3364728e06b28f25777c871cd86c62bd1d13f2"} Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:19.678941 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6d99c1d80c8ff13e675cebc9f3364728e06b28f25777c871cd86c62bd1d13f2" Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:19.679572 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:19.701669 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=4.701643001 podStartE2EDuration="4.701643001s" podCreationTimestamp="2025-12-02 09:48:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:48:17.669894196 +0000 UTC m=+1660.493768085" watchObservedRunningTime="2025-12-02 09:48:19.701643001 +0000 UTC m=+1662.525516880" Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:19.775025 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="3a400f59-49d8-4e6d-8a84-24b680b052cf" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": read tcp 10.217.0.2:36218->10.217.0.200:8775: read: connection reset by peer" Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:19.775033 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="3a400f59-49d8-4e6d-8a84-24b680b052cf" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": read tcp 10.217.0.2:36220->10.217.0.200:8775: read: connection reset by peer" Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:19.788158 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2mmb\" (UniqueName: \"kubernetes.io/projected/b6f5e1b7-0c56-447b-b241-56cc0fc9e7af-kube-api-access-l2mmb\") pod \"b6f5e1b7-0c56-447b-b241-56cc0fc9e7af\" (UID: \"b6f5e1b7-0c56-447b-b241-56cc0fc9e7af\") " Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:19.788278 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6f5e1b7-0c56-447b-b241-56cc0fc9e7af-combined-ca-bundle\") pod \"b6f5e1b7-0c56-447b-b241-56cc0fc9e7af\" (UID: \"b6f5e1b7-0c56-447b-b241-56cc0fc9e7af\") " Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:19.788363 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6f5e1b7-0c56-447b-b241-56cc0fc9e7af-config-data\") pod \"b6f5e1b7-0c56-447b-b241-56cc0fc9e7af\" (UID: \"b6f5e1b7-0c56-447b-b241-56cc0fc9e7af\") " Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:19.795403 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6f5e1b7-0c56-447b-b241-56cc0fc9e7af-kube-api-access-l2mmb" (OuterVolumeSpecName: "kube-api-access-l2mmb") pod "b6f5e1b7-0c56-447b-b241-56cc0fc9e7af" (UID: "b6f5e1b7-0c56-447b-b241-56cc0fc9e7af"). InnerVolumeSpecName "kube-api-access-l2mmb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:19.830045 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6f5e1b7-0c56-447b-b241-56cc0fc9e7af-config-data" (OuterVolumeSpecName: "config-data") pod "b6f5e1b7-0c56-447b-b241-56cc0fc9e7af" (UID: "b6f5e1b7-0c56-447b-b241-56cc0fc9e7af"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:19.847704 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6f5e1b7-0c56-447b-b241-56cc0fc9e7af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b6f5e1b7-0c56-447b-b241-56cc0fc9e7af" (UID: "b6f5e1b7-0c56-447b-b241-56cc0fc9e7af"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:19.890693 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2mmb\" (UniqueName: \"kubernetes.io/projected/b6f5e1b7-0c56-447b-b241-56cc0fc9e7af-kube-api-access-l2mmb\") on node \"crc\" DevicePath \"\"" Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:19.890734 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6f5e1b7-0c56-447b-b241-56cc0fc9e7af-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:19.890744 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6f5e1b7-0c56-447b-b241-56cc0fc9e7af-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:20.690730 4781 generic.go:334] "Generic (PLEG): container finished" podID="3a400f59-49d8-4e6d-8a84-24b680b052cf" containerID="08a35e39a052bff33bcbe0911370ab32856466d1eb578f0adc35360e7e231e4d" exitCode=0 Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:20.690817 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3a400f59-49d8-4e6d-8a84-24b680b052cf","Type":"ContainerDied","Data":"08a35e39a052bff33bcbe0911370ab32856466d1eb578f0adc35360e7e231e4d"} Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:20.691027 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3a400f59-49d8-4e6d-8a84-24b680b052cf","Type":"ContainerDied","Data":"d6f7e4a728b6bcfde8954710b1b328267e2f8e761fcef8253ab3fff50256b581"} Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:20.691043 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6f7e4a728b6bcfde8954710b1b328267e2f8e761fcef8253ab3fff50256b581" Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:20.694261 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:20.694260 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bbf6f862-1ee5-4bc3-83e5-71c1f72d526c","Type":"ContainerStarted","Data":"c2faf13cbf80388d9b8fce52224847a92ed98d7b245dc787e1b4ca229f2bc777"} Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:20.694372 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:20.722101 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.399711493 podStartE2EDuration="6.722084432s" podCreationTimestamp="2025-12-02 09:48:14 +0000 UTC" firstStartedPulling="2025-12-02 09:48:15.676602528 +0000 UTC m=+1658.500476407" lastFinishedPulling="2025-12-02 09:48:19.998975467 +0000 UTC m=+1662.822849346" observedRunningTime="2025-12-02 09:48:20.718172486 +0000 UTC m=+1663.542046365" watchObservedRunningTime="2025-12-02 09:48:20.722084432 +0000 UTC m=+1663.545958311" Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:20.734654 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:20.753323 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:20.761601 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:20.774488 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 09:48:20 crc kubenswrapper[4781]: E1202 09:48:20.774873 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6f5e1b7-0c56-447b-b241-56cc0fc9e7af" containerName="nova-scheduler-scheduler" Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:20.774886 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6f5e1b7-0c56-447b-b241-56cc0fc9e7af" containerName="nova-scheduler-scheduler" Dec 02 09:48:20 crc kubenswrapper[4781]: E1202 09:48:20.774910 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a400f59-49d8-4e6d-8a84-24b680b052cf" containerName="nova-metadata-log" Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:20.774917 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a400f59-49d8-4e6d-8a84-24b680b052cf" containerName="nova-metadata-log" Dec 02 09:48:20 crc kubenswrapper[4781]: E1202 09:48:20.774953 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a400f59-49d8-4e6d-8a84-24b680b052cf" containerName="nova-metadata-metadata" Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:20.774960 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a400f59-49d8-4e6d-8a84-24b680b052cf" containerName="nova-metadata-metadata" Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:20.775129 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a400f59-49d8-4e6d-8a84-24b680b052cf" containerName="nova-metadata-metadata" Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:20.775149 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a400f59-49d8-4e6d-8a84-24b680b052cf" containerName="nova-metadata-log" Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:20.775160 4781 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b6f5e1b7-0c56-447b-b241-56cc0fc9e7af" containerName="nova-scheduler-scheduler" Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:20.775792 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:20.779437 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:20.807186 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:20.808981 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a400f59-49d8-4e6d-8a84-24b680b052cf-config-data\") pod \"3a400f59-49d8-4e6d-8a84-24b680b052cf\" (UID: \"3a400f59-49d8-4e6d-8a84-24b680b052cf\") " Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:20.809034 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4mhw\" (UniqueName: \"kubernetes.io/projected/3a400f59-49d8-4e6d-8a84-24b680b052cf-kube-api-access-x4mhw\") pod \"3a400f59-49d8-4e6d-8a84-24b680b052cf\" (UID: \"3a400f59-49d8-4e6d-8a84-24b680b052cf\") " Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:20.809091 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a400f59-49d8-4e6d-8a84-24b680b052cf-nova-metadata-tls-certs\") pod \"3a400f59-49d8-4e6d-8a84-24b680b052cf\" (UID: \"3a400f59-49d8-4e6d-8a84-24b680b052cf\") " Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:20.809205 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a400f59-49d8-4e6d-8a84-24b680b052cf-logs\") pod \"3a400f59-49d8-4e6d-8a84-24b680b052cf\" (UID: \"3a400f59-49d8-4e6d-8a84-24b680b052cf\") " Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:20.809249 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a400f59-49d8-4e6d-8a84-24b680b052cf-combined-ca-bundle\") pod \"3a400f59-49d8-4e6d-8a84-24b680b052cf\" (UID: \"3a400f59-49d8-4e6d-8a84-24b680b052cf\") " Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:20.809686 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a400f59-49d8-4e6d-8a84-24b680b052cf-logs" (OuterVolumeSpecName: "logs") pod "3a400f59-49d8-4e6d-8a84-24b680b052cf" (UID: "3a400f59-49d8-4e6d-8a84-24b680b052cf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:20.822911 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a400f59-49d8-4e6d-8a84-24b680b052cf-kube-api-access-x4mhw" (OuterVolumeSpecName: "kube-api-access-x4mhw") pod "3a400f59-49d8-4e6d-8a84-24b680b052cf" (UID: "3a400f59-49d8-4e6d-8a84-24b680b052cf"). InnerVolumeSpecName "kube-api-access-x4mhw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:20.862039 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a400f59-49d8-4e6d-8a84-24b680b052cf-config-data" (OuterVolumeSpecName: "config-data") pod "3a400f59-49d8-4e6d-8a84-24b680b052cf" (UID: "3a400f59-49d8-4e6d-8a84-24b680b052cf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:20.862096 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a400f59-49d8-4e6d-8a84-24b680b052cf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3a400f59-49d8-4e6d-8a84-24b680b052cf" (UID: "3a400f59-49d8-4e6d-8a84-24b680b052cf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:20.880211 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a400f59-49d8-4e6d-8a84-24b680b052cf-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "3a400f59-49d8-4e6d-8a84-24b680b052cf" (UID: "3a400f59-49d8-4e6d-8a84-24b680b052cf"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:20.911304 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4460f62-792a-4297-b92b-3fe1081bc006-config-data\") pod \"nova-scheduler-0\" (UID: \"a4460f62-792a-4297-b92b-3fe1081bc006\") " pod="openstack/nova-scheduler-0" Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:20.911365 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4460f62-792a-4297-b92b-3fe1081bc006-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a4460f62-792a-4297-b92b-3fe1081bc006\") " pod="openstack/nova-scheduler-0" Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:20.911412 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfqpk\" (UniqueName: \"kubernetes.io/projected/a4460f62-792a-4297-b92b-3fe1081bc006-kube-api-access-lfqpk\") pod \"nova-scheduler-0\" (UID: \"a4460f62-792a-4297-b92b-3fe1081bc006\") " pod="openstack/nova-scheduler-0" Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:20.911495 4781 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a400f59-49d8-4e6d-8a84-24b680b052cf-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:20.911506 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a400f59-49d8-4e6d-8a84-24b680b052cf-logs\") on node \"crc\" DevicePath \"\"" Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:20.911515 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a400f59-49d8-4e6d-8a84-24b680b052cf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:20.911523 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3a400f59-49d8-4e6d-8a84-24b680b052cf-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:48:20 crc kubenswrapper[4781]: I1202 09:48:20.911532 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4mhw\" (UniqueName: \"kubernetes.io/projected/3a400f59-49d8-4e6d-8a84-24b680b052cf-kube-api-access-x4mhw\") on node \"crc\" DevicePath \"\"" Dec 02 09:48:21 crc kubenswrapper[4781]: I1202 09:48:21.013131 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4460f62-792a-4297-b92b-3fe1081bc006-config-data\") pod \"nova-scheduler-0\" (UID: \"a4460f62-792a-4297-b92b-3fe1081bc006\") " pod="openstack/nova-scheduler-0" Dec 02 09:48:21 crc kubenswrapper[4781]: I1202 09:48:21.013197 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4460f62-792a-4297-b92b-3fe1081bc006-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a4460f62-792a-4297-b92b-3fe1081bc006\") " pod="openstack/nova-scheduler-0" Dec 02 09:48:21 crc kubenswrapper[4781]: I1202 09:48:21.013247 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfqpk\" (UniqueName: \"kubernetes.io/projected/a4460f62-792a-4297-b92b-3fe1081bc006-kube-api-access-lfqpk\") pod \"nova-scheduler-0\" (UID: \"a4460f62-792a-4297-b92b-3fe1081bc006\") " pod="openstack/nova-scheduler-0" Dec 02 09:48:21 crc kubenswrapper[4781]: I1202 09:48:21.019592 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4460f62-792a-4297-b92b-3fe1081bc006-config-data\") pod \"nova-scheduler-0\" (UID: \"a4460f62-792a-4297-b92b-3fe1081bc006\") " pod="openstack/nova-scheduler-0" Dec 02 09:48:21 crc kubenswrapper[4781]: I1202 09:48:21.020682 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4460f62-792a-4297-b92b-3fe1081bc006-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a4460f62-792a-4297-b92b-3fe1081bc006\") " pod="openstack/nova-scheduler-0" Dec 02 09:48:21 crc kubenswrapper[4781]: I1202 09:48:21.036498 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfqpk\" (UniqueName: \"kubernetes.io/projected/a4460f62-792a-4297-b92b-3fe1081bc006-kube-api-access-lfqpk\") pod \"nova-scheduler-0\" (UID: \"a4460f62-792a-4297-b92b-3fe1081bc006\") " pod="openstack/nova-scheduler-0" Dec 02 09:48:21 crc kubenswrapper[4781]: I1202 09:48:21.092689 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 09:48:21 crc kubenswrapper[4781]: I1202 09:48:21.525718 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6f5e1b7-0c56-447b-b241-56cc0fc9e7af" path="/var/lib/kubelet/pods/b6f5e1b7-0c56-447b-b241-56cc0fc9e7af/volumes" Dec 02 09:48:21 crc kubenswrapper[4781]: I1202 09:48:21.570919 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 09:48:21 crc kubenswrapper[4781]: I1202 09:48:21.702758 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a4460f62-792a-4297-b92b-3fe1081bc006","Type":"ContainerStarted","Data":"f3d9f60e1f627fb477891d9abdd198869215636a9452d463febcac5b9432cf81"} Dec 02 09:48:21 crc kubenswrapper[4781]: I1202 09:48:21.702902 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 09:48:21 crc kubenswrapper[4781]: I1202 09:48:21.734705 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 09:48:21 crc kubenswrapper[4781]: I1202 09:48:21.747560 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 09:48:21 crc kubenswrapper[4781]: I1202 09:48:21.756603 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 02 09:48:21 crc kubenswrapper[4781]: I1202 09:48:21.758194 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 09:48:21 crc kubenswrapper[4781]: I1202 09:48:21.761672 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 02 09:48:21 crc kubenswrapper[4781]: I1202 09:48:21.762444 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 02 09:48:21 crc kubenswrapper[4781]: I1202 09:48:21.767702 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 09:48:21 crc kubenswrapper[4781]: I1202 09:48:21.828751 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b701c328-f693-4c11-96a7-5ff5b9bff2c1-config-data\") pod \"nova-metadata-0\" (UID: \"b701c328-f693-4c11-96a7-5ff5b9bff2c1\") " pod="openstack/nova-metadata-0" Dec 02 09:48:21 crc kubenswrapper[4781]: I1202 09:48:21.828941 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b701c328-f693-4c11-96a7-5ff5b9bff2c1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b701c328-f693-4c11-96a7-5ff5b9bff2c1\") " pod="openstack/nova-metadata-0" Dec 02 09:48:21 crc kubenswrapper[4781]: I1202 09:48:21.828965 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr7z4\" (UniqueName: \"kubernetes.io/projected/b701c328-f693-4c11-96a7-5ff5b9bff2c1-kube-api-access-tr7z4\") pod \"nova-metadata-0\" (UID: \"b701c328-f693-4c11-96a7-5ff5b9bff2c1\") " pod="openstack/nova-metadata-0" Dec 02 09:48:21 crc kubenswrapper[4781]: I1202 09:48:21.829265 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b701c328-f693-4c11-96a7-5ff5b9bff2c1-logs\") pod \"nova-metadata-0\" (UID: \"b701c328-f693-4c11-96a7-5ff5b9bff2c1\") " 
pod="openstack/nova-metadata-0" Dec 02 09:48:21 crc kubenswrapper[4781]: I1202 09:48:21.829428 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b701c328-f693-4c11-96a7-5ff5b9bff2c1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b701c328-f693-4c11-96a7-5ff5b9bff2c1\") " pod="openstack/nova-metadata-0" Dec 02 09:48:21 crc kubenswrapper[4781]: I1202 09:48:21.931725 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b701c328-f693-4c11-96a7-5ff5b9bff2c1-logs\") pod \"nova-metadata-0\" (UID: \"b701c328-f693-4c11-96a7-5ff5b9bff2c1\") " pod="openstack/nova-metadata-0" Dec 02 09:48:21 crc kubenswrapper[4781]: I1202 09:48:21.931824 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b701c328-f693-4c11-96a7-5ff5b9bff2c1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b701c328-f693-4c11-96a7-5ff5b9bff2c1\") " pod="openstack/nova-metadata-0" Dec 02 09:48:21 crc kubenswrapper[4781]: I1202 09:48:21.931864 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b701c328-f693-4c11-96a7-5ff5b9bff2c1-config-data\") pod \"nova-metadata-0\" (UID: \"b701c328-f693-4c11-96a7-5ff5b9bff2c1\") " pod="openstack/nova-metadata-0" Dec 02 09:48:21 crc kubenswrapper[4781]: I1202 09:48:21.931943 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b701c328-f693-4c11-96a7-5ff5b9bff2c1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b701c328-f693-4c11-96a7-5ff5b9bff2c1\") " pod="openstack/nova-metadata-0" Dec 02 09:48:21 crc kubenswrapper[4781]: I1202 09:48:21.931963 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tr7z4\" (UniqueName: \"kubernetes.io/projected/b701c328-f693-4c11-96a7-5ff5b9bff2c1-kube-api-access-tr7z4\") pod \"nova-metadata-0\" (UID: \"b701c328-f693-4c11-96a7-5ff5b9bff2c1\") " pod="openstack/nova-metadata-0" Dec 02 09:48:21 crc kubenswrapper[4781]: I1202 09:48:21.932291 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b701c328-f693-4c11-96a7-5ff5b9bff2c1-logs\") pod \"nova-metadata-0\" (UID: \"b701c328-f693-4c11-96a7-5ff5b9bff2c1\") " pod="openstack/nova-metadata-0" Dec 02 09:48:21 crc kubenswrapper[4781]: I1202 09:48:21.935645 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b701c328-f693-4c11-96a7-5ff5b9bff2c1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b701c328-f693-4c11-96a7-5ff5b9bff2c1\") " pod="openstack/nova-metadata-0" Dec 02 09:48:21 crc kubenswrapper[4781]: I1202 09:48:21.943447 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b701c328-f693-4c11-96a7-5ff5b9bff2c1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b701c328-f693-4c11-96a7-5ff5b9bff2c1\") " pod="openstack/nova-metadata-0" Dec 02 09:48:21 crc kubenswrapper[4781]: I1202 09:48:21.944550 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b701c328-f693-4c11-96a7-5ff5b9bff2c1-config-data\") 
pod \"nova-metadata-0\" (UID: \"b701c328-f693-4c11-96a7-5ff5b9bff2c1\") " pod="openstack/nova-metadata-0" Dec 02 09:48:21 crc kubenswrapper[4781]: I1202 09:48:21.948598 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr7z4\" (UniqueName: \"kubernetes.io/projected/b701c328-f693-4c11-96a7-5ff5b9bff2c1-kube-api-access-tr7z4\") pod \"nova-metadata-0\" (UID: \"b701c328-f693-4c11-96a7-5ff5b9bff2c1\") " pod="openstack/nova-metadata-0" Dec 02 09:48:22 crc kubenswrapper[4781]: I1202 09:48:22.076032 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 09:48:22 crc kubenswrapper[4781]: I1202 09:48:22.535037 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 09:48:22 crc kubenswrapper[4781]: W1202 09:48:22.543942 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb701c328_f693_4c11_96a7_5ff5b9bff2c1.slice/crio-260aa0b382dfe040c899d121005f895d1e28f9ed2e0eb2b1010c25f9ed4d7cfe WatchSource:0}: Error finding container 260aa0b382dfe040c899d121005f895d1e28f9ed2e0eb2b1010c25f9ed4d7cfe: Status 404 returned error can't find the container with id 260aa0b382dfe040c899d121005f895d1e28f9ed2e0eb2b1010c25f9ed4d7cfe Dec 02 09:48:22 crc kubenswrapper[4781]: I1202 09:48:22.712038 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b701c328-f693-4c11-96a7-5ff5b9bff2c1","Type":"ContainerStarted","Data":"260aa0b382dfe040c899d121005f895d1e28f9ed2e0eb2b1010c25f9ed4d7cfe"} Dec 02 09:48:22 crc kubenswrapper[4781]: I1202 09:48:22.713704 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a4460f62-792a-4297-b92b-3fe1081bc006","Type":"ContainerStarted","Data":"93ff4973af4de793de21241182241a9dbe9aa1e23d7495965e0598f446e7502f"} Dec 02 09:48:22 crc kubenswrapper[4781]: I1202 09:48:22.745543 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.745520383 podStartE2EDuration="2.745520383s" podCreationTimestamp="2025-12-02 09:48:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:48:22.73242445 +0000 UTC m=+1665.556298329" watchObservedRunningTime="2025-12-02 09:48:22.745520383 +0000 UTC m=+1665.569394252" Dec 02 09:48:23 crc kubenswrapper[4781]: I1202 09:48:23.512704 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a400f59-49d8-4e6d-8a84-24b680b052cf" path="/var/lib/kubelet/pods/3a400f59-49d8-4e6d-8a84-24b680b052cf/volumes" Dec 02 09:48:23 crc kubenswrapper[4781]: I1202 09:48:23.723657 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b701c328-f693-4c11-96a7-5ff5b9bff2c1","Type":"ContainerStarted","Data":"c17906827fbbd0b43719d84f64f02c426f419a1bcfdcc66e6638084e79b59682"} Dec 02 09:48:23 crc kubenswrapper[4781]: I1202 09:48:23.723711 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b701c328-f693-4c11-96a7-5ff5b9bff2c1","Type":"ContainerStarted","Data":"66933dbc3e4ec35ea03a63ea33d8d732e8e646a9abc143fa154c2f37e35b4741"} Dec 02 09:48:23 crc kubenswrapper[4781]: I1202 09:48:23.749374 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.749338335 
podStartE2EDuration="2.749338335s" podCreationTimestamp="2025-12-02 09:48:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:48:23.739222513 +0000 UTC m=+1666.563096402" watchObservedRunningTime="2025-12-02 09:48:23.749338335 +0000 UTC m=+1666.573212214" Dec 02 09:48:26 crc kubenswrapper[4781]: I1202 09:48:26.092900 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 02 09:48:26 crc kubenswrapper[4781]: I1202 09:48:26.293115 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 09:48:26 crc kubenswrapper[4781]: I1202 09:48:26.293387 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 09:48:27 crc kubenswrapper[4781]: I1202 09:48:27.077208 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 09:48:27 crc kubenswrapper[4781]: I1202 09:48:27.077278 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 09:48:27 crc kubenswrapper[4781]: I1202 09:48:27.309152 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="18fa078a-6d45-40cf-a39e-139d84f86f76" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.206:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 09:48:27 crc kubenswrapper[4781]: I1202 09:48:27.309184 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="18fa078a-6d45-40cf-a39e-139d84f86f76" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.206:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 09:48:27 crc kubenswrapper[4781]: I1202 09:48:27.505501 4781 scope.go:117] "RemoveContainer" containerID="311057b514d104f4f4ba1bad8b86e944a9708a86065bf40bd45c3f422ad024ed" Dec 02 09:48:27 crc kubenswrapper[4781]: E1202 09:48:27.505827 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 09:48:31 crc kubenswrapper[4781]: I1202 09:48:31.093978 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 02 09:48:31 crc kubenswrapper[4781]: I1202 09:48:31.125753 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 02 09:48:31 crc kubenswrapper[4781]: I1202 09:48:31.844453 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 02 09:48:32 crc kubenswrapper[4781]: I1202 09:48:32.076456 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 02 09:48:32 crc kubenswrapper[4781]: I1202 09:48:32.076905 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 02 09:48:33 crc kubenswrapper[4781]: I1202 09:48:33.093131 4781 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/nova-metadata-0" podUID="b701c328-f693-4c11-96a7-5ff5b9bff2c1" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.208:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 09:48:33 crc kubenswrapper[4781]: I1202 09:48:33.093169 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b701c328-f693-4c11-96a7-5ff5b9bff2c1" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.208:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 09:48:36 crc kubenswrapper[4781]: I1202 09:48:36.299314 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 02 09:48:36 crc kubenswrapper[4781]: I1202 09:48:36.300508 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 02 09:48:36 crc kubenswrapper[4781]: I1202 09:48:36.301658 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 02 09:48:36 crc kubenswrapper[4781]: I1202 09:48:36.306470 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 02 09:48:36 crc kubenswrapper[4781]: I1202 09:48:36.869730 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 02 09:48:36 crc kubenswrapper[4781]: I1202 09:48:36.877365 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 02 09:48:40 crc kubenswrapper[4781]: I1202 09:48:40.499717 4781 scope.go:117] "RemoveContainer" containerID="311057b514d104f4f4ba1bad8b86e944a9708a86065bf40bd45c3f422ad024ed" Dec 02 09:48:40 crc kubenswrapper[4781]: E1202 09:48:40.500547 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 09:48:42 crc kubenswrapper[4781]: I1202 09:48:42.085856 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 02 09:48:42 crc kubenswrapper[4781]: I1202 09:48:42.087192 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 02 09:48:42 crc kubenswrapper[4781]: I1202 09:48:42.091487 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 02 09:48:42 crc kubenswrapper[4781]: I1202 09:48:42.928440 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 02 09:48:45 crc kubenswrapper[4781]: I1202 09:48:45.046146 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 02 09:48:51 crc kubenswrapper[4781]: I1202 09:48:51.499441 4781 scope.go:117] "RemoveContainer" containerID="311057b514d104f4f4ba1bad8b86e944a9708a86065bf40bd45c3f422ad024ed" Dec 02 09:48:51 crc kubenswrapper[4781]: E1202 09:48:51.500183 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 09:48:54 crc kubenswrapper[4781]: I1202 09:48:54.855199 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 09:48:56 crc kubenswrapper[4781]: I1202 09:48:56.214062 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 09:49:00 crc kubenswrapper[4781]: I1202 09:49:00.094707 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="9ef9f823-7456-4494-85c7-d29fcf35e7b5" containerName="rabbitmq" containerID="cri-o://8859e4a4c2bfa9375f419be864c8f71d03a958b1c01be72f1c1bf549a3c73cec" gracePeriod=604795 Dec 02 09:49:00 crc kubenswrapper[4781]: I1202 09:49:00.552551 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="4d040259-d968-45a1-832a-45586a9fe0d1" containerName="rabbitmq" containerID="cri-o://29d9e8eaac33a1eec6f376ec46ba525e7cb820604a4b8f6d204d2bf0ffeea97d" gracePeriod=604796 Dec 02 09:49:00 crc kubenswrapper[4781]: I1202 09:49:00.628328 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="9ef9f823-7456-4494-85c7-d29fcf35e7b5" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.96:5671: connect: connection refused" Dec 02 09:49:00 crc kubenswrapper[4781]: I1202 09:49:00.699745 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="4d040259-d968-45a1-832a-45586a9fe0d1" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.97:5671: connect: connection refused" Dec 02 09:49:04 crc kubenswrapper[4781]: I1202 09:49:04.500373 4781 scope.go:117] "RemoveContainer" containerID="311057b514d104f4f4ba1bad8b86e944a9708a86065bf40bd45c3f422ad024ed" Dec 02 09:49:04 crc kubenswrapper[4781]: E1202 09:49:04.501222 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 09:49:10 crc kubenswrapper[4781]: I1202 09:49:10.628742 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="9ef9f823-7456-4494-85c7-d29fcf35e7b5" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.96:5671: connect: connection refused" Dec 02 09:49:10 crc kubenswrapper[4781]: I1202 09:49:10.699805 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="4d040259-d968-45a1-832a-45586a9fe0d1" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.97:5671: connect: connection refused" Dec 02 09:49:13 crc kubenswrapper[4781]: I1202 09:49:13.834241 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 02 09:49:13 crc kubenswrapper[4781]: I1202 09:49:13.946376 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkktl\" (UniqueName: \"kubernetes.io/projected/9ef9f823-7456-4494-85c7-d29fcf35e7b5-kube-api-access-qkktl\") pod \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\" (UID: \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\") " Dec 02 09:49:13 crc kubenswrapper[4781]: I1202 09:49:13.946464 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9ef9f823-7456-4494-85c7-d29fcf35e7b5-rabbitmq-confd\") pod \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\" (UID: \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\") " Dec 02 09:49:13 crc kubenswrapper[4781]: I1202 09:49:13.946538 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\" (UID: \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\") " Dec 02 09:49:13 crc kubenswrapper[4781]: I1202 09:49:13.946589 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9ef9f823-7456-4494-85c7-d29fcf35e7b5-pod-info\") pod \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\" (UID: \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\") " Dec 02 09:49:13 crc kubenswrapper[4781]: I1202 09:49:13.946623 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9ef9f823-7456-4494-85c7-d29fcf35e7b5-plugins-conf\") pod \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\" (UID: \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\") " Dec 02 09:49:13 crc kubenswrapper[4781]: I1202 09:49:13.946653 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9ef9f823-7456-4494-85c7-d29fcf35e7b5-rabbitmq-tls\") pod \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\" (UID: \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\") " Dec 02 09:49:13 crc kubenswrapper[4781]: I1202 09:49:13.946705 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9ef9f823-7456-4494-85c7-d29fcf35e7b5-config-data\") pod \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\" (UID: \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\") " Dec 02 09:49:13 crc kubenswrapper[4781]: I1202 09:49:13.946744 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9ef9f823-7456-4494-85c7-d29fcf35e7b5-rabbitmq-erlang-cookie\") pod \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\" (UID: \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\") " Dec 02 09:49:13 crc kubenswrapper[4781]: I1202 09:49:13.946766 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9ef9f823-7456-4494-85c7-d29fcf35e7b5-server-conf\") pod \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\" (UID: \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\") " Dec 02 09:49:13 crc kubenswrapper[4781]: I1202 09:49:13.946809 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9ef9f823-7456-4494-85c7-d29fcf35e7b5-erlang-cookie-secret\") pod \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\" (UID: 
\"9ef9f823-7456-4494-85c7-d29fcf35e7b5\") " Dec 02 09:49:13 crc kubenswrapper[4781]: I1202 09:49:13.946853 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9ef9f823-7456-4494-85c7-d29fcf35e7b5-rabbitmq-plugins\") pod \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\" (UID: \"9ef9f823-7456-4494-85c7-d29fcf35e7b5\") " Dec 02 09:49:13 crc kubenswrapper[4781]: I1202 09:49:13.947794 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ef9f823-7456-4494-85c7-d29fcf35e7b5-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "9ef9f823-7456-4494-85c7-d29fcf35e7b5" (UID: "9ef9f823-7456-4494-85c7-d29fcf35e7b5"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:49:13 crc kubenswrapper[4781]: I1202 09:49:13.948078 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ef9f823-7456-4494-85c7-d29fcf35e7b5-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "9ef9f823-7456-4494-85c7-d29fcf35e7b5" (UID: "9ef9f823-7456-4494-85c7-d29fcf35e7b5"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:49:13 crc kubenswrapper[4781]: I1202 09:49:13.948451 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ef9f823-7456-4494-85c7-d29fcf35e7b5-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "9ef9f823-7456-4494-85c7-d29fcf35e7b5" (UID: "9ef9f823-7456-4494-85c7-d29fcf35e7b5"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:49:13 crc kubenswrapper[4781]: I1202 09:49:13.971452 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/9ef9f823-7456-4494-85c7-d29fcf35e7b5-pod-info" (OuterVolumeSpecName: "pod-info") pod "9ef9f823-7456-4494-85c7-d29fcf35e7b5" (UID: "9ef9f823-7456-4494-85c7-d29fcf35e7b5"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 02 09:49:13 crc kubenswrapper[4781]: I1202 09:49:13.973363 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ef9f823-7456-4494-85c7-d29fcf35e7b5-kube-api-access-qkktl" (OuterVolumeSpecName: "kube-api-access-qkktl") pod "9ef9f823-7456-4494-85c7-d29fcf35e7b5" (UID: "9ef9f823-7456-4494-85c7-d29fcf35e7b5"). InnerVolumeSpecName "kube-api-access-qkktl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:49:13 crc kubenswrapper[4781]: I1202 09:49:13.973529 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ef9f823-7456-4494-85c7-d29fcf35e7b5-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "9ef9f823-7456-4494-85c7-d29fcf35e7b5" (UID: "9ef9f823-7456-4494-85c7-d29fcf35e7b5"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:49:13 crc kubenswrapper[4781]: I1202 09:49:13.982951 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ef9f823-7456-4494-85c7-d29fcf35e7b5-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "9ef9f823-7456-4494-85c7-d29fcf35e7b5" (UID: "9ef9f823-7456-4494-85c7-d29fcf35e7b5"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:49:13 crc kubenswrapper[4781]: I1202 09:49:13.984829 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ef9f823-7456-4494-85c7-d29fcf35e7b5-config-data" (OuterVolumeSpecName: "config-data") pod "9ef9f823-7456-4494-85c7-d29fcf35e7b5" (UID: "9ef9f823-7456-4494-85c7-d29fcf35e7b5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:49:13 crc kubenswrapper[4781]: I1202 09:49:13.992024 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "persistence") pod "9ef9f823-7456-4494-85c7-d29fcf35e7b5" (UID: "9ef9f823-7456-4494-85c7-d29fcf35e7b5"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.052988 4781 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.053027 4781 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9ef9f823-7456-4494-85c7-d29fcf35e7b5-pod-info\") on node \"crc\" DevicePath \"\"" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.053039 4781 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9ef9f823-7456-4494-85c7-d29fcf35e7b5-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.053052 4781 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9ef9f823-7456-4494-85c7-d29fcf35e7b5-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.053064 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9ef9f823-7456-4494-85c7-d29fcf35e7b5-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.053075 4781 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9ef9f823-7456-4494-85c7-d29fcf35e7b5-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.053086 4781 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9ef9f823-7456-4494-85c7-d29fcf35e7b5-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.053096 4781 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9ef9f823-7456-4494-85c7-d29fcf35e7b5-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.053107 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkktl\" (UniqueName: \"kubernetes.io/projected/9ef9f823-7456-4494-85c7-d29fcf35e7b5-kube-api-access-qkktl\") on node \"crc\" DevicePath \"\"" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.055149 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ef9f823-7456-4494-85c7-d29fcf35e7b5-server-conf" 
(OuterVolumeSpecName: "server-conf") pod "9ef9f823-7456-4494-85c7-d29fcf35e7b5" (UID: "9ef9f823-7456-4494-85c7-d29fcf35e7b5"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.060541 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-fwkmh"] Dec 02 09:49:14 crc kubenswrapper[4781]: E1202 09:49:14.069529 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ef9f823-7456-4494-85c7-d29fcf35e7b5" containerName="rabbitmq" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.069586 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ef9f823-7456-4494-85c7-d29fcf35e7b5" containerName="rabbitmq" Dec 02 09:49:14 crc kubenswrapper[4781]: E1202 09:49:14.069635 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ef9f823-7456-4494-85c7-d29fcf35e7b5" containerName="setup-container" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.069646 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ef9f823-7456-4494-85c7-d29fcf35e7b5" containerName="setup-container" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.070117 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ef9f823-7456-4494-85c7-d29fcf35e7b5" containerName="rabbitmq" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.076779 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-fwkmh" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.089719 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.118579 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-fwkmh"] Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.156487 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/220427cb-65a7-4603-b174-a54fe9bf7ef1-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-fwkmh\" (UID: \"220427cb-65a7-4603-b174-a54fe9bf7ef1\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-fwkmh" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.156567 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/220427cb-65a7-4603-b174-a54fe9bf7ef1-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-fwkmh\" (UID: \"220427cb-65a7-4603-b174-a54fe9bf7ef1\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-fwkmh" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.156606 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/220427cb-65a7-4603-b174-a54fe9bf7ef1-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-fwkmh\" (UID: \"220427cb-65a7-4603-b174-a54fe9bf7ef1\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-fwkmh" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.156704 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/220427cb-65a7-4603-b174-a54fe9bf7ef1-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-fwkmh\" (UID: \"220427cb-65a7-4603-b174-a54fe9bf7ef1\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-fwkmh" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 
09:49:14.156784 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/220427cb-65a7-4603-b174-a54fe9bf7ef1-config\") pod \"dnsmasq-dns-79bd4cc8c9-fwkmh\" (UID: \"220427cb-65a7-4603-b174-a54fe9bf7ef1\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-fwkmh" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.156863 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2fbq\" (UniqueName: \"kubernetes.io/projected/220427cb-65a7-4603-b174-a54fe9bf7ef1-kube-api-access-r2fbq\") pod \"dnsmasq-dns-79bd4cc8c9-fwkmh\" (UID: \"220427cb-65a7-4603-b174-a54fe9bf7ef1\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-fwkmh" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.156916 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/220427cb-65a7-4603-b174-a54fe9bf7ef1-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-fwkmh\" (UID: \"220427cb-65a7-4603-b174-a54fe9bf7ef1\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-fwkmh" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.157049 4781 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9ef9f823-7456-4494-85c7-d29fcf35e7b5-server-conf\") on node \"crc\" DevicePath \"\"" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.167743 4781 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.196307 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ef9f823-7456-4494-85c7-d29fcf35e7b5-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "9ef9f823-7456-4494-85c7-d29fcf35e7b5" (UID: "9ef9f823-7456-4494-85c7-d29fcf35e7b5"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.216216 4781 generic.go:334] "Generic (PLEG): container finished" podID="9ef9f823-7456-4494-85c7-d29fcf35e7b5" containerID="8859e4a4c2bfa9375f419be864c8f71d03a958b1c01be72f1c1bf549a3c73cec" exitCode=0 Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.216263 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9ef9f823-7456-4494-85c7-d29fcf35e7b5","Type":"ContainerDied","Data":"8859e4a4c2bfa9375f419be864c8f71d03a958b1c01be72f1c1bf549a3c73cec"} Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.216294 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9ef9f823-7456-4494-85c7-d29fcf35e7b5","Type":"ContainerDied","Data":"5adf3a424f5a9a7053218f386b18c9130d5bcd2f34ed07f4b5c637298b1a2ab0"} Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.216317 4781 scope.go:117] "RemoveContainer" containerID="8859e4a4c2bfa9375f419be864c8f71d03a958b1c01be72f1c1bf549a3c73cec" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.216510 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.260153 4781 scope.go:117] "RemoveContainer" containerID="cd33af3c334721c99ee5f6eebb56a79b04c6781e3be5fde4010c62024eb59205" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.260865 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/220427cb-65a7-4603-b174-a54fe9bf7ef1-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-fwkmh\" (UID: \"220427cb-65a7-4603-b174-a54fe9bf7ef1\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-fwkmh" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.262110 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/220427cb-65a7-4603-b174-a54fe9bf7ef1-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-fwkmh\" (UID: \"220427cb-65a7-4603-b174-a54fe9bf7ef1\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-fwkmh" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.262328 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/220427cb-65a7-4603-b174-a54fe9bf7ef1-config\") pod \"dnsmasq-dns-79bd4cc8c9-fwkmh\" (UID: \"220427cb-65a7-4603-b174-a54fe9bf7ef1\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-fwkmh" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.262471 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2fbq\" (UniqueName: \"kubernetes.io/projected/220427cb-65a7-4603-b174-a54fe9bf7ef1-kube-api-access-r2fbq\") pod \"dnsmasq-dns-79bd4cc8c9-fwkmh\" (UID: \"220427cb-65a7-4603-b174-a54fe9bf7ef1\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-fwkmh" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.262543 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/220427cb-65a7-4603-b174-a54fe9bf7ef1-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-fwkmh\" (UID: \"220427cb-65a7-4603-b174-a54fe9bf7ef1\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-fwkmh" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.262689 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/220427cb-65a7-4603-b174-a54fe9bf7ef1-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-fwkmh\" (UID: \"220427cb-65a7-4603-b174-a54fe9bf7ef1\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-fwkmh" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.262748 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/220427cb-65a7-4603-b174-a54fe9bf7ef1-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-fwkmh\" (UID: \"220427cb-65a7-4603-b174-a54fe9bf7ef1\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-fwkmh" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.262776 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/220427cb-65a7-4603-b174-a54fe9bf7ef1-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-fwkmh\" (UID: \"220427cb-65a7-4603-b174-a54fe9bf7ef1\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-fwkmh" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.262877 4781 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/9ef9f823-7456-4494-85c7-d29fcf35e7b5-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.262890 4781 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.263834 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/220427cb-65a7-4603-b174-a54fe9bf7ef1-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-fwkmh\" (UID: \"220427cb-65a7-4603-b174-a54fe9bf7ef1\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-fwkmh" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.264656 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/220427cb-65a7-4603-b174-a54fe9bf7ef1-config\") pod \"dnsmasq-dns-79bd4cc8c9-fwkmh\" (UID: \"220427cb-65a7-4603-b174-a54fe9bf7ef1\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-fwkmh" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.265052 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/220427cb-65a7-4603-b174-a54fe9bf7ef1-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-fwkmh\" (UID: \"220427cb-65a7-4603-b174-a54fe9bf7ef1\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-fwkmh" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.265125 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/220427cb-65a7-4603-b174-a54fe9bf7ef1-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-fwkmh\" (UID: \"220427cb-65a7-4603-b174-a54fe9bf7ef1\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-fwkmh" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.265615 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/220427cb-65a7-4603-b174-a54fe9bf7ef1-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-fwkmh\" (UID: \"220427cb-65a7-4603-b174-a54fe9bf7ef1\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-fwkmh" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.274116 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.286998 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.287384 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2fbq\" (UniqueName: \"kubernetes.io/projected/220427cb-65a7-4603-b174-a54fe9bf7ef1-kube-api-access-r2fbq\") pod \"dnsmasq-dns-79bd4cc8c9-fwkmh\" (UID: \"220427cb-65a7-4603-b174-a54fe9bf7ef1\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-fwkmh" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.295601 4781 scope.go:117] "RemoveContainer" containerID="8859e4a4c2bfa9375f419be864c8f71d03a958b1c01be72f1c1bf549a3c73cec" Dec 02 09:49:14 crc kubenswrapper[4781]: E1202 09:49:14.296270 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8859e4a4c2bfa9375f419be864c8f71d03a958b1c01be72f1c1bf549a3c73cec\": container with ID starting with 8859e4a4c2bfa9375f419be864c8f71d03a958b1c01be72f1c1bf549a3c73cec not found: ID does not exist" 
containerID="8859e4a4c2bfa9375f419be864c8f71d03a958b1c01be72f1c1bf549a3c73cec" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.296333 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8859e4a4c2bfa9375f419be864c8f71d03a958b1c01be72f1c1bf549a3c73cec"} err="failed to get container status \"8859e4a4c2bfa9375f419be864c8f71d03a958b1c01be72f1c1bf549a3c73cec\": rpc error: code = NotFound desc = could not find container \"8859e4a4c2bfa9375f419be864c8f71d03a958b1c01be72f1c1bf549a3c73cec\": container with ID starting with 8859e4a4c2bfa9375f419be864c8f71d03a958b1c01be72f1c1bf549a3c73cec not found: ID does not exist" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.296365 4781 scope.go:117] "RemoveContainer" containerID="cd33af3c334721c99ee5f6eebb56a79b04c6781e3be5fde4010c62024eb59205" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.296941 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 09:49:14 crc kubenswrapper[4781]: E1202 09:49:14.297372 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd33af3c334721c99ee5f6eebb56a79b04c6781e3be5fde4010c62024eb59205\": container with ID starting with cd33af3c334721c99ee5f6eebb56a79b04c6781e3be5fde4010c62024eb59205 not found: ID does not exist" containerID="cd33af3c334721c99ee5f6eebb56a79b04c6781e3be5fde4010c62024eb59205" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.297403 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd33af3c334721c99ee5f6eebb56a79b04c6781e3be5fde4010c62024eb59205"} err="failed to get container status \"cd33af3c334721c99ee5f6eebb56a79b04c6781e3be5fde4010c62024eb59205\": rpc error: code = NotFound desc = could not find container \"cd33af3c334721c99ee5f6eebb56a79b04c6781e3be5fde4010c62024eb59205\": container with ID starting with cd33af3c334721c99ee5f6eebb56a79b04c6781e3be5fde4010c62024eb59205 not found: ID does not exist" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.298971 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.302748 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.302918 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.303086 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.303094 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.303143 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.306138 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.306144 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-6mkz6" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.313242 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.364357 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4c72516d-29b4-4932-8a47-8838d686b176-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4c72516d-29b4-4932-8a47-8838d686b176\") " pod="openstack/rabbitmq-server-0" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.364474 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4c72516d-29b4-4932-8a47-8838d686b176-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4c72516d-29b4-4932-8a47-8838d686b176\") " pod="openstack/rabbitmq-server-0" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.364498 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4c72516d-29b4-4932-8a47-8838d686b176-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4c72516d-29b4-4932-8a47-8838d686b176\") " pod="openstack/rabbitmq-server-0" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.364524 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4c72516d-29b4-4932-8a47-8838d686b176-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4c72516d-29b4-4932-8a47-8838d686b176\") " pod="openstack/rabbitmq-server-0" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.364796 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4c72516d-29b4-4932-8a47-8838d686b176-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4c72516d-29b4-4932-8a47-8838d686b176\") " pod="openstack/rabbitmq-server-0" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.364900 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/4c72516d-29b4-4932-8a47-8838d686b176-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4c72516d-29b4-4932-8a47-8838d686b176\") " pod="openstack/rabbitmq-server-0" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.364935 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4c72516d-29b4-4932-8a47-8838d686b176-config-data\") pod \"rabbitmq-server-0\" (UID: \"4c72516d-29b4-4932-8a47-8838d686b176\") " pod="openstack/rabbitmq-server-0" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.364960 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4c72516d-29b4-4932-8a47-8838d686b176-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4c72516d-29b4-4932-8a47-8838d686b176\") " pod="openstack/rabbitmq-server-0" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.365014 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4c72516d-29b4-4932-8a47-8838d686b176-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4c72516d-29b4-4932-8a47-8838d686b176\") " pod="openstack/rabbitmq-server-0" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.365080 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"4c72516d-29b4-4932-8a47-8838d686b176\") " pod="openstack/rabbitmq-server-0" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.365120 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hntn7\" (UniqueName: \"kubernetes.io/projected/4c72516d-29b4-4932-8a47-8838d686b176-kube-api-access-hntn7\") pod \"rabbitmq-server-0\" (UID: \"4c72516d-29b4-4932-8a47-8838d686b176\") " pod="openstack/rabbitmq-server-0" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.422397 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-fwkmh" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.466917 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4c72516d-29b4-4932-8a47-8838d686b176-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4c72516d-29b4-4932-8a47-8838d686b176\") " pod="openstack/rabbitmq-server-0" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.467025 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4c72516d-29b4-4932-8a47-8838d686b176-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4c72516d-29b4-4932-8a47-8838d686b176\") " pod="openstack/rabbitmq-server-0" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.467054 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4c72516d-29b4-4932-8a47-8838d686b176-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4c72516d-29b4-4932-8a47-8838d686b176\") " pod="openstack/rabbitmq-server-0" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.467082 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4c72516d-29b4-4932-8a47-8838d686b176-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4c72516d-29b4-4932-8a47-8838d686b176\") " pod="openstack/rabbitmq-server-0" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.467135 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4c72516d-29b4-4932-8a47-8838d686b176-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4c72516d-29b4-4932-8a47-8838d686b176\") " pod="openstack/rabbitmq-server-0" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.467178 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4c72516d-29b4-4932-8a47-8838d686b176-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4c72516d-29b4-4932-8a47-8838d686b176\") " pod="openstack/rabbitmq-server-0" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.467199 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4c72516d-29b4-4932-8a47-8838d686b176-config-data\") pod \"rabbitmq-server-0\" (UID: \"4c72516d-29b4-4932-8a47-8838d686b176\") " pod="openstack/rabbitmq-server-0" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.467224 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4c72516d-29b4-4932-8a47-8838d686b176-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4c72516d-29b4-4932-8a47-8838d686b176\") " pod="openstack/rabbitmq-server-0" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.467255 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4c72516d-29b4-4932-8a47-8838d686b176-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4c72516d-29b4-4932-8a47-8838d686b176\") " pod="openstack/rabbitmq-server-0" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.467294 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"4c72516d-29b4-4932-8a47-8838d686b176\") " pod="openstack/rabbitmq-server-0" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.467323 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hntn7\" (UniqueName: \"kubernetes.io/projected/4c72516d-29b4-4932-8a47-8838d686b176-kube-api-access-hntn7\") pod \"rabbitmq-server-0\" (UID: \"4c72516d-29b4-4932-8a47-8838d686b176\") " pod="openstack/rabbitmq-server-0" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.467648 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"4c72516d-29b4-4932-8a47-8838d686b176\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-server-0" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.467842 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4c72516d-29b4-4932-8a47-8838d686b176-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4c72516d-29b4-4932-8a47-8838d686b176\") " pod="openstack/rabbitmq-server-0" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.468516 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4c72516d-29b4-4932-8a47-8838d686b176-config-data\") pod \"rabbitmq-server-0\" (UID: \"4c72516d-29b4-4932-8a47-8838d686b176\") " pod="openstack/rabbitmq-server-0" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.469323 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4c72516d-29b4-4932-8a47-8838d686b176-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4c72516d-29b4-4932-8a47-8838d686b176\") " pod="openstack/rabbitmq-server-0" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.471665 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4c72516d-29b4-4932-8a47-8838d686b176-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4c72516d-29b4-4932-8a47-8838d686b176\") " pod="openstack/rabbitmq-server-0" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.472979 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4c72516d-29b4-4932-8a47-8838d686b176-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4c72516d-29b4-4932-8a47-8838d686b176\") " pod="openstack/rabbitmq-server-0" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.473076 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4c72516d-29b4-4932-8a47-8838d686b176-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4c72516d-29b4-4932-8a47-8838d686b176\") " pod="openstack/rabbitmq-server-0" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.473287 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4c72516d-29b4-4932-8a47-8838d686b176-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4c72516d-29b4-4932-8a47-8838d686b176\") " pod="openstack/rabbitmq-server-0" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.473437 4781 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4c72516d-29b4-4932-8a47-8838d686b176-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4c72516d-29b4-4932-8a47-8838d686b176\") " pod="openstack/rabbitmq-server-0" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.491439 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hntn7\" (UniqueName: \"kubernetes.io/projected/4c72516d-29b4-4932-8a47-8838d686b176-kube-api-access-hntn7\") pod \"rabbitmq-server-0\" (UID: \"4c72516d-29b4-4932-8a47-8838d686b176\") " pod="openstack/rabbitmq-server-0" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.491814 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4c72516d-29b4-4932-8a47-8838d686b176-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4c72516d-29b4-4932-8a47-8838d686b176\") " pod="openstack/rabbitmq-server-0" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.517427 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"4c72516d-29b4-4932-8a47-8838d686b176\") " pod="openstack/rabbitmq-server-0" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.624468 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.836099 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.874703 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4d040259-d968-45a1-832a-45586a9fe0d1-plugins-conf\") pod \"4d040259-d968-45a1-832a-45586a9fe0d1\" (UID: \"4d040259-d968-45a1-832a-45586a9fe0d1\") " Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.874740 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4d040259-d968-45a1-832a-45586a9fe0d1-pod-info\") pod \"4d040259-d968-45a1-832a-45586a9fe0d1\" (UID: \"4d040259-d968-45a1-832a-45586a9fe0d1\") " Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.874845 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4d040259-d968-45a1-832a-45586a9fe0d1-rabbitmq-tls\") pod \"4d040259-d968-45a1-832a-45586a9fe0d1\" (UID: \"4d040259-d968-45a1-832a-45586a9fe0d1\") " Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.874886 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4d040259-d968-45a1-832a-45586a9fe0d1-rabbitmq-erlang-cookie\") pod \"4d040259-d968-45a1-832a-45586a9fe0d1\" (UID: \"4d040259-d968-45a1-832a-45586a9fe0d1\") " Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.874910 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"4d040259-d968-45a1-832a-45586a9fe0d1\" (UID: \"4d040259-d968-45a1-832a-45586a9fe0d1\") " Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.874968 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/4d040259-d968-45a1-832a-45586a9fe0d1-config-data\") pod \"4d040259-d968-45a1-832a-45586a9fe0d1\" (UID: \"4d040259-d968-45a1-832a-45586a9fe0d1\") " Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.875014 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4d040259-d968-45a1-832a-45586a9fe0d1-server-conf\") pod \"4d040259-d968-45a1-832a-45586a9fe0d1\" (UID: \"4d040259-d968-45a1-832a-45586a9fe0d1\") " Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.875074 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4d040259-d968-45a1-832a-45586a9fe0d1-erlang-cookie-secret\") pod \"4d040259-d968-45a1-832a-45586a9fe0d1\" (UID: \"4d040259-d968-45a1-832a-45586a9fe0d1\") " Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.875111 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4d040259-d968-45a1-832a-45586a9fe0d1-rabbitmq-confd\") pod \"4d040259-d968-45a1-832a-45586a9fe0d1\" (UID: \"4d040259-d968-45a1-832a-45586a9fe0d1\") " Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.875146 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4d040259-d968-45a1-832a-45586a9fe0d1-rabbitmq-plugins\") pod \"4d040259-d968-45a1-832a-45586a9fe0d1\" (UID: \"4d040259-d968-45a1-832a-45586a9fe0d1\") " Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.875252 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ndjp\" (UniqueName: \"kubernetes.io/projected/4d040259-d968-45a1-832a-45586a9fe0d1-kube-api-access-7ndjp\") pod \"4d040259-d968-45a1-832a-45586a9fe0d1\" (UID: \"4d040259-d968-45a1-832a-45586a9fe0d1\") " Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.879153 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d040259-d968-45a1-832a-45586a9fe0d1-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "4d040259-d968-45a1-832a-45586a9fe0d1" (UID: "4d040259-d968-45a1-832a-45586a9fe0d1"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.881123 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d040259-d968-45a1-832a-45586a9fe0d1-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "4d040259-d968-45a1-832a-45586a9fe0d1" (UID: "4d040259-d968-45a1-832a-45586a9fe0d1"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.881320 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "persistence") pod "4d040259-d968-45a1-832a-45586a9fe0d1" (UID: "4d040259-d968-45a1-832a-45586a9fe0d1"). InnerVolumeSpecName "local-storage12-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.881391 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d040259-d968-45a1-832a-45586a9fe0d1-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "4d040259-d968-45a1-832a-45586a9fe0d1" (UID: "4d040259-d968-45a1-832a-45586a9fe0d1"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.881544 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d040259-d968-45a1-832a-45586a9fe0d1-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "4d040259-d968-45a1-832a-45586a9fe0d1" (UID: "4d040259-d968-45a1-832a-45586a9fe0d1"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.882329 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/4d040259-d968-45a1-832a-45586a9fe0d1-pod-info" (OuterVolumeSpecName: "pod-info") pod "4d040259-d968-45a1-832a-45586a9fe0d1" (UID: "4d040259-d968-45a1-832a-45586a9fe0d1"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.883413 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d040259-d968-45a1-832a-45586a9fe0d1-kube-api-access-7ndjp" (OuterVolumeSpecName: "kube-api-access-7ndjp") pod "4d040259-d968-45a1-832a-45586a9fe0d1" (UID: "4d040259-d968-45a1-832a-45586a9fe0d1"). InnerVolumeSpecName "kube-api-access-7ndjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.884419 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d040259-d968-45a1-832a-45586a9fe0d1-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "4d040259-d968-45a1-832a-45586a9fe0d1" (UID: "4d040259-d968-45a1-832a-45586a9fe0d1"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.910054 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d040259-d968-45a1-832a-45586a9fe0d1-config-data" (OuterVolumeSpecName: "config-data") pod "4d040259-d968-45a1-832a-45586a9fe0d1" (UID: "4d040259-d968-45a1-832a-45586a9fe0d1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.967904 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d040259-d968-45a1-832a-45586a9fe0d1-server-conf" (OuterVolumeSpecName: "server-conf") pod "4d040259-d968-45a1-832a-45586a9fe0d1" (UID: "4d040259-d968-45a1-832a-45586a9fe0d1"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.978564 4781 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4d040259-d968-45a1-832a-45586a9fe0d1-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.978600 4781 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4d040259-d968-45a1-832a-45586a9fe0d1-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.979313 4781 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.979325 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4d040259-d968-45a1-832a-45586a9fe0d1-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.979335 4781 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4d040259-d968-45a1-832a-45586a9fe0d1-server-conf\") on node \"crc\" DevicePath \"\"" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.979345 4781 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4d040259-d968-45a1-832a-45586a9fe0d1-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.979357 4781 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4d040259-d968-45a1-832a-45586a9fe0d1-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.979366 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ndjp\" (UniqueName: \"kubernetes.io/projected/4d040259-d968-45a1-832a-45586a9fe0d1-kube-api-access-7ndjp\") on node \"crc\" DevicePath \"\"" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.979374 4781 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4d040259-d968-45a1-832a-45586a9fe0d1-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 02 09:49:14 crc kubenswrapper[4781]: I1202 09:49:14.979382 4781 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4d040259-d968-45a1-832a-45586a9fe0d1-pod-info\") on node \"crc\" DevicePath \"\"" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.003159 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d040259-d968-45a1-832a-45586a9fe0d1-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "4d040259-d968-45a1-832a-45586a9fe0d1" (UID: "4d040259-d968-45a1-832a-45586a9fe0d1"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.004709 4781 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.052819 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-fwkmh"] Dec 02 09:49:15 crc kubenswrapper[4781]: W1202 09:49:15.055780 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod220427cb_65a7_4603_b174_a54fe9bf7ef1.slice/crio-58bb4f64da50c7753d43c40c470ba6f810bd166000954833a21c1c8b3bab1ec9 WatchSource:0}: Error finding container 58bb4f64da50c7753d43c40c470ba6f810bd166000954833a21c1c8b3bab1ec9: Status 404 returned error can't find the container with id 58bb4f64da50c7753d43c40c470ba6f810bd166000954833a21c1c8b3bab1ec9 Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.081082 4781 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4d040259-d968-45a1-832a-45586a9fe0d1-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.081110 4781 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.236770 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.238224 4781 generic.go:334] "Generic (PLEG): container finished" podID="4d040259-d968-45a1-832a-45586a9fe0d1" containerID="29d9e8eaac33a1eec6f376ec46ba525e7cb820604a4b8f6d204d2bf0ffeea97d" exitCode=0 Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.238328 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.238444 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4d040259-d968-45a1-832a-45586a9fe0d1","Type":"ContainerDied","Data":"29d9e8eaac33a1eec6f376ec46ba525e7cb820604a4b8f6d204d2bf0ffeea97d"} Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.238521 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4d040259-d968-45a1-832a-45586a9fe0d1","Type":"ContainerDied","Data":"32e4c1f22b9bb8c4ef6ee1fcd7631f1c7f54de11afbca44eac23fc817341db6d"} Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.238544 4781 scope.go:117] "RemoveContainer" containerID="29d9e8eaac33a1eec6f376ec46ba525e7cb820604a4b8f6d204d2bf0ffeea97d" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.249600 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-fwkmh" event={"ID":"220427cb-65a7-4603-b174-a54fe9bf7ef1","Type":"ContainerStarted","Data":"58bb4f64da50c7753d43c40c470ba6f810bd166000954833a21c1c8b3bab1ec9"} Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.296429 4781 scope.go:117] "RemoveContainer" containerID="b659230fc0458ff9201681c08c775d787b2461d03f85fed77a3b501a5314c78e" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.314859 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.336820 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.366164 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 09:49:15 crc kubenswrapper[4781]: E1202 09:49:15.366793 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d040259-d968-45a1-832a-45586a9fe0d1" containerName="rabbitmq" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.366817 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d040259-d968-45a1-832a-45586a9fe0d1" containerName="rabbitmq" Dec 02 09:49:15 crc kubenswrapper[4781]: E1202 09:49:15.366835 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d040259-d968-45a1-832a-45586a9fe0d1" containerName="setup-container" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.366843 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d040259-d968-45a1-832a-45586a9fe0d1" containerName="setup-container" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.367106 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d040259-d968-45a1-832a-45586a9fe0d1" containerName="rabbitmq" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.368362 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.371010 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.371172 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.371262 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-pk9hs" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.371189 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.372661 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.380084 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.382447 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.391203 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.424202 4781 scope.go:117] "RemoveContainer" containerID="29d9e8eaac33a1eec6f376ec46ba525e7cb820604a4b8f6d204d2bf0ffeea97d" Dec 02 09:49:15 crc kubenswrapper[4781]: E1202 09:49:15.424752 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29d9e8eaac33a1eec6f376ec46ba525e7cb820604a4b8f6d204d2bf0ffeea97d\": container with ID starting with 29d9e8eaac33a1eec6f376ec46ba525e7cb820604a4b8f6d204d2bf0ffeea97d not found: ID does not exist" containerID="29d9e8eaac33a1eec6f376ec46ba525e7cb820604a4b8f6d204d2bf0ffeea97d" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.424804 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29d9e8eaac33a1eec6f376ec46ba525e7cb820604a4b8f6d204d2bf0ffeea97d"} err="failed to get container status \"29d9e8eaac33a1eec6f376ec46ba525e7cb820604a4b8f6d204d2bf0ffeea97d\": rpc error: code = NotFound desc = could not find container \"29d9e8eaac33a1eec6f376ec46ba525e7cb820604a4b8f6d204d2bf0ffeea97d\": container with ID starting with 29d9e8eaac33a1eec6f376ec46ba525e7cb820604a4b8f6d204d2bf0ffeea97d not found: ID does not exist" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.424838 4781 scope.go:117] "RemoveContainer" containerID="b659230fc0458ff9201681c08c775d787b2461d03f85fed77a3b501a5314c78e" Dec 02 09:49:15 crc kubenswrapper[4781]: E1202 09:49:15.425288 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b659230fc0458ff9201681c08c775d787b2461d03f85fed77a3b501a5314c78e\": container with ID starting with b659230fc0458ff9201681c08c775d787b2461d03f85fed77a3b501a5314c78e not found: ID does not exist" containerID="b659230fc0458ff9201681c08c775d787b2461d03f85fed77a3b501a5314c78e" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.425314 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b659230fc0458ff9201681c08c775d787b2461d03f85fed77a3b501a5314c78e"} 
err="failed to get container status \"b659230fc0458ff9201681c08c775d787b2461d03f85fed77a3b501a5314c78e\": rpc error: code = NotFound desc = could not find container \"b659230fc0458ff9201681c08c775d787b2461d03f85fed77a3b501a5314c78e\": container with ID starting with b659230fc0458ff9201681c08c775d787b2461d03f85fed77a3b501a5314c78e not found: ID does not exist" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.490242 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6f7f1b54-7b32-4063-af85-f97785416d26-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f7f1b54-7b32-4063-af85-f97785416d26\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.490299 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6f7f1b54-7b32-4063-af85-f97785416d26-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f7f1b54-7b32-4063-af85-f97785416d26\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.490367 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6f7f1b54-7b32-4063-af85-f97785416d26-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f7f1b54-7b32-4063-af85-f97785416d26\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.490407 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6f7f1b54-7b32-4063-af85-f97785416d26-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f7f1b54-7b32-4063-af85-f97785416d26\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.490470 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6f7f1b54-7b32-4063-af85-f97785416d26-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f7f1b54-7b32-4063-af85-f97785416d26\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.490532 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjd4b\" (UniqueName: \"kubernetes.io/projected/6f7f1b54-7b32-4063-af85-f97785416d26-kube-api-access-mjd4b\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f7f1b54-7b32-4063-af85-f97785416d26\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.490579 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f7f1b54-7b32-4063-af85-f97785416d26\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.490613 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6f7f1b54-7b32-4063-af85-f97785416d26-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f7f1b54-7b32-4063-af85-f97785416d26\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:49:15 crc 
kubenswrapper[4781]: I1202 09:49:15.490686 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6f7f1b54-7b32-4063-af85-f97785416d26-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f7f1b54-7b32-4063-af85-f97785416d26\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.490727 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6f7f1b54-7b32-4063-af85-f97785416d26-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f7f1b54-7b32-4063-af85-f97785416d26\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.490771 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6f7f1b54-7b32-4063-af85-f97785416d26-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f7f1b54-7b32-4063-af85-f97785416d26\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.519308 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d040259-d968-45a1-832a-45586a9fe0d1" path="/var/lib/kubelet/pods/4d040259-d968-45a1-832a-45586a9fe0d1/volumes" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.520387 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ef9f823-7456-4494-85c7-d29fcf35e7b5" path="/var/lib/kubelet/pods/9ef9f823-7456-4494-85c7-d29fcf35e7b5/volumes" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.592284 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6f7f1b54-7b32-4063-af85-f97785416d26-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f7f1b54-7b32-4063-af85-f97785416d26\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.592342 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjd4b\" (UniqueName: \"kubernetes.io/projected/6f7f1b54-7b32-4063-af85-f97785416d26-kube-api-access-mjd4b\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f7f1b54-7b32-4063-af85-f97785416d26\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.592373 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f7f1b54-7b32-4063-af85-f97785416d26\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.592406 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6f7f1b54-7b32-4063-af85-f97785416d26-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f7f1b54-7b32-4063-af85-f97785416d26\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.592448 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6f7f1b54-7b32-4063-af85-f97785416d26-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f7f1b54-7b32-4063-af85-f97785416d26\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.592484 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6f7f1b54-7b32-4063-af85-f97785416d26-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f7f1b54-7b32-4063-af85-f97785416d26\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.592521 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6f7f1b54-7b32-4063-af85-f97785416d26-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f7f1b54-7b32-4063-af85-f97785416d26\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.592581 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6f7f1b54-7b32-4063-af85-f97785416d26-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f7f1b54-7b32-4063-af85-f97785416d26\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.592611 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6f7f1b54-7b32-4063-af85-f97785416d26-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f7f1b54-7b32-4063-af85-f97785416d26\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.592629 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6f7f1b54-7b32-4063-af85-f97785416d26-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f7f1b54-7b32-4063-af85-f97785416d26\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.592656 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6f7f1b54-7b32-4063-af85-f97785416d26-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f7f1b54-7b32-4063-af85-f97785416d26\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.592757 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f7f1b54-7b32-4063-af85-f97785416d26\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.592960 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6f7f1b54-7b32-4063-af85-f97785416d26-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f7f1b54-7b32-4063-af85-f97785416d26\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.593092 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6f7f1b54-7b32-4063-af85-f97785416d26-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f7f1b54-7b32-4063-af85-f97785416d26\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.593855 4781 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6f7f1b54-7b32-4063-af85-f97785416d26-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f7f1b54-7b32-4063-af85-f97785416d26\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.593916 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6f7f1b54-7b32-4063-af85-f97785416d26-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f7f1b54-7b32-4063-af85-f97785416d26\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.594730 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6f7f1b54-7b32-4063-af85-f97785416d26-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f7f1b54-7b32-4063-af85-f97785416d26\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.598503 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6f7f1b54-7b32-4063-af85-f97785416d26-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f7f1b54-7b32-4063-af85-f97785416d26\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.599206 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6f7f1b54-7b32-4063-af85-f97785416d26-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f7f1b54-7b32-4063-af85-f97785416d26\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.602785 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6f7f1b54-7b32-4063-af85-f97785416d26-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f7f1b54-7b32-4063-af85-f97785416d26\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.608471 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6f7f1b54-7b32-4063-af85-f97785416d26-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f7f1b54-7b32-4063-af85-f97785416d26\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.615103 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjd4b\" (UniqueName: \"kubernetes.io/projected/6f7f1b54-7b32-4063-af85-f97785416d26-kube-api-access-mjd4b\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f7f1b54-7b32-4063-af85-f97785416d26\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.632889 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f7f1b54-7b32-4063-af85-f97785416d26\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:49:15 crc kubenswrapper[4781]: I1202 09:49:15.870243 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 02 09:49:16 crc kubenswrapper[4781]: I1202 09:49:16.263751 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4c72516d-29b4-4932-8a47-8838d686b176","Type":"ContainerStarted","Data":"253b87df4cbe60a800ab8188f8ba033597fed8a80cfc66bc9f8f9ae95b171d67"} Dec 02 09:49:16 crc kubenswrapper[4781]: I1202 09:49:16.265773 4781 generic.go:334] "Generic (PLEG): container finished" podID="220427cb-65a7-4603-b174-a54fe9bf7ef1" containerID="27b759716a566c72e2d3aab50a76f8c713b91b274817386d1d3596d6c756bf27" exitCode=0 Dec 02 09:49:16 crc kubenswrapper[4781]: I1202 09:49:16.265804 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-fwkmh" event={"ID":"220427cb-65a7-4603-b174-a54fe9bf7ef1","Type":"ContainerDied","Data":"27b759716a566c72e2d3aab50a76f8c713b91b274817386d1d3596d6c756bf27"} Dec 02 09:49:16 crc kubenswrapper[4781]: I1202 09:49:16.396114 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 09:49:16 crc kubenswrapper[4781]: W1202 09:49:16.402528 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f7f1b54_7b32_4063_af85_f97785416d26.slice/crio-f99fcc2de130facc90924184f38919475d59d39d497aa48bd9505d61ab397cbf WatchSource:0}: Error finding container f99fcc2de130facc90924184f38919475d59d39d497aa48bd9505d61ab397cbf: Status 404 returned error can't find the container with id f99fcc2de130facc90924184f38919475d59d39d497aa48bd9505d61ab397cbf Dec 02 09:49:17 crc kubenswrapper[4781]: I1202 09:49:17.276563 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-fwkmh" event={"ID":"220427cb-65a7-4603-b174-a54fe9bf7ef1","Type":"ContainerStarted","Data":"c9f36c461051646eae90f0b5928d45331fb122f3444f8b65b08d939541b78cd9"} Dec 02 09:49:17 crc kubenswrapper[4781]: I1202 09:49:17.277815 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79bd4cc8c9-fwkmh" Dec 02 09:49:17 crc kubenswrapper[4781]: I1202 09:49:17.277906 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6f7f1b54-7b32-4063-af85-f97785416d26","Type":"ContainerStarted","Data":"f99fcc2de130facc90924184f38919475d59d39d497aa48bd9505d61ab397cbf"} Dec 02 09:49:17 crc kubenswrapper[4781]: I1202 09:49:17.279978 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4c72516d-29b4-4932-8a47-8838d686b176","Type":"ContainerStarted","Data":"e62566f8ba5f5a2d2dfb9fcb9a087ffa140dbf5d9b98cc27756464b4739453e5"} Dec 02 09:49:17 crc kubenswrapper[4781]: I1202 09:49:17.300482 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79bd4cc8c9-fwkmh" podStartSLOduration=3.300466857 podStartE2EDuration="3.300466857s" podCreationTimestamp="2025-12-02 09:49:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:49:17.293631372 +0000 UTC m=+1720.117505251" watchObservedRunningTime="2025-12-02 09:49:17.300466857 +0000 UTC m=+1720.124340736" Dec 02 09:49:17 crc kubenswrapper[4781]: I1202 09:49:17.510793 4781 scope.go:117] "RemoveContainer" containerID="311057b514d104f4f4ba1bad8b86e944a9708a86065bf40bd45c3f422ad024ed" Dec 02 09:49:17 crc kubenswrapper[4781]: E1202 09:49:17.511155 
4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 09:49:18 crc kubenswrapper[4781]: I1202 09:49:18.290827 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6f7f1b54-7b32-4063-af85-f97785416d26","Type":"ContainerStarted","Data":"3dacb59056c74de632cc4473faacba4b83a8b0d044b9d8c6d37564f4ed4ad6ff"} Dec 02 09:49:24 crc kubenswrapper[4781]: I1202 09:49:24.423981 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79bd4cc8c9-fwkmh" Dec 02 09:49:24 crc kubenswrapper[4781]: I1202 09:49:24.490615 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-vhxb5"] Dec 02 09:49:24 crc kubenswrapper[4781]: I1202 09:49:24.490952 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-89c5cd4d5-vhxb5" podUID="27151ccc-51de-40b5-8145-6ca697fcf788" containerName="dnsmasq-dns" containerID="cri-o://66336063b7e5a589db076d665175e4e0e57d3b3e1913275c7823af85b847be70" gracePeriod=10 Dec 02 09:49:24 crc kubenswrapper[4781]: I1202 09:49:24.701916 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55478c4467-x5657"] Dec 02 09:49:24 crc kubenswrapper[4781]: I1202 09:49:24.705065 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55478c4467-x5657" Dec 02 09:49:24 crc kubenswrapper[4781]: I1202 09:49:24.722762 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55478c4467-x5657"] Dec 02 09:49:24 crc kubenswrapper[4781]: I1202 09:49:24.817191 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d69f627-f361-4060-91c8-dc5763a00f16-dns-svc\") pod \"dnsmasq-dns-55478c4467-x5657\" (UID: \"6d69f627-f361-4060-91c8-dc5763a00f16\") " pod="openstack/dnsmasq-dns-55478c4467-x5657" Dec 02 09:49:24 crc kubenswrapper[4781]: I1202 09:49:24.817325 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw7h6\" (UniqueName: \"kubernetes.io/projected/6d69f627-f361-4060-91c8-dc5763a00f16-kube-api-access-kw7h6\") pod \"dnsmasq-dns-55478c4467-x5657\" (UID: \"6d69f627-f361-4060-91c8-dc5763a00f16\") " pod="openstack/dnsmasq-dns-55478c4467-x5657" Dec 02 09:49:24 crc kubenswrapper[4781]: I1202 09:49:24.817380 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6d69f627-f361-4060-91c8-dc5763a00f16-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-x5657\" (UID: \"6d69f627-f361-4060-91c8-dc5763a00f16\") " pod="openstack/dnsmasq-dns-55478c4467-x5657" Dec 02 09:49:24 crc kubenswrapper[4781]: I1202 09:49:24.817412 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d69f627-f361-4060-91c8-dc5763a00f16-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-x5657\" (UID: \"6d69f627-f361-4060-91c8-dc5763a00f16\") " 
pod="openstack/dnsmasq-dns-55478c4467-x5657" Dec 02 09:49:24 crc kubenswrapper[4781]: I1202 09:49:24.817445 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d69f627-f361-4060-91c8-dc5763a00f16-config\") pod \"dnsmasq-dns-55478c4467-x5657\" (UID: \"6d69f627-f361-4060-91c8-dc5763a00f16\") " pod="openstack/dnsmasq-dns-55478c4467-x5657" Dec 02 09:49:24 crc kubenswrapper[4781]: I1202 09:49:24.817540 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6d69f627-f361-4060-91c8-dc5763a00f16-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-x5657\" (UID: \"6d69f627-f361-4060-91c8-dc5763a00f16\") " pod="openstack/dnsmasq-dns-55478c4467-x5657" Dec 02 09:49:24 crc kubenswrapper[4781]: I1202 09:49:24.817557 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d69f627-f361-4060-91c8-dc5763a00f16-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-x5657\" (UID: \"6d69f627-f361-4060-91c8-dc5763a00f16\") " pod="openstack/dnsmasq-dns-55478c4467-x5657" Dec 02 09:49:24 crc kubenswrapper[4781]: I1202 09:49:24.920821 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6d69f627-f361-4060-91c8-dc5763a00f16-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-x5657\" (UID: \"6d69f627-f361-4060-91c8-dc5763a00f16\") " pod="openstack/dnsmasq-dns-55478c4467-x5657" Dec 02 09:49:24 crc kubenswrapper[4781]: I1202 09:49:24.920866 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d69f627-f361-4060-91c8-dc5763a00f16-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-x5657\" (UID: \"6d69f627-f361-4060-91c8-dc5763a00f16\") " pod="openstack/dnsmasq-dns-55478c4467-x5657" Dec 02 09:49:24 crc kubenswrapper[4781]: I1202 09:49:24.920903 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d69f627-f361-4060-91c8-dc5763a00f16-dns-svc\") pod \"dnsmasq-dns-55478c4467-x5657\" (UID: \"6d69f627-f361-4060-91c8-dc5763a00f16\") " pod="openstack/dnsmasq-dns-55478c4467-x5657" Dec 02 09:49:24 crc kubenswrapper[4781]: I1202 09:49:24.920971 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kw7h6\" (UniqueName: \"kubernetes.io/projected/6d69f627-f361-4060-91c8-dc5763a00f16-kube-api-access-kw7h6\") pod \"dnsmasq-dns-55478c4467-x5657\" (UID: \"6d69f627-f361-4060-91c8-dc5763a00f16\") " pod="openstack/dnsmasq-dns-55478c4467-x5657" Dec 02 09:49:24 crc kubenswrapper[4781]: I1202 09:49:24.921040 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6d69f627-f361-4060-91c8-dc5763a00f16-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-x5657\" (UID: \"6d69f627-f361-4060-91c8-dc5763a00f16\") " pod="openstack/dnsmasq-dns-55478c4467-x5657" Dec 02 09:49:24 crc kubenswrapper[4781]: I1202 09:49:24.921067 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d69f627-f361-4060-91c8-dc5763a00f16-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-x5657\" (UID: 
\"6d69f627-f361-4060-91c8-dc5763a00f16\") " pod="openstack/dnsmasq-dns-55478c4467-x5657" Dec 02 09:49:24 crc kubenswrapper[4781]: I1202 09:49:24.921104 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d69f627-f361-4060-91c8-dc5763a00f16-config\") pod \"dnsmasq-dns-55478c4467-x5657\" (UID: \"6d69f627-f361-4060-91c8-dc5763a00f16\") " pod="openstack/dnsmasq-dns-55478c4467-x5657" Dec 02 09:49:24 crc kubenswrapper[4781]: I1202 09:49:24.921983 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6d69f627-f361-4060-91c8-dc5763a00f16-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-x5657\" (UID: \"6d69f627-f361-4060-91c8-dc5763a00f16\") " pod="openstack/dnsmasq-dns-55478c4467-x5657" Dec 02 09:49:24 crc kubenswrapper[4781]: I1202 09:49:24.922086 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6d69f627-f361-4060-91c8-dc5763a00f16-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-x5657\" (UID: \"6d69f627-f361-4060-91c8-dc5763a00f16\") " pod="openstack/dnsmasq-dns-55478c4467-x5657" Dec 02 09:49:24 crc kubenswrapper[4781]: I1202 09:49:24.923746 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d69f627-f361-4060-91c8-dc5763a00f16-dns-svc\") pod \"dnsmasq-dns-55478c4467-x5657\" (UID: \"6d69f627-f361-4060-91c8-dc5763a00f16\") " pod="openstack/dnsmasq-dns-55478c4467-x5657" Dec 02 09:49:24 crc kubenswrapper[4781]: I1202 09:49:24.925853 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d69f627-f361-4060-91c8-dc5763a00f16-config\") pod \"dnsmasq-dns-55478c4467-x5657\" (UID: \"6d69f627-f361-4060-91c8-dc5763a00f16\") " pod="openstack/dnsmasq-dns-55478c4467-x5657" Dec 02 09:49:24 crc kubenswrapper[4781]: I1202 09:49:24.927609 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d69f627-f361-4060-91c8-dc5763a00f16-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-x5657\" (UID: \"6d69f627-f361-4060-91c8-dc5763a00f16\") " pod="openstack/dnsmasq-dns-55478c4467-x5657" Dec 02 09:49:24 crc kubenswrapper[4781]: I1202 09:49:24.927675 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d69f627-f361-4060-91c8-dc5763a00f16-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-x5657\" (UID: \"6d69f627-f361-4060-91c8-dc5763a00f16\") " pod="openstack/dnsmasq-dns-55478c4467-x5657" Dec 02 09:49:24 crc kubenswrapper[4781]: I1202 09:49:24.942791 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kw7h6\" (UniqueName: \"kubernetes.io/projected/6d69f627-f361-4060-91c8-dc5763a00f16-kube-api-access-kw7h6\") pod \"dnsmasq-dns-55478c4467-x5657\" (UID: \"6d69f627-f361-4060-91c8-dc5763a00f16\") " pod="openstack/dnsmasq-dns-55478c4467-x5657" Dec 02 09:49:25 crc kubenswrapper[4781]: I1202 09:49:25.034115 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55478c4467-x5657" Dec 02 09:49:25 crc kubenswrapper[4781]: I1202 09:49:25.141029 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-vhxb5"
Dec 02 09:49:25 crc kubenswrapper[4781]: I1202 09:49:25.328655 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/27151ccc-51de-40b5-8145-6ca697fcf788-dns-swift-storage-0\") pod \"27151ccc-51de-40b5-8145-6ca697fcf788\" (UID: \"27151ccc-51de-40b5-8145-6ca697fcf788\") "
Dec 02 09:49:25 crc kubenswrapper[4781]: I1202 09:49:25.329087 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4wsr\" (UniqueName: \"kubernetes.io/projected/27151ccc-51de-40b5-8145-6ca697fcf788-kube-api-access-r4wsr\") pod \"27151ccc-51de-40b5-8145-6ca697fcf788\" (UID: \"27151ccc-51de-40b5-8145-6ca697fcf788\") "
Dec 02 09:49:25 crc kubenswrapper[4781]: I1202 09:49:25.329165 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27151ccc-51de-40b5-8145-6ca697fcf788-dns-svc\") pod \"27151ccc-51de-40b5-8145-6ca697fcf788\" (UID: \"27151ccc-51de-40b5-8145-6ca697fcf788\") "
Dec 02 09:49:25 crc kubenswrapper[4781]: I1202 09:49:25.329251 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/27151ccc-51de-40b5-8145-6ca697fcf788-ovsdbserver-sb\") pod \"27151ccc-51de-40b5-8145-6ca697fcf788\" (UID: \"27151ccc-51de-40b5-8145-6ca697fcf788\") "
Dec 02 09:49:25 crc kubenswrapper[4781]: I1202 09:49:25.329330 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27151ccc-51de-40b5-8145-6ca697fcf788-config\") pod \"27151ccc-51de-40b5-8145-6ca697fcf788\" (UID: \"27151ccc-51de-40b5-8145-6ca697fcf788\") "
Dec 02 09:49:25 crc kubenswrapper[4781]: I1202 09:49:25.329396 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27151ccc-51de-40b5-8145-6ca697fcf788-ovsdbserver-nb\") pod \"27151ccc-51de-40b5-8145-6ca697fcf788\" (UID: \"27151ccc-51de-40b5-8145-6ca697fcf788\") "
Dec 02 09:49:25 crc kubenswrapper[4781]: I1202 09:49:25.333969 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27151ccc-51de-40b5-8145-6ca697fcf788-kube-api-access-r4wsr" (OuterVolumeSpecName: "kube-api-access-r4wsr") pod "27151ccc-51de-40b5-8145-6ca697fcf788" (UID: "27151ccc-51de-40b5-8145-6ca697fcf788"). InnerVolumeSpecName "kube-api-access-r4wsr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 09:49:25 crc kubenswrapper[4781]: I1202 09:49:25.361981 4781 generic.go:334] "Generic (PLEG): container finished" podID="27151ccc-51de-40b5-8145-6ca697fcf788" containerID="66336063b7e5a589db076d665175e4e0e57d3b3e1913275c7823af85b847be70" exitCode=0
Dec 02 09:49:25 crc kubenswrapper[4781]: I1202 09:49:25.362032 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-vhxb5" event={"ID":"27151ccc-51de-40b5-8145-6ca697fcf788","Type":"ContainerDied","Data":"66336063b7e5a589db076d665175e4e0e57d3b3e1913275c7823af85b847be70"}
Dec 02 09:49:25 crc kubenswrapper[4781]: I1202 09:49:25.362066 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-vhxb5" event={"ID":"27151ccc-51de-40b5-8145-6ca697fcf788","Type":"ContainerDied","Data":"87000a59626b413375965c3170d0fa7556ef16d2fc4fa0aee0f8faab9af53c8e"}
Dec 02 09:49:25 crc kubenswrapper[4781]: I1202 09:49:25.362089 4781 scope.go:117] "RemoveContainer" containerID="66336063b7e5a589db076d665175e4e0e57d3b3e1913275c7823af85b847be70"
Dec 02 09:49:25 crc kubenswrapper[4781]: I1202 09:49:25.362270 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-vhxb5"
Dec 02 09:49:25 crc kubenswrapper[4781]: I1202 09:49:25.385212 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27151ccc-51de-40b5-8145-6ca697fcf788-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "27151ccc-51de-40b5-8145-6ca697fcf788" (UID: "27151ccc-51de-40b5-8145-6ca697fcf788"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 09:49:25 crc kubenswrapper[4781]: I1202 09:49:25.386345 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27151ccc-51de-40b5-8145-6ca697fcf788-config" (OuterVolumeSpecName: "config") pod "27151ccc-51de-40b5-8145-6ca697fcf788" (UID: "27151ccc-51de-40b5-8145-6ca697fcf788"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 09:49:25 crc kubenswrapper[4781]: I1202 09:49:25.390292 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27151ccc-51de-40b5-8145-6ca697fcf788-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "27151ccc-51de-40b5-8145-6ca697fcf788" (UID: "27151ccc-51de-40b5-8145-6ca697fcf788"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 09:49:25 crc kubenswrapper[4781]: I1202 09:49:25.392513 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27151ccc-51de-40b5-8145-6ca697fcf788-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "27151ccc-51de-40b5-8145-6ca697fcf788" (UID: "27151ccc-51de-40b5-8145-6ca697fcf788"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 09:49:25 crc kubenswrapper[4781]: I1202 09:49:25.394947 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27151ccc-51de-40b5-8145-6ca697fcf788-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "27151ccc-51de-40b5-8145-6ca697fcf788" (UID: "27151ccc-51de-40b5-8145-6ca697fcf788"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 09:49:25 crc kubenswrapper[4781]: I1202 09:49:25.405553 4781 scope.go:117] "RemoveContainer" containerID="2af5994edb2020cfc3a4dceb1e1abe78aef8c6985c1a0c038a0a3253ba6aff49"
Dec 02 09:49:25 crc kubenswrapper[4781]: I1202 09:49:25.431473 4781 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/27151ccc-51de-40b5-8145-6ca697fcf788-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Dec 02 09:49:25 crc kubenswrapper[4781]: I1202 09:49:25.431531 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4wsr\" (UniqueName: \"kubernetes.io/projected/27151ccc-51de-40b5-8145-6ca697fcf788-kube-api-access-r4wsr\") on node \"crc\" DevicePath \"\""
Dec 02 09:49:25 crc kubenswrapper[4781]: I1202 09:49:25.431547 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27151ccc-51de-40b5-8145-6ca697fcf788-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 02 09:49:25 crc kubenswrapper[4781]: I1202 09:49:25.431559 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/27151ccc-51de-40b5-8145-6ca697fcf788-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 02 09:49:25 crc kubenswrapper[4781]: I1202 09:49:25.431571 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27151ccc-51de-40b5-8145-6ca697fcf788-config\") on node \"crc\" DevicePath \"\""
Dec 02 09:49:25 crc kubenswrapper[4781]: I1202 09:49:25.431582 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27151ccc-51de-40b5-8145-6ca697fcf788-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 02 09:49:25 crc kubenswrapper[4781]: I1202 09:49:25.444325 4781 scope.go:117] "RemoveContainer" containerID="66336063b7e5a589db076d665175e4e0e57d3b3e1913275c7823af85b847be70"
Dec 02 09:49:25 crc kubenswrapper[4781]: E1202 09:49:25.444946 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66336063b7e5a589db076d665175e4e0e57d3b3e1913275c7823af85b847be70\": container with ID starting with 66336063b7e5a589db076d665175e4e0e57d3b3e1913275c7823af85b847be70 not found: ID does not exist" containerID="66336063b7e5a589db076d665175e4e0e57d3b3e1913275c7823af85b847be70"
Dec 02 09:49:25 crc kubenswrapper[4781]: I1202 09:49:25.444989 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66336063b7e5a589db076d665175e4e0e57d3b3e1913275c7823af85b847be70"} err="failed to get container status \"66336063b7e5a589db076d665175e4e0e57d3b3e1913275c7823af85b847be70\": rpc error: code = NotFound desc = could not find container \"66336063b7e5a589db076d665175e4e0e57d3b3e1913275c7823af85b847be70\": container with ID starting with 66336063b7e5a589db076d665175e4e0e57d3b3e1913275c7823af85b847be70 not found: ID does not exist"
Dec 02 09:49:25 crc kubenswrapper[4781]: I1202 09:49:25.445017 4781 scope.go:117] "RemoveContainer" containerID="2af5994edb2020cfc3a4dceb1e1abe78aef8c6985c1a0c038a0a3253ba6aff49"
Dec 02 09:49:25 crc kubenswrapper[4781]: E1202 09:49:25.445555 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2af5994edb2020cfc3a4dceb1e1abe78aef8c6985c1a0c038a0a3253ba6aff49\": container with ID starting with 2af5994edb2020cfc3a4dceb1e1abe78aef8c6985c1a0c038a0a3253ba6aff49 not found: ID does not exist" containerID="2af5994edb2020cfc3a4dceb1e1abe78aef8c6985c1a0c038a0a3253ba6aff49"
Dec 02 09:49:25 crc kubenswrapper[4781]: I1202 09:49:25.445597 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2af5994edb2020cfc3a4dceb1e1abe78aef8c6985c1a0c038a0a3253ba6aff49"} err="failed to get container status \"2af5994edb2020cfc3a4dceb1e1abe78aef8c6985c1a0c038a0a3253ba6aff49\": rpc error: code = NotFound desc = could not find container \"2af5994edb2020cfc3a4dceb1e1abe78aef8c6985c1a0c038a0a3253ba6aff49\": container with ID starting with 2af5994edb2020cfc3a4dceb1e1abe78aef8c6985c1a0c038a0a3253ba6aff49 not found: ID does not exist"
Dec 02 09:49:25 crc kubenswrapper[4781]: I1202 09:49:25.539051 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55478c4467-x5657"]
Dec 02 09:49:25 crc kubenswrapper[4781]: I1202 09:49:25.780279 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-vhxb5"]
Dec 02 09:49:25 crc kubenswrapper[4781]: I1202 09:49:25.792248 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-vhxb5"]
Dec 02 09:49:26 crc kubenswrapper[4781]: I1202 09:49:26.373226 4781 generic.go:334] "Generic (PLEG): container finished" podID="6d69f627-f361-4060-91c8-dc5763a00f16" containerID="a76b0122755629c24770f79a6726d345a9aa95677071fd2a665f17da2db1a083" exitCode=0
Dec 02 09:49:26 crc kubenswrapper[4781]: I1202 09:49:26.373280 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-x5657" event={"ID":"6d69f627-f361-4060-91c8-dc5763a00f16","Type":"ContainerDied","Data":"a76b0122755629c24770f79a6726d345a9aa95677071fd2a665f17da2db1a083"}
Dec 02 09:49:26 crc kubenswrapper[4781]: I1202 09:49:26.373501 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-x5657" event={"ID":"6d69f627-f361-4060-91c8-dc5763a00f16","Type":"ContainerStarted","Data":"7e03f38df2db0556f269c71a3c7b4259f69e17310d5c5403cc0a88fb70bd94d7"}
Dec 02 09:49:27 crc kubenswrapper[4781]: I1202 09:49:27.386496 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-x5657" event={"ID":"6d69f627-f361-4060-91c8-dc5763a00f16","Type":"ContainerStarted","Data":"0bccc90e76bb3805fc052a409e8b52591d168b9f9e9889af2792b6f9c1ca0235"}
Dec 02 09:49:27 crc kubenswrapper[4781]: I1202 09:49:27.387827 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55478c4467-x5657"
Dec 02 09:49:27 crc kubenswrapper[4781]: I1202 09:49:27.416242 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55478c4467-x5657" podStartSLOduration=3.4162208339999998 podStartE2EDuration="3.416220834s" podCreationTimestamp="2025-12-02 09:49:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:49:27.408243419 +0000 UTC m=+1730.232117298" watchObservedRunningTime="2025-12-02 09:49:27.416220834 +0000 UTC m=+1730.240094713"
Dec 02 09:49:27 crc kubenswrapper[4781]: I1202 09:49:27.514633 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27151ccc-51de-40b5-8145-6ca697fcf788" path="/var/lib/kubelet/pods/27151ccc-51de-40b5-8145-6ca697fcf788/volumes"
Dec 02 09:49:31 crc kubenswrapper[4781]: I1202 09:49:31.500767 4781 scope.go:117] "RemoveContainer" containerID="311057b514d104f4f4ba1bad8b86e944a9708a86065bf40bd45c3f422ad024ed"
Dec 02 09:49:31 crc kubenswrapper[4781]: E1202 09:49:31.501602 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d"
Dec 02 09:49:35 crc kubenswrapper[4781]: I1202 09:49:35.036176 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55478c4467-x5657"
Dec 02 09:49:35 crc kubenswrapper[4781]: I1202 09:49:35.091705 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-fwkmh"]
Dec 02 09:49:35 crc kubenswrapper[4781]: I1202 09:49:35.093049 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79bd4cc8c9-fwkmh" podUID="220427cb-65a7-4603-b174-a54fe9bf7ef1" containerName="dnsmasq-dns" containerID="cri-o://c9f36c461051646eae90f0b5928d45331fb122f3444f8b65b08d939541b78cd9" gracePeriod=10
Dec 02 09:49:35 crc kubenswrapper[4781]: I1202 09:49:35.489598 4781 generic.go:334] "Generic (PLEG): container finished" podID="220427cb-65a7-4603-b174-a54fe9bf7ef1" containerID="c9f36c461051646eae90f0b5928d45331fb122f3444f8b65b08d939541b78cd9" exitCode=0
Dec 02 09:49:35 crc kubenswrapper[4781]: I1202 09:49:35.489697 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-fwkmh" event={"ID":"220427cb-65a7-4603-b174-a54fe9bf7ef1","Type":"ContainerDied","Data":"c9f36c461051646eae90f0b5928d45331fb122f3444f8b65b08d939541b78cd9"}
Dec 02 09:49:35 crc kubenswrapper[4781]: I1202 09:49:35.636502 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-fwkmh"
Dec 02 09:49:35 crc kubenswrapper[4781]: I1202 09:49:35.743431 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/220427cb-65a7-4603-b174-a54fe9bf7ef1-ovsdbserver-sb\") pod \"220427cb-65a7-4603-b174-a54fe9bf7ef1\" (UID: \"220427cb-65a7-4603-b174-a54fe9bf7ef1\") "
Dec 02 09:49:35 crc kubenswrapper[4781]: I1202 09:49:35.743528 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/220427cb-65a7-4603-b174-a54fe9bf7ef1-ovsdbserver-nb\") pod \"220427cb-65a7-4603-b174-a54fe9bf7ef1\" (UID: \"220427cb-65a7-4603-b174-a54fe9bf7ef1\") "
Dec 02 09:49:35 crc kubenswrapper[4781]: I1202 09:49:35.743567 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/220427cb-65a7-4603-b174-a54fe9bf7ef1-openstack-edpm-ipam\") pod \"220427cb-65a7-4603-b174-a54fe9bf7ef1\" (UID: \"220427cb-65a7-4603-b174-a54fe9bf7ef1\") "
Dec 02 09:49:35 crc kubenswrapper[4781]: I1202 09:49:35.743718 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/220427cb-65a7-4603-b174-a54fe9bf7ef1-config\") pod \"220427cb-65a7-4603-b174-a54fe9bf7ef1\" (UID: \"220427cb-65a7-4603-b174-a54fe9bf7ef1\") "
Dec 02 09:49:35 crc kubenswrapper[4781]: I1202 09:49:35.743749 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/220427cb-65a7-4603-b174-a54fe9bf7ef1-dns-svc\") pod \"220427cb-65a7-4603-b174-a54fe9bf7ef1\" (UID: \"220427cb-65a7-4603-b174-a54fe9bf7ef1\") "
Dec 02 09:49:35 crc kubenswrapper[4781]: I1202 09:49:35.743771 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/220427cb-65a7-4603-b174-a54fe9bf7ef1-dns-swift-storage-0\") pod \"220427cb-65a7-4603-b174-a54fe9bf7ef1\" (UID: \"220427cb-65a7-4603-b174-a54fe9bf7ef1\") "
Dec 02 09:49:35 crc kubenswrapper[4781]: I1202 09:49:35.743860 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2fbq\" (UniqueName: \"kubernetes.io/projected/220427cb-65a7-4603-b174-a54fe9bf7ef1-kube-api-access-r2fbq\") pod \"220427cb-65a7-4603-b174-a54fe9bf7ef1\" (UID: \"220427cb-65a7-4603-b174-a54fe9bf7ef1\") "
Dec 02 09:49:35 crc kubenswrapper[4781]: I1202 09:49:35.752519 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/220427cb-65a7-4603-b174-a54fe9bf7ef1-kube-api-access-r2fbq" (OuterVolumeSpecName: "kube-api-access-r2fbq") pod "220427cb-65a7-4603-b174-a54fe9bf7ef1" (UID: "220427cb-65a7-4603-b174-a54fe9bf7ef1"). InnerVolumeSpecName "kube-api-access-r2fbq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 09:49:35 crc kubenswrapper[4781]: I1202 09:49:35.796363 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/220427cb-65a7-4603-b174-a54fe9bf7ef1-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "220427cb-65a7-4603-b174-a54fe9bf7ef1" (UID: "220427cb-65a7-4603-b174-a54fe9bf7ef1"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 09:49:35 crc kubenswrapper[4781]: I1202 09:49:35.797846 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/220427cb-65a7-4603-b174-a54fe9bf7ef1-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "220427cb-65a7-4603-b174-a54fe9bf7ef1" (UID: "220427cb-65a7-4603-b174-a54fe9bf7ef1"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 09:49:35 crc kubenswrapper[4781]: I1202 09:49:35.798020 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/220427cb-65a7-4603-b174-a54fe9bf7ef1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "220427cb-65a7-4603-b174-a54fe9bf7ef1" (UID: "220427cb-65a7-4603-b174-a54fe9bf7ef1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 09:49:35 crc kubenswrapper[4781]: I1202 09:49:35.799595 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/220427cb-65a7-4603-b174-a54fe9bf7ef1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "220427cb-65a7-4603-b174-a54fe9bf7ef1" (UID: "220427cb-65a7-4603-b174-a54fe9bf7ef1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 09:49:35 crc kubenswrapper[4781]: I1202 09:49:35.803426 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/220427cb-65a7-4603-b174-a54fe9bf7ef1-config" (OuterVolumeSpecName: "config") pod "220427cb-65a7-4603-b174-a54fe9bf7ef1" (UID: "220427cb-65a7-4603-b174-a54fe9bf7ef1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 09:49:35 crc kubenswrapper[4781]: I1202 09:49:35.805532 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/220427cb-65a7-4603-b174-a54fe9bf7ef1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "220427cb-65a7-4603-b174-a54fe9bf7ef1" (UID: "220427cb-65a7-4603-b174-a54fe9bf7ef1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 09:49:35 crc kubenswrapper[4781]: I1202 09:49:35.846040 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2fbq\" (UniqueName: \"kubernetes.io/projected/220427cb-65a7-4603-b174-a54fe9bf7ef1-kube-api-access-r2fbq\") on node \"crc\" DevicePath \"\""
Dec 02 09:49:35 crc kubenswrapper[4781]: I1202 09:49:35.846082 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/220427cb-65a7-4603-b174-a54fe9bf7ef1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 02 09:49:35 crc kubenswrapper[4781]: I1202 09:49:35.846094 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/220427cb-65a7-4603-b174-a54fe9bf7ef1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 02 09:49:35 crc kubenswrapper[4781]: I1202 09:49:35.846103 4781 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/220427cb-65a7-4603-b174-a54fe9bf7ef1-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Dec 02 09:49:35 crc kubenswrapper[4781]: I1202 09:49:35.846112 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/220427cb-65a7-4603-b174-a54fe9bf7ef1-config\") on node \"crc\" DevicePath \"\""
Dec 02 09:49:35 crc kubenswrapper[4781]: I1202 09:49:35.846121 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/220427cb-65a7-4603-b174-a54fe9bf7ef1-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 02 09:49:35 crc kubenswrapper[4781]: I1202 09:49:35.846129 4781 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/220427cb-65a7-4603-b174-a54fe9bf7ef1-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Dec 02 09:49:36 crc kubenswrapper[4781]: I1202 09:49:36.502547 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-fwkmh" event={"ID":"220427cb-65a7-4603-b174-a54fe9bf7ef1","Type":"ContainerDied","Data":"58bb4f64da50c7753d43c40c470ba6f810bd166000954833a21c1c8b3bab1ec9"}
Dec 02 09:49:36 crc kubenswrapper[4781]: I1202 09:49:36.502601 4781 scope.go:117] "RemoveContainer" containerID="c9f36c461051646eae90f0b5928d45331fb122f3444f8b65b08d939541b78cd9"
Dec 02 09:49:36 crc kubenswrapper[4781]: I1202 09:49:36.502631 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-fwkmh"
Dec 02 09:49:36 crc kubenswrapper[4781]: I1202 09:49:36.549091 4781 scope.go:117] "RemoveContainer" containerID="27b759716a566c72e2d3aab50a76f8c713b91b274817386d1d3596d6c756bf27"
Dec 02 09:49:36 crc kubenswrapper[4781]: I1202 09:49:36.553328 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-fwkmh"]
Dec 02 09:49:36 crc kubenswrapper[4781]: I1202 09:49:36.561807 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-fwkmh"]
Dec 02 09:49:37 crc kubenswrapper[4781]: I1202 09:49:37.521291 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="220427cb-65a7-4603-b174-a54fe9bf7ef1" path="/var/lib/kubelet/pods/220427cb-65a7-4603-b174-a54fe9bf7ef1/volumes"
Dec 02 09:49:42 crc kubenswrapper[4781]: I1202 09:49:42.281976 4781 scope.go:117] "RemoveContainer" containerID="210fc2f90a1c188d5fb9db839b1dec0677c6719e34d607855d194414190ddb9c"
Dec 02 09:49:44 crc kubenswrapper[4781]: I1202 09:49:44.500537 4781 scope.go:117] "RemoveContainer" containerID="311057b514d104f4f4ba1bad8b86e944a9708a86065bf40bd45c3f422ad024ed"
Dec 02 09:49:44 crc kubenswrapper[4781]: E1202 09:49:44.501180 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d"
Dec 02 09:49:48 crc kubenswrapper[4781]: I1202 09:49:48.113608 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wbhzk"]
Dec 02 09:49:48 crc kubenswrapper[4781]: E1202 09:49:48.114321 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27151ccc-51de-40b5-8145-6ca697fcf788" containerName="init"
Dec 02 09:49:48 crc kubenswrapper[4781]: I1202 09:49:48.114334 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="27151ccc-51de-40b5-8145-6ca697fcf788" containerName="init"
Dec 02 09:49:48 crc kubenswrapper[4781]: E1202 09:49:48.114351 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="220427cb-65a7-4603-b174-a54fe9bf7ef1" containerName="init"
Dec 02 09:49:48 crc kubenswrapper[4781]: I1202 09:49:48.114357 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="220427cb-65a7-4603-b174-a54fe9bf7ef1" containerName="init"
Dec 02 09:49:48 crc kubenswrapper[4781]: E1202 09:49:48.114394 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="220427cb-65a7-4603-b174-a54fe9bf7ef1" containerName="dnsmasq-dns"
Dec 02 09:49:48 crc kubenswrapper[4781]: I1202 09:49:48.114402 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="220427cb-65a7-4603-b174-a54fe9bf7ef1" containerName="dnsmasq-dns"
Dec 02 09:49:48 crc kubenswrapper[4781]: E1202 09:49:48.114415 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27151ccc-51de-40b5-8145-6ca697fcf788" containerName="dnsmasq-dns"
Dec 02 09:49:48 crc kubenswrapper[4781]: I1202 09:49:48.114421 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="27151ccc-51de-40b5-8145-6ca697fcf788" containerName="dnsmasq-dns"
Dec 02 09:49:48 crc kubenswrapper[4781]: I1202 09:49:48.114608 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="27151ccc-51de-40b5-8145-6ca697fcf788" containerName="dnsmasq-dns"
Dec 02 09:49:48 crc kubenswrapper[4781]: I1202 09:49:48.114625 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="220427cb-65a7-4603-b174-a54fe9bf7ef1" containerName="dnsmasq-dns"
Dec 02 09:49:48 crc kubenswrapper[4781]: I1202 09:49:48.115352 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wbhzk"
Dec 02 09:49:48 crc kubenswrapper[4781]: I1202 09:49:48.117159 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 02 09:49:48 crc kubenswrapper[4781]: I1202 09:49:48.117479 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 02 09:49:48 crc kubenswrapper[4781]: I1202 09:49:48.117658 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 02 09:49:48 crc kubenswrapper[4781]: I1202 09:49:48.118250 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zl624"
Dec 02 09:49:48 crc kubenswrapper[4781]: I1202 09:49:48.124429 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wbhzk"]
Dec 02 09:49:48 crc kubenswrapper[4781]: I1202 09:49:48.171461 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03ba130c-2463-4317-a907-e13c23657ae9-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wbhzk\" (UID: \"03ba130c-2463-4317-a907-e13c23657ae9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wbhzk"
Dec 02 09:49:48 crc kubenswrapper[4781]: I1202 09:49:48.171533 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/03ba130c-2463-4317-a907-e13c23657ae9-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wbhzk\" (UID: \"03ba130c-2463-4317-a907-e13c23657ae9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wbhzk"
Dec 02 09:49:48 crc kubenswrapper[4781]: I1202 09:49:48.171564 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/03ba130c-2463-4317-a907-e13c23657ae9-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wbhzk\" (UID: \"03ba130c-2463-4317-a907-e13c23657ae9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wbhzk"
Dec 02 09:49:48 crc kubenswrapper[4781]: I1202 09:49:48.171744 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4scc\" (UniqueName: \"kubernetes.io/projected/03ba130c-2463-4317-a907-e13c23657ae9-kube-api-access-v4scc\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wbhzk\" (UID: \"03ba130c-2463-4317-a907-e13c23657ae9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wbhzk"
Dec 02 09:49:48 crc kubenswrapper[4781]: I1202 09:49:48.273012 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03ba130c-2463-4317-a907-e13c23657ae9-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wbhzk\" (UID: \"03ba130c-2463-4317-a907-e13c23657ae9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wbhzk"
Dec 02 09:49:48 crc kubenswrapper[4781]: I1202 09:49:48.273076 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/03ba130c-2463-4317-a907-e13c23657ae9-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wbhzk\" (UID: \"03ba130c-2463-4317-a907-e13c23657ae9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wbhzk"
Dec 02 09:49:48 crc kubenswrapper[4781]: I1202 09:49:48.273095 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/03ba130c-2463-4317-a907-e13c23657ae9-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wbhzk\" (UID: \"03ba130c-2463-4317-a907-e13c23657ae9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wbhzk"
Dec 02 09:49:48 crc kubenswrapper[4781]: I1202 09:49:48.273173 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4scc\" (UniqueName: \"kubernetes.io/projected/03ba130c-2463-4317-a907-e13c23657ae9-kube-api-access-v4scc\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wbhzk\" (UID: \"03ba130c-2463-4317-a907-e13c23657ae9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wbhzk"
Dec 02 09:49:48 crc kubenswrapper[4781]: I1202 09:49:48.278472 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/03ba130c-2463-4317-a907-e13c23657ae9-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wbhzk\" (UID: \"03ba130c-2463-4317-a907-e13c23657ae9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wbhzk"
Dec 02 09:49:48 crc kubenswrapper[4781]: I1202 09:49:48.278836 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/03ba130c-2463-4317-a907-e13c23657ae9-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wbhzk\" (UID: \"03ba130c-2463-4317-a907-e13c23657ae9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wbhzk"
Dec 02 09:49:48 crc kubenswrapper[4781]: I1202 09:49:48.279169 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03ba130c-2463-4317-a907-e13c23657ae9-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wbhzk\" (UID: \"03ba130c-2463-4317-a907-e13c23657ae9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wbhzk"
Dec 02 09:49:48 crc kubenswrapper[4781]: I1202 09:49:48.289074 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4scc\" (UniqueName: \"kubernetes.io/projected/03ba130c-2463-4317-a907-e13c23657ae9-kube-api-access-v4scc\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wbhzk\" (UID: \"03ba130c-2463-4317-a907-e13c23657ae9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wbhzk"
Dec 02 09:49:48 crc kubenswrapper[4781]: I1202 09:49:48.441674 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wbhzk"
Dec 02 09:49:48 crc kubenswrapper[4781]: I1202 09:49:48.943180 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wbhzk"]
Dec 02 09:49:48 crc kubenswrapper[4781]: I1202 09:49:48.948609 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 02 09:49:49 crc kubenswrapper[4781]: I1202 09:49:49.624333 4781 generic.go:334] "Generic (PLEG): container finished" podID="4c72516d-29b4-4932-8a47-8838d686b176" containerID="e62566f8ba5f5a2d2dfb9fcb9a087ffa140dbf5d9b98cc27756464b4739453e5" exitCode=0
Dec 02 09:49:49 crc kubenswrapper[4781]: I1202 09:49:49.624644 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4c72516d-29b4-4932-8a47-8838d686b176","Type":"ContainerDied","Data":"e62566f8ba5f5a2d2dfb9fcb9a087ffa140dbf5d9b98cc27756464b4739453e5"}
Dec 02 09:49:49 crc kubenswrapper[4781]: I1202 09:49:49.632877 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wbhzk" event={"ID":"03ba130c-2463-4317-a907-e13c23657ae9","Type":"ContainerStarted","Data":"f4dbcbb7e77eab04861aba72d49c1bfd17b29bbbe2e1427dbb6903d2f19e2602"}
Dec 02 09:49:50 crc kubenswrapper[4781]: I1202 09:49:50.643485 4781 generic.go:334] "Generic (PLEG): container finished" podID="6f7f1b54-7b32-4063-af85-f97785416d26" containerID="3dacb59056c74de632cc4473faacba4b83a8b0d044b9d8c6d37564f4ed4ad6ff" exitCode=0
Dec 02 09:49:50 crc kubenswrapper[4781]: I1202 09:49:50.643686 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6f7f1b54-7b32-4063-af85-f97785416d26","Type":"ContainerDied","Data":"3dacb59056c74de632cc4473faacba4b83a8b0d044b9d8c6d37564f4ed4ad6ff"}
Dec 02 09:49:50 crc kubenswrapper[4781]: I1202 09:49:50.646427 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4c72516d-29b4-4932-8a47-8838d686b176","Type":"ContainerStarted","Data":"f21aa2f71159f84575cb5a5dd530960540ac7b764b9ad6f0feba1abe3f47cdc4"}
Dec 02 09:49:50 crc kubenswrapper[4781]: I1202 09:49:50.646670 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Dec 02 09:49:50 crc kubenswrapper[4781]: I1202 09:49:50.709368 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.709344588 podStartE2EDuration="36.709344588s" podCreationTimestamp="2025-12-02 09:49:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:49:50.6993788 +0000 UTC m=+1753.523252689" watchObservedRunningTime="2025-12-02 09:49:50.709344588 +0000 UTC m=+1753.533218467"
Dec 02 09:49:51 crc kubenswrapper[4781]: I1202 09:49:51.665188 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6f7f1b54-7b32-4063-af85-f97785416d26","Type":"ContainerStarted","Data":"95ff44600eddf2db879cd78f051d0e0864004894259e24e78d3f3fac23dbfba0"}
Dec 02 09:49:51 crc kubenswrapper[4781]: I1202 09:49:51.666085 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Dec 02 09:49:51 crc kubenswrapper[4781]: I1202 09:49:51.706790 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.706696936 podStartE2EDuration="36.706696936s" podCreationTimestamp="2025-12-02 09:49:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:49:51.690806828 +0000 UTC m=+1754.514680707" watchObservedRunningTime="2025-12-02 09:49:51.706696936 +0000 UTC m=+1754.530570815"
Dec 02 09:49:57 crc kubenswrapper[4781]: I1202 09:49:57.508862 4781 scope.go:117] "RemoveContainer" containerID="311057b514d104f4f4ba1bad8b86e944a9708a86065bf40bd45c3f422ad024ed"
Dec 02 09:49:57 crc kubenswrapper[4781]: E1202 09:49:57.509680 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d"
Dec 02 09:49:59 crc kubenswrapper[4781]: I1202 09:49:59.751952 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wbhzk" event={"ID":"03ba130c-2463-4317-a907-e13c23657ae9","Type":"ContainerStarted","Data":"2d3755fcbdb8112de23506a48bce93aa3bf14c7585f7b91613d23cfd6cabd166"}
Dec 02 09:49:59 crc kubenswrapper[4781]: I1202 09:49:59.770833 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wbhzk" podStartSLOduration=1.999692136 podStartE2EDuration="11.770814043s" podCreationTimestamp="2025-12-02 09:49:48 +0000 UTC" firstStartedPulling="2025-12-02 09:49:48.948344472 +0000 UTC m=+1751.772218351" lastFinishedPulling="2025-12-02 09:49:58.719466379 +0000 UTC m=+1761.543340258" observedRunningTime="2025-12-02 09:49:59.768014858 +0000 UTC m=+1762.591888747" watchObservedRunningTime="2025-12-02 09:49:59.770814043 +0000 UTC m=+1762.594687922"
Dec 02 09:50:04 crc kubenswrapper[4781]: I1202 09:50:04.629048 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Dec 02 09:50:05 crc kubenswrapper[4781]: I1202 09:50:05.875250 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Dec 02 09:50:08 crc kubenswrapper[4781]: I1202 09:50:08.499778 4781 scope.go:117] "RemoveContainer" containerID="311057b514d104f4f4ba1bad8b86e944a9708a86065bf40bd45c3f422ad024ed"
Dec 02 09:50:08 crc kubenswrapper[4781]: E1202 09:50:08.500434 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d"
Dec 02 09:50:10 crc kubenswrapper[4781]: I1202 09:50:10.867071 4781 generic.go:334] "Generic (PLEG): container finished" podID="03ba130c-2463-4317-a907-e13c23657ae9" containerID="2d3755fcbdb8112de23506a48bce93aa3bf14c7585f7b91613d23cfd6cabd166" exitCode=0
Dec 02 09:50:10 crc kubenswrapper[4781]: I1202 09:50:10.867196 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wbhzk" event={"ID":"03ba130c-2463-4317-a907-e13c23657ae9","Type":"ContainerDied","Data":"2d3755fcbdb8112de23506a48bce93aa3bf14c7585f7b91613d23cfd6cabd166"}
Dec 02 09:50:12 crc kubenswrapper[4781]: I1202 09:50:12.280232 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wbhzk"
Dec 02 09:50:12 crc kubenswrapper[4781]: I1202 09:50:12.362426 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03ba130c-2463-4317-a907-e13c23657ae9-repo-setup-combined-ca-bundle\") pod \"03ba130c-2463-4317-a907-e13c23657ae9\" (UID: \"03ba130c-2463-4317-a907-e13c23657ae9\") "
Dec 02 09:50:12 crc kubenswrapper[4781]: I1202 09:50:12.362513 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/03ba130c-2463-4317-a907-e13c23657ae9-ssh-key\") pod \"03ba130c-2463-4317-a907-e13c23657ae9\" (UID: \"03ba130c-2463-4317-a907-e13c23657ae9\") "
Dec 02 09:50:12 crc kubenswrapper[4781]: I1202 09:50:12.362536 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/03ba130c-2463-4317-a907-e13c23657ae9-inventory\") pod \"03ba130c-2463-4317-a907-e13c23657ae9\" (UID: \"03ba130c-2463-4317-a907-e13c23657ae9\") "
Dec 02 09:50:12 crc kubenswrapper[4781]: I1202 09:50:12.362558 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4scc\" (UniqueName: \"kubernetes.io/projected/03ba130c-2463-4317-a907-e13c23657ae9-kube-api-access-v4scc\") pod \"03ba130c-2463-4317-a907-e13c23657ae9\" (UID: \"03ba130c-2463-4317-a907-e13c23657ae9\") "
Dec 02 09:50:12 crc kubenswrapper[4781]: I1202 09:50:12.375939 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03ba130c-2463-4317-a907-e13c23657ae9-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "03ba130c-2463-4317-a907-e13c23657ae9" (UID: "03ba130c-2463-4317-a907-e13c23657ae9"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 09:50:12 crc kubenswrapper[4781]: I1202 09:50:12.389472 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03ba130c-2463-4317-a907-e13c23657ae9-kube-api-access-v4scc" (OuterVolumeSpecName: "kube-api-access-v4scc") pod "03ba130c-2463-4317-a907-e13c23657ae9" (UID: "03ba130c-2463-4317-a907-e13c23657ae9"). InnerVolumeSpecName "kube-api-access-v4scc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 09:50:12 crc kubenswrapper[4781]: E1202 09:50:12.407761 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03ba130c-2463-4317-a907-e13c23657ae9-ssh-key podName:03ba130c-2463-4317-a907-e13c23657ae9 nodeName:}" failed. No retries permitted until 2025-12-02 09:50:12.9077256 +0000 UTC m=+1775.731599479 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "ssh-key" (UniqueName: "kubernetes.io/secret/03ba130c-2463-4317-a907-e13c23657ae9-ssh-key") pod "03ba130c-2463-4317-a907-e13c23657ae9" (UID: "03ba130c-2463-4317-a907-e13c23657ae9") : error deleting /var/lib/kubelet/pods/03ba130c-2463-4317-a907-e13c23657ae9/volume-subpaths: remove /var/lib/kubelet/pods/03ba130c-2463-4317-a907-e13c23657ae9/volume-subpaths: no such file or directory
Dec 02 09:50:12 crc kubenswrapper[4781]: I1202 09:50:12.416309 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03ba130c-2463-4317-a907-e13c23657ae9-inventory" (OuterVolumeSpecName: "inventory") pod "03ba130c-2463-4317-a907-e13c23657ae9" (UID: "03ba130c-2463-4317-a907-e13c23657ae9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 09:50:12 crc kubenswrapper[4781]: I1202 09:50:12.465732 4781 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03ba130c-2463-4317-a907-e13c23657ae9-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 09:50:12 crc kubenswrapper[4781]: I1202 09:50:12.465781 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/03ba130c-2463-4317-a907-e13c23657ae9-inventory\") on node \"crc\" DevicePath \"\""
Dec 02 09:50:12 crc kubenswrapper[4781]: I1202 09:50:12.465797 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4scc\" (UniqueName: \"kubernetes.io/projected/03ba130c-2463-4317-a907-e13c23657ae9-kube-api-access-v4scc\") on node \"crc\" DevicePath \"\""
Dec 02 09:50:12 crc kubenswrapper[4781]: I1202 09:50:12.887002 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wbhzk" event={"ID":"03ba130c-2463-4317-a907-e13c23657ae9","Type":"ContainerDied","Data":"f4dbcbb7e77eab04861aba72d49c1bfd17b29bbbe2e1427dbb6903d2f19e2602"}
Dec 02 09:50:12 crc kubenswrapper[4781]: I1202 09:50:12.887516 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4dbcbb7e77eab04861aba72d49c1bfd17b29bbbe2e1427dbb6903d2f19e2602"
Dec 02 09:50:12 crc kubenswrapper[4781]: I1202 09:50:12.887202 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wbhzk"
Dec 02 09:50:12 crc kubenswrapper[4781]: I1202 09:50:12.966582 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-5lb5h"]
Dec 02 09:50:12 crc kubenswrapper[4781]: E1202 09:50:12.966970 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03ba130c-2463-4317-a907-e13c23657ae9" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Dec 02 09:50:12 crc kubenswrapper[4781]: I1202 09:50:12.966989 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="03ba130c-2463-4317-a907-e13c23657ae9" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Dec 02 09:50:12 crc kubenswrapper[4781]: I1202 09:50:12.967193 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="03ba130c-2463-4317-a907-e13c23657ae9" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Dec 02 09:50:12 crc kubenswrapper[4781]: I1202 09:50:12.967799 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5lb5h"
Dec 02 09:50:12 crc kubenswrapper[4781]: I1202 09:50:12.974354 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/03ba130c-2463-4317-a907-e13c23657ae9-ssh-key\") pod \"03ba130c-2463-4317-a907-e13c23657ae9\" (UID: \"03ba130c-2463-4317-a907-e13c23657ae9\") "
Dec 02 09:50:12 crc kubenswrapper[4781]: I1202 09:50:12.977444 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-5lb5h"]
Dec 02 09:50:12 crc kubenswrapper[4781]: I1202 09:50:12.980882 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03ba130c-2463-4317-a907-e13c23657ae9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "03ba130c-2463-4317-a907-e13c23657ae9" (UID: "03ba130c-2463-4317-a907-e13c23657ae9"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 09:50:13 crc kubenswrapper[4781]: I1202 09:50:13.076308 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b84858d2-0cb7-4c83-ab7c-9ffc5aef41a6-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-5lb5h\" (UID: \"b84858d2-0cb7-4c83-ab7c-9ffc5aef41a6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5lb5h"
Dec 02 09:50:13 crc kubenswrapper[4781]: I1202 09:50:13.076381 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b84858d2-0cb7-4c83-ab7c-9ffc5aef41a6-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-5lb5h\" (UID: \"b84858d2-0cb7-4c83-ab7c-9ffc5aef41a6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5lb5h"
Dec 02 09:50:13 crc kubenswrapper[4781]: I1202 09:50:13.076422 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqvmd\" (UniqueName: \"kubernetes.io/projected/b84858d2-0cb7-4c83-ab7c-9ffc5aef41a6-kube-api-access-qqvmd\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-5lb5h\" (UID: \"b84858d2-0cb7-4c83-ab7c-9ffc5aef41a6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5lb5h"
Dec 02 09:50:13 crc kubenswrapper[4781]: I1202 09:50:13.076534 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/03ba130c-2463-4317-a907-e13c23657ae9-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 02 09:50:13 crc kubenswrapper[4781]: I1202 09:50:13.177872 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqvmd\" (UniqueName: \"kubernetes.io/projected/b84858d2-0cb7-4c83-ab7c-9ffc5aef41a6-kube-api-access-qqvmd\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-5lb5h\" (UID: \"b84858d2-0cb7-4c83-ab7c-9ffc5aef41a6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5lb5h"
Dec 02 09:50:13 crc kubenswrapper[4781]: I1202 09:50:13.178321 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b84858d2-0cb7-4c83-ab7c-9ffc5aef41a6-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-5lb5h\" (UID: \"b84858d2-0cb7-4c83-ab7c-9ffc5aef41a6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5lb5h"
Dec 02 09:50:13 crc kubenswrapper[4781]: I1202 09:50:13.178474 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b84858d2-0cb7-4c83-ab7c-9ffc5aef41a6-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-5lb5h\" (UID: \"b84858d2-0cb7-4c83-ab7c-9ffc5aef41a6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5lb5h"
Dec 02 09:50:13 crc kubenswrapper[4781]: I1202 09:50:13.181886 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b84858d2-0cb7-4c83-ab7c-9ffc5aef41a6-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-5lb5h\" (UID: \"b84858d2-0cb7-4c83-ab7c-9ffc5aef41a6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5lb5h"
Dec 02 09:50:13 crc kubenswrapper[4781]: I1202 09:50:13.182096 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b84858d2-0cb7-4c83-ab7c-9ffc5aef41a6-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-5lb5h\" (UID: \"b84858d2-0cb7-4c83-ab7c-9ffc5aef41a6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5lb5h"
Dec 02 09:50:13 crc kubenswrapper[4781]: I1202 09:50:13.195770 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqvmd\" (UniqueName: \"kubernetes.io/projected/b84858d2-0cb7-4c83-ab7c-9ffc5aef41a6-kube-api-access-qqvmd\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-5lb5h\" (UID: \"b84858d2-0cb7-4c83-ab7c-9ffc5aef41a6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5lb5h"
Dec 02 09:50:13 crc kubenswrapper[4781]: I1202 09:50:13.332357 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5lb5h"
Dec 02 09:50:13 crc kubenswrapper[4781]: I1202 09:50:13.887615 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-5lb5h"]
Dec 02 09:50:14 crc kubenswrapper[4781]: I1202 09:50:14.910451 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5lb5h" event={"ID":"b84858d2-0cb7-4c83-ab7c-9ffc5aef41a6","Type":"ContainerStarted","Data":"8897a459fa5de7c51fabcb02698deedcdebbad923fe294c95e7aa9f29b144927"}
Dec 02 09:50:14 crc kubenswrapper[4781]: I1202 09:50:14.910817 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5lb5h" event={"ID":"b84858d2-0cb7-4c83-ab7c-9ffc5aef41a6","Type":"ContainerStarted","Data":"89d32ed94aa6cbbc03383cebdb74b2258208da37dee7866931456791ccd5f04f"}
Dec 02 09:50:14 crc kubenswrapper[4781]: I1202 09:50:14.933296 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5lb5h" podStartSLOduration=2.432699092 podStartE2EDuration="2.933280228s" podCreationTimestamp="2025-12-02 09:50:12 +0000 UTC" firstStartedPulling="2025-12-02 09:50:13.894695757 +0000 UTC m=+1776.718569636" lastFinishedPulling="2025-12-02 09:50:14.395276893 +0000 UTC m=+1777.219150772" observedRunningTime="2025-12-02 09:50:14.931768747 +0000 UTC m=+1777.755642626" watchObservedRunningTime="2025-12-02 09:50:14.933280228 +0000 UTC m=+1777.757154107"
Dec 02 09:50:17 crc kubenswrapper[4781]: I1202 09:50:17.941733 4781 generic.go:334] "Generic (PLEG): container finished" podID="b84858d2-0cb7-4c83-ab7c-9ffc5aef41a6" containerID="8897a459fa5de7c51fabcb02698deedcdebbad923fe294c95e7aa9f29b144927" exitCode=0
Dec 02 09:50:17 crc kubenswrapper[4781]: I1202 09:50:17.941826 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5lb5h" event={"ID":"b84858d2-0cb7-4c83-ab7c-9ffc5aef41a6","Type":"ContainerDied","Data":"8897a459fa5de7c51fabcb02698deedcdebbad923fe294c95e7aa9f29b144927"}
Dec 02 09:50:19 crc kubenswrapper[4781]: I1202 09:50:19.386024 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5lb5h"
Dec 02 09:50:19 crc kubenswrapper[4781]: I1202 09:50:19.498387 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b84858d2-0cb7-4c83-ab7c-9ffc5aef41a6-ssh-key\") pod \"b84858d2-0cb7-4c83-ab7c-9ffc5aef41a6\" (UID: \"b84858d2-0cb7-4c83-ab7c-9ffc5aef41a6\") "
Dec 02 09:50:19 crc kubenswrapper[4781]: I1202 09:50:19.498642 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqvmd\" (UniqueName: \"kubernetes.io/projected/b84858d2-0cb7-4c83-ab7c-9ffc5aef41a6-kube-api-access-qqvmd\") pod \"b84858d2-0cb7-4c83-ab7c-9ffc5aef41a6\" (UID: \"b84858d2-0cb7-4c83-ab7c-9ffc5aef41a6\") "
Dec 02 09:50:19 crc kubenswrapper[4781]: I1202 09:50:19.498692 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b84858d2-0cb7-4c83-ab7c-9ffc5aef41a6-inventory\") pod \"b84858d2-0cb7-4c83-ab7c-9ffc5aef41a6\" (UID: \"b84858d2-0cb7-4c83-ab7c-9ffc5aef41a6\") "
Dec 02 09:50:19 crc kubenswrapper[4781]: I1202 09:50:19.505882 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b84858d2-0cb7-4c83-ab7c-9ffc5aef41a6-kube-api-access-qqvmd" (OuterVolumeSpecName: "kube-api-access-qqvmd") pod "b84858d2-0cb7-4c83-ab7c-9ffc5aef41a6" (UID: "b84858d2-0cb7-4c83-ab7c-9ffc5aef41a6"). InnerVolumeSpecName "kube-api-access-qqvmd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 09:50:19 crc kubenswrapper[4781]: I1202 09:50:19.531260 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b84858d2-0cb7-4c83-ab7c-9ffc5aef41a6-inventory" (OuterVolumeSpecName: "inventory") pod "b84858d2-0cb7-4c83-ab7c-9ffc5aef41a6" (UID: "b84858d2-0cb7-4c83-ab7c-9ffc5aef41a6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 09:50:19 crc kubenswrapper[4781]: I1202 09:50:19.532139 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b84858d2-0cb7-4c83-ab7c-9ffc5aef41a6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b84858d2-0cb7-4c83-ab7c-9ffc5aef41a6" (UID: "b84858d2-0cb7-4c83-ab7c-9ffc5aef41a6"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 09:50:19 crc kubenswrapper[4781]: I1202 09:50:19.601515 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b84858d2-0cb7-4c83-ab7c-9ffc5aef41a6-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 02 09:50:19 crc kubenswrapper[4781]: I1202 09:50:19.601552 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqvmd\" (UniqueName: \"kubernetes.io/projected/b84858d2-0cb7-4c83-ab7c-9ffc5aef41a6-kube-api-access-qqvmd\") on node \"crc\" DevicePath \"\""
Dec 02 09:50:19 crc kubenswrapper[4781]: I1202 09:50:19.601564 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b84858d2-0cb7-4c83-ab7c-9ffc5aef41a6-inventory\") on node \"crc\" DevicePath \"\""
Dec 02 09:50:19 crc kubenswrapper[4781]: I1202 09:50:19.963216 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5lb5h" event={"ID":"b84858d2-0cb7-4c83-ab7c-9ffc5aef41a6","Type":"ContainerDied","Data":"89d32ed94aa6cbbc03383cebdb74b2258208da37dee7866931456791ccd5f04f"}
Dec 02 09:50:19 crc kubenswrapper[4781]: I1202 09:50:19.963535 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89d32ed94aa6cbbc03383cebdb74b2258208da37dee7866931456791ccd5f04f"
Dec 02 09:50:19 crc kubenswrapper[4781]: I1202 09:50:19.963299 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5lb5h"
Dec 02 09:50:20 crc kubenswrapper[4781]: I1202 09:50:20.051625 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k8ch4"]
Dec 02 09:50:20 crc kubenswrapper[4781]: E1202 09:50:20.053289 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b84858d2-0cb7-4c83-ab7c-9ffc5aef41a6" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Dec 02 09:50:20 crc kubenswrapper[4781]: I1202 09:50:20.053444 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="b84858d2-0cb7-4c83-ab7c-9ffc5aef41a6" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Dec 02 09:50:20 crc kubenswrapper[4781]: I1202 09:50:20.053742 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="b84858d2-0cb7-4c83-ab7c-9ffc5aef41a6" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Dec 02 09:50:20 crc kubenswrapper[4781]: I1202 09:50:20.055735 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k8ch4"
Dec 02 09:50:20 crc kubenswrapper[4781]: I1202 09:50:20.059536 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 02 09:50:20 crc kubenswrapper[4781]: I1202 09:50:20.059727 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zl624"
Dec 02 09:50:20 crc kubenswrapper[4781]: I1202 09:50:20.059963 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 02 09:50:20 crc kubenswrapper[4781]: I1202 09:50:20.060112 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 02 09:50:20 crc kubenswrapper[4781]: I1202 09:50:20.067793 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k8ch4"]
Dec 02 09:50:20 crc kubenswrapper[4781]: I1202 09:50:20.112431 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ceabe2f6-ee27-456b-9031-9ebc39e032eb-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k8ch4\" (UID: \"ceabe2f6-ee27-456b-9031-9ebc39e032eb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k8ch4"
Dec 02 09:50:20 crc kubenswrapper[4781]: I1202 09:50:20.112485 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppxwp\" (UniqueName: \"kubernetes.io/projected/ceabe2f6-ee27-456b-9031-9ebc39e032eb-kube-api-access-ppxwp\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k8ch4\" (UID: \"ceabe2f6-ee27-456b-9031-9ebc39e032eb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k8ch4"
Dec 02 09:50:20 crc kubenswrapper[4781]: I1202 09:50:20.112655 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ceabe2f6-ee27-456b-9031-9ebc39e032eb-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k8ch4\" (UID: \"ceabe2f6-ee27-456b-9031-9ebc39e032eb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k8ch4"
Dec 02 09:50:20 crc kubenswrapper[4781]: I1202 09:50:20.112679 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceabe2f6-ee27-456b-9031-9ebc39e032eb-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k8ch4\" (UID: \"ceabe2f6-ee27-456b-9031-9ebc39e032eb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k8ch4"
Dec 02 09:50:20 crc kubenswrapper[4781]: I1202 09:50:20.214788 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ceabe2f6-ee27-456b-9031-9ebc39e032eb-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k8ch4\" (UID: \"ceabe2f6-ee27-456b-9031-9ebc39e032eb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k8ch4"
Dec 02 09:50:20 crc kubenswrapper[4781]: I1202 09:50:20.214848 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceabe2f6-ee27-456b-9031-9ebc39e032eb-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k8ch4\" (UID: \"ceabe2f6-ee27-456b-9031-9ebc39e032eb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k8ch4"
Dec 02 09:50:20 crc kubenswrapper[4781]: I1202 09:50:20.214991 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ceabe2f6-ee27-456b-9031-9ebc39e032eb-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k8ch4\" (UID: \"ceabe2f6-ee27-456b-9031-9ebc39e032eb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k8ch4"
Dec 02 09:50:20 crc kubenswrapper[4781]: I1202 09:50:20.215019 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppxwp\" (UniqueName: \"kubernetes.io/projected/ceabe2f6-ee27-456b-9031-9ebc39e032eb-kube-api-access-ppxwp\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k8ch4\" (UID: \"ceabe2f6-ee27-456b-9031-9ebc39e032eb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k8ch4"
Dec 02 09:50:20 crc kubenswrapper[4781]: I1202 09:50:20.219312 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ceabe2f6-ee27-456b-9031-9ebc39e032eb-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k8ch4\" (UID: \"ceabe2f6-ee27-456b-9031-9ebc39e032eb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k8ch4"
Dec 02 09:50:20 crc kubenswrapper[4781]: I1202 09:50:20.219549 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ceabe2f6-ee27-456b-9031-9ebc39e032eb-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k8ch4\" (UID: \"ceabe2f6-ee27-456b-9031-9ebc39e032eb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k8ch4"
Dec 02 09:50:20 crc kubenswrapper[4781]: I1202 09:50:20.227668 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceabe2f6-ee27-456b-9031-9ebc39e032eb-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k8ch4\" (UID: \"ceabe2f6-ee27-456b-9031-9ebc39e032eb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k8ch4"
Dec 02 09:50:20 crc kubenswrapper[4781]: I1202 09:50:20.234371 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppxwp\" (UniqueName: \"kubernetes.io/projected/ceabe2f6-ee27-456b-9031-9ebc39e032eb-kube-api-access-ppxwp\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k8ch4\" (UID: \"ceabe2f6-ee27-456b-9031-9ebc39e032eb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k8ch4"
Dec 02 09:50:20 crc kubenswrapper[4781]: I1202 09:50:20.393194 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k8ch4"
Dec 02 09:50:20 crc kubenswrapper[4781]: I1202 09:50:20.911450 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k8ch4"]
Dec 02 09:50:20 crc kubenswrapper[4781]: I1202 09:50:20.972190 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k8ch4" event={"ID":"ceabe2f6-ee27-456b-9031-9ebc39e032eb","Type":"ContainerStarted","Data":"7d0c1a38a92c2e252071f66921234a4fb829bc87cadbd37cad36dda336c68ab4"}
Dec 02 09:50:21 crc kubenswrapper[4781]: I1202 09:50:21.982582 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k8ch4" event={"ID":"ceabe2f6-ee27-456b-9031-9ebc39e032eb","Type":"ContainerStarted","Data":"de80eb04d87e9f92975b950652c49265144aaa858ff8e830866aac29a70500a1"}
Dec 02 09:50:22 crc kubenswrapper[4781]: I1202 09:50:21.999772 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k8ch4" podStartSLOduration=1.545924872 podStartE2EDuration="1.999727576s" podCreationTimestamp="2025-12-02 09:50:20 +0000 UTC" firstStartedPulling="2025-12-02 09:50:20.920349567 +0000 UTC m=+1783.744223456" lastFinishedPulling="2025-12-02 09:50:21.374152281 +0000 UTC m=+1784.198026160" observedRunningTime="2025-12-02 09:50:21.998133233 +0000 UTC m=+1784.822007162" watchObservedRunningTime="2025-12-02 09:50:21.999727576 +0000 UTC m=+1784.823601455"
Dec 02 09:50:23 crc kubenswrapper[4781]: I1202 09:50:23.501237 4781 scope.go:117] "RemoveContainer" containerID="311057b514d104f4f4ba1bad8b86e944a9708a86065bf40bd45c3f422ad024ed"
Dec 02 09:50:23 crc kubenswrapper[4781]: E1202 09:50:23.502062 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d"
Dec 02 09:50:36 crc kubenswrapper[4781]: I1202 09:50:36.499628 4781 scope.go:117] "RemoveContainer" containerID="311057b514d104f4f4ba1bad8b86e944a9708a86065bf40bd45c3f422ad024ed"
Dec 02 09:50:36 crc kubenswrapper[4781]: E1202 09:50:36.500409 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d"
Dec 02 09:50:42 crc kubenswrapper[4781]: I1202 09:50:42.419839 4781 scope.go:117] "RemoveContainer" containerID="a4eec7dea74df0df1e8324cabc97265a7aa624283371aeb6a5533d9215c73d98"
Dec 02 09:50:50 crc kubenswrapper[4781]: I1202 09:50:50.499506 4781 scope.go:117] "RemoveContainer" containerID="311057b514d104f4f4ba1bad8b86e944a9708a86065bf40bd45c3f422ad024ed"
Dec 02 09:50:50 crc kubenswrapper[4781]: E1202 09:50:50.500300 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed
container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 09:51:04 crc kubenswrapper[4781]: I1202 09:51:04.500466 4781 scope.go:117] "RemoveContainer" containerID="311057b514d104f4f4ba1bad8b86e944a9708a86065bf40bd45c3f422ad024ed" Dec 02 09:51:04 crc kubenswrapper[4781]: E1202 09:51:04.501313 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 09:51:17 crc kubenswrapper[4781]: I1202 09:51:17.505345 4781 scope.go:117] "RemoveContainer" containerID="311057b514d104f4f4ba1bad8b86e944a9708a86065bf40bd45c3f422ad024ed" Dec 02 09:51:17 crc kubenswrapper[4781]: E1202 09:51:17.506207 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 09:51:30 crc kubenswrapper[4781]: I1202 09:51:30.499678 4781 scope.go:117] "RemoveContainer" containerID="311057b514d104f4f4ba1bad8b86e944a9708a86065bf40bd45c3f422ad024ed" Dec 02 09:51:30 crc kubenswrapper[4781]: E1202 09:51:30.500444 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 09:51:42 crc kubenswrapper[4781]: I1202 09:51:42.499273 4781 scope.go:117] "RemoveContainer" containerID="311057b514d104f4f4ba1bad8b86e944a9708a86065bf40bd45c3f422ad024ed" Dec 02 09:51:42 crc kubenswrapper[4781]: E1202 09:51:42.500219 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 09:51:42 crc kubenswrapper[4781]: I1202 09:51:42.503548 4781 scope.go:117] "RemoveContainer" containerID="8d83e5fd74e7de7d8cc031d27e869372a76d4e0f82ffc074e799d9426b66906b" Dec 02 09:51:42 crc kubenswrapper[4781]: I1202 09:51:42.535210 4781 scope.go:117] "RemoveContainer" containerID="a6ebccae71e14ab79a993a9b2755042092cbcac23dfc21c29f989ad2a993c7fd" Dec 02 09:51:57 crc kubenswrapper[4781]: I1202 09:51:57.507657 4781 scope.go:117] "RemoveContainer" containerID="311057b514d104f4f4ba1bad8b86e944a9708a86065bf40bd45c3f422ad024ed" Dec 02 09:51:57 crc kubenswrapper[4781]: E1202 09:51:57.508525 4781 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 09:52:12 crc kubenswrapper[4781]: I1202 09:52:12.500093 4781 scope.go:117] "RemoveContainer" containerID="311057b514d104f4f4ba1bad8b86e944a9708a86065bf40bd45c3f422ad024ed" Dec 02 09:52:12 crc kubenswrapper[4781]: E1202 09:52:12.500948 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 09:52:26 crc kubenswrapper[4781]: I1202 09:52:26.499722 4781 scope.go:117] "RemoveContainer" containerID="311057b514d104f4f4ba1bad8b86e944a9708a86065bf40bd45c3f422ad024ed" Dec 02 09:52:26 crc kubenswrapper[4781]: E1202 09:52:26.500673 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 09:52:38 crc kubenswrapper[4781]: I1202 09:52:38.499897 4781 scope.go:117] "RemoveContainer" containerID="311057b514d104f4f4ba1bad8b86e944a9708a86065bf40bd45c3f422ad024ed" Dec 02 09:52:38 crc kubenswrapper[4781]: E1202 09:52:38.500801 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 09:52:48 crc kubenswrapper[4781]: I1202 09:52:48.044755 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-b513-account-create-update-pn5wp"] Dec 02 09:52:48 crc kubenswrapper[4781]: I1202 09:52:48.052855 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-b513-account-create-update-pn5wp"] Dec 02 09:52:49 crc kubenswrapper[4781]: I1202 09:52:49.026741 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-8211-account-create-update-gpv69"] Dec 02 09:52:49 crc kubenswrapper[4781]: I1202 09:52:49.038728 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-2nd8t"] Dec 02 09:52:49 crc kubenswrapper[4781]: I1202 09:52:49.047143 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-8528x"] Dec 02 09:52:49 crc kubenswrapper[4781]: I1202 09:52:49.054802 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-8211-account-create-update-gpv69"] Dec 02 09:52:49 crc kubenswrapper[4781]: I1202 09:52:49.062732 4781 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-8528x"] Dec 02 09:52:49 crc kubenswrapper[4781]: I1202 09:52:49.071260 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-2nd8t"] Dec 02 09:52:49 crc kubenswrapper[4781]: I1202 09:52:49.509749 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43441d30-70a7-429c-a674-51db8c4e8111" path="/var/lib/kubelet/pods/43441d30-70a7-429c-a674-51db8c4e8111/volumes" Dec 02 09:52:49 crc kubenswrapper[4781]: I1202 09:52:49.510487 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4686994b-2137-4fbc-b470-32bfc5296f79" path="/var/lib/kubelet/pods/4686994b-2137-4fbc-b470-32bfc5296f79/volumes" Dec 02 09:52:49 crc kubenswrapper[4781]: I1202 09:52:49.511189 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf726ecc-d195-4c00-ad77-23f0875c6f2e" path="/var/lib/kubelet/pods/bf726ecc-d195-4c00-ad77-23f0875c6f2e/volumes" Dec 02 09:52:49 crc kubenswrapper[4781]: I1202 09:52:49.511917 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f710b218-cbee-44a5-b8f3-80b4b2210cd8" path="/var/lib/kubelet/pods/f710b218-cbee-44a5-b8f3-80b4b2210cd8/volumes" Dec 02 09:52:50 crc kubenswrapper[4781]: I1202 09:52:50.500334 4781 scope.go:117] "RemoveContainer" containerID="311057b514d104f4f4ba1bad8b86e944a9708a86065bf40bd45c3f422ad024ed" Dec 02 09:52:50 crc kubenswrapper[4781]: E1202 09:52:50.501066 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 09:52:53 crc kubenswrapper[4781]: I1202 09:52:53.026254 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-s6nvd"] Dec 02 09:52:53 crc kubenswrapper[4781]: I1202 09:52:53.036739 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-s6nvd"] Dec 02 09:52:53 crc kubenswrapper[4781]: I1202 09:52:53.511203 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ae7a228-a973-4207-9556-debe4ae4eb2a" path="/var/lib/kubelet/pods/6ae7a228-a973-4207-9556-debe4ae4eb2a/volumes" Dec 02 09:52:54 crc kubenswrapper[4781]: I1202 09:52:54.031426 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-0ff3-account-create-update-4ncb9"] Dec 02 09:52:54 crc kubenswrapper[4781]: I1202 09:52:54.043049 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-0ff3-account-create-update-4ncb9"] Dec 02 09:52:55 crc kubenswrapper[4781]: I1202 09:52:55.510291 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17abf61f-c972-4ab9-87db-2083b58f69fa" path="/var/lib/kubelet/pods/17abf61f-c972-4ab9-87db-2083b58f69fa/volumes" Dec 02 09:53:03 crc kubenswrapper[4781]: I1202 09:53:03.500319 4781 scope.go:117] "RemoveContainer" containerID="311057b514d104f4f4ba1bad8b86e944a9708a86065bf40bd45c3f422ad024ed" Dec 02 09:53:04 crc kubenswrapper[4781]: I1202 09:53:04.013890 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" 
event={"ID":"e10258da-dad3-4df8-82c2-9d9438493a3d","Type":"ContainerStarted","Data":"05bb140e8bed83dcb63b9b78720e5b065dcdd65808d271a4b06533687d8cc9f3"} Dec 02 09:53:29 crc kubenswrapper[4781]: I1202 09:53:29.244180 4781 generic.go:334] "Generic (PLEG): container finished" podID="ceabe2f6-ee27-456b-9031-9ebc39e032eb" containerID="de80eb04d87e9f92975b950652c49265144aaa858ff8e830866aac29a70500a1" exitCode=0 Dec 02 09:53:29 crc kubenswrapper[4781]: I1202 09:53:29.244272 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k8ch4" event={"ID":"ceabe2f6-ee27-456b-9031-9ebc39e032eb","Type":"ContainerDied","Data":"de80eb04d87e9f92975b950652c49265144aaa858ff8e830866aac29a70500a1"} Dec 02 09:53:30 crc kubenswrapper[4781]: I1202 09:53:30.655119 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k8ch4" Dec 02 09:53:30 crc kubenswrapper[4781]: I1202 09:53:30.791361 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppxwp\" (UniqueName: \"kubernetes.io/projected/ceabe2f6-ee27-456b-9031-9ebc39e032eb-kube-api-access-ppxwp\") pod \"ceabe2f6-ee27-456b-9031-9ebc39e032eb\" (UID: \"ceabe2f6-ee27-456b-9031-9ebc39e032eb\") " Dec 02 09:53:30 crc kubenswrapper[4781]: I1202 09:53:30.791539 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ceabe2f6-ee27-456b-9031-9ebc39e032eb-ssh-key\") pod \"ceabe2f6-ee27-456b-9031-9ebc39e032eb\" (UID: \"ceabe2f6-ee27-456b-9031-9ebc39e032eb\") " Dec 02 09:53:30 crc kubenswrapper[4781]: I1202 09:53:30.791606 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ceabe2f6-ee27-456b-9031-9ebc39e032eb-inventory\") pod \"ceabe2f6-ee27-456b-9031-9ebc39e032eb\" (UID: \"ceabe2f6-ee27-456b-9031-9ebc39e032eb\") " Dec 02 09:53:30 crc kubenswrapper[4781]: I1202 09:53:30.791666 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceabe2f6-ee27-456b-9031-9ebc39e032eb-bootstrap-combined-ca-bundle\") pod \"ceabe2f6-ee27-456b-9031-9ebc39e032eb\" (UID: \"ceabe2f6-ee27-456b-9031-9ebc39e032eb\") " Dec 02 09:53:30 crc kubenswrapper[4781]: I1202 09:53:30.797411 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ceabe2f6-ee27-456b-9031-9ebc39e032eb-kube-api-access-ppxwp" (OuterVolumeSpecName: "kube-api-access-ppxwp") pod "ceabe2f6-ee27-456b-9031-9ebc39e032eb" (UID: "ceabe2f6-ee27-456b-9031-9ebc39e032eb"). InnerVolumeSpecName "kube-api-access-ppxwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:53:30 crc kubenswrapper[4781]: I1202 09:53:30.797591 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ceabe2f6-ee27-456b-9031-9ebc39e032eb-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "ceabe2f6-ee27-456b-9031-9ebc39e032eb" (UID: "ceabe2f6-ee27-456b-9031-9ebc39e032eb"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:53:30 crc kubenswrapper[4781]: I1202 09:53:30.822504 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ceabe2f6-ee27-456b-9031-9ebc39e032eb-inventory" (OuterVolumeSpecName: "inventory") pod "ceabe2f6-ee27-456b-9031-9ebc39e032eb" (UID: "ceabe2f6-ee27-456b-9031-9ebc39e032eb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:53:30 crc kubenswrapper[4781]: I1202 09:53:30.822691 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ceabe2f6-ee27-456b-9031-9ebc39e032eb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ceabe2f6-ee27-456b-9031-9ebc39e032eb" (UID: "ceabe2f6-ee27-456b-9031-9ebc39e032eb"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:53:30 crc kubenswrapper[4781]: I1202 09:53:30.894400 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ceabe2f6-ee27-456b-9031-9ebc39e032eb-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 09:53:30 crc kubenswrapper[4781]: I1202 09:53:30.894480 4781 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceabe2f6-ee27-456b-9031-9ebc39e032eb-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:53:30 crc kubenswrapper[4781]: I1202 09:53:30.894495 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppxwp\" (UniqueName: \"kubernetes.io/projected/ceabe2f6-ee27-456b-9031-9ebc39e032eb-kube-api-access-ppxwp\") on node \"crc\" DevicePath \"\"" Dec 02 09:53:30 crc kubenswrapper[4781]: I1202 09:53:30.894505 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ceabe2f6-ee27-456b-9031-9ebc39e032eb-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 09:53:31 crc kubenswrapper[4781]: I1202 09:53:31.263195 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k8ch4" event={"ID":"ceabe2f6-ee27-456b-9031-9ebc39e032eb","Type":"ContainerDied","Data":"7d0c1a38a92c2e252071f66921234a4fb829bc87cadbd37cad36dda336c68ab4"} Dec 02 09:53:31 crc kubenswrapper[4781]: I1202 09:53:31.263526 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d0c1a38a92c2e252071f66921234a4fb829bc87cadbd37cad36dda336c68ab4" Dec 02 09:53:31 crc kubenswrapper[4781]: I1202 09:53:31.263276 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k8ch4" Dec 02 09:53:31 crc kubenswrapper[4781]: I1202 09:53:31.340757 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wbgtf"] Dec 02 09:53:31 crc kubenswrapper[4781]: E1202 09:53:31.341210 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceabe2f6-ee27-456b-9031-9ebc39e032eb" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 02 09:53:31 crc kubenswrapper[4781]: I1202 09:53:31.341229 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceabe2f6-ee27-456b-9031-9ebc39e032eb" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 02 09:53:31 crc kubenswrapper[4781]: I1202 09:53:31.341430 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="ceabe2f6-ee27-456b-9031-9ebc39e032eb" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 02 09:53:31 crc kubenswrapper[4781]: I1202 09:53:31.342031 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wbgtf" Dec 02 09:53:31 crc kubenswrapper[4781]: I1202 09:53:31.343910 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 09:53:31 crc kubenswrapper[4781]: I1202 09:53:31.349413 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zl624" Dec 02 09:53:31 crc kubenswrapper[4781]: I1202 09:53:31.349476 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 09:53:31 crc kubenswrapper[4781]: I1202 09:53:31.349541 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 09:53:31 crc kubenswrapper[4781]: I1202 09:53:31.355564 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wbgtf"] Dec 02 09:53:31 crc kubenswrapper[4781]: I1202 09:53:31.503951 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8804a3f3-f23a-4a85-8e45-9f92f90c5e9b-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wbgtf\" (UID: \"8804a3f3-f23a-4a85-8e45-9f92f90c5e9b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wbgtf" Dec 02 09:53:31 crc kubenswrapper[4781]: I1202 09:53:31.504032 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8804a3f3-f23a-4a85-8e45-9f92f90c5e9b-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wbgtf\" (UID: \"8804a3f3-f23a-4a85-8e45-9f92f90c5e9b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wbgtf" Dec 02 09:53:31 crc kubenswrapper[4781]: I1202 09:53:31.504081 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tngs\" (UniqueName: \"kubernetes.io/projected/8804a3f3-f23a-4a85-8e45-9f92f90c5e9b-kube-api-access-7tngs\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wbgtf\" (UID: \"8804a3f3-f23a-4a85-8e45-9f92f90c5e9b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wbgtf" Dec 02 09:53:31 crc kubenswrapper[4781]: I1202 09:53:31.605801 4781 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8804a3f3-f23a-4a85-8e45-9f92f90c5e9b-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wbgtf\" (UID: \"8804a3f3-f23a-4a85-8e45-9f92f90c5e9b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wbgtf" Dec 02 09:53:31 crc kubenswrapper[4781]: I1202 09:53:31.605879 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8804a3f3-f23a-4a85-8e45-9f92f90c5e9b-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wbgtf\" (UID: \"8804a3f3-f23a-4a85-8e45-9f92f90c5e9b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wbgtf" Dec 02 09:53:31 crc kubenswrapper[4781]: I1202 09:53:31.605945 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tngs\" (UniqueName: \"kubernetes.io/projected/8804a3f3-f23a-4a85-8e45-9f92f90c5e9b-kube-api-access-7tngs\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wbgtf\" (UID: \"8804a3f3-f23a-4a85-8e45-9f92f90c5e9b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wbgtf" Dec 02 09:53:31 crc kubenswrapper[4781]: I1202 09:53:31.611556 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8804a3f3-f23a-4a85-8e45-9f92f90c5e9b-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wbgtf\" (UID: \"8804a3f3-f23a-4a85-8e45-9f92f90c5e9b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wbgtf" Dec 02 09:53:31 crc kubenswrapper[4781]: I1202 09:53:31.619204 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8804a3f3-f23a-4a85-8e45-9f92f90c5e9b-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wbgtf\" (UID: \"8804a3f3-f23a-4a85-8e45-9f92f90c5e9b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wbgtf" Dec 02 09:53:31 crc kubenswrapper[4781]: I1202 09:53:31.622891 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tngs\" (UniqueName: \"kubernetes.io/projected/8804a3f3-f23a-4a85-8e45-9f92f90c5e9b-kube-api-access-7tngs\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wbgtf\" (UID: \"8804a3f3-f23a-4a85-8e45-9f92f90c5e9b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wbgtf" Dec 02 09:53:31 crc kubenswrapper[4781]: I1202 09:53:31.709499 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wbgtf" Dec 02 09:53:32 crc kubenswrapper[4781]: W1202 09:53:32.251324 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8804a3f3_f23a_4a85_8e45_9f92f90c5e9b.slice/crio-9a552e3d8f6d4e8d778ee226c88a6f1074347b59fbd61c3f1c819ead4f1cbe46 WatchSource:0}: Error finding container 9a552e3d8f6d4e8d778ee226c88a6f1074347b59fbd61c3f1c819ead4f1cbe46: Status 404 returned error can't find the container with id 9a552e3d8f6d4e8d778ee226c88a6f1074347b59fbd61c3f1c819ead4f1cbe46 Dec 02 09:53:32 crc kubenswrapper[4781]: I1202 09:53:32.253509 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wbgtf"] Dec 02 09:53:32 crc kubenswrapper[4781]: I1202 09:53:32.278733 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wbgtf" event={"ID":"8804a3f3-f23a-4a85-8e45-9f92f90c5e9b","Type":"ContainerStarted","Data":"9a552e3d8f6d4e8d778ee226c88a6f1074347b59fbd61c3f1c819ead4f1cbe46"} Dec 02 09:53:33 crc kubenswrapper[4781]: I1202 09:53:33.290696 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wbgtf" event={"ID":"8804a3f3-f23a-4a85-8e45-9f92f90c5e9b","Type":"ContainerStarted","Data":"b1443e03dcb72c5b40607b9cd99b32b21ac48a68dba31ff2666d8f5a1bfd24df"} Dec 02 09:53:33 crc kubenswrapper[4781]: I1202 09:53:33.310884 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wbgtf" podStartSLOduration=1.736801794 podStartE2EDuration="2.310862919s" podCreationTimestamp="2025-12-02 09:53:31 +0000 UTC" firstStartedPulling="2025-12-02 09:53:32.254498211 +0000 UTC m=+1975.078372090" lastFinishedPulling="2025-12-02 09:53:32.828559336 +0000 UTC m=+1975.652433215" observedRunningTime="2025-12-02 09:53:33.306857801 +0000 UTC m=+1976.130731680" watchObservedRunningTime="2025-12-02 09:53:33.310862919 +0000 UTC m=+1976.134736798" Dec 02 09:53:35 crc kubenswrapper[4781]: I1202 09:53:35.052501 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-l8cvh"] Dec 02 09:53:35 crc kubenswrapper[4781]: I1202 09:53:35.064300 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-a8e2-account-create-update-g5tjv"] Dec 02 09:53:35 crc kubenswrapper[4781]: I1202 09:53:35.075860 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-3506-account-create-update-qrvct"] Dec 02 09:53:35 crc kubenswrapper[4781]: I1202 09:53:35.085801 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-l8cvh"] Dec 02 09:53:35 crc kubenswrapper[4781]: I1202 09:53:35.094636 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-21eb-account-create-update-7w4wm"] Dec 02 09:53:35 crc kubenswrapper[4781]: I1202 09:53:35.102685 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-78xmv"] Dec 02 09:53:35 crc kubenswrapper[4781]: I1202 09:53:35.110229 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-a8e2-account-create-update-g5tjv"] Dec 02 09:53:35 crc kubenswrapper[4781]: I1202 09:53:35.118976 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-6tfv2"] Dec 02 09:53:35 crc kubenswrapper[4781]: 
I1202 09:53:35.125738 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-21eb-account-create-update-7w4wm"] Dec 02 09:53:35 crc kubenswrapper[4781]: I1202 09:53:35.134437 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-3506-account-create-update-qrvct"] Dec 02 09:53:35 crc kubenswrapper[4781]: I1202 09:53:35.155198 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-6tfv2"] Dec 02 09:53:35 crc kubenswrapper[4781]: I1202 09:53:35.163710 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-78xmv"] Dec 02 09:53:35 crc kubenswrapper[4781]: I1202 09:53:35.511813 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="158568be-9a60-4a86-b480-e5c90939ed09" path="/var/lib/kubelet/pods/158568be-9a60-4a86-b480-e5c90939ed09/volumes" Dec 02 09:53:35 crc kubenswrapper[4781]: I1202 09:53:35.512447 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8118db55-54ba-4cf2-b80d-266872f87896" path="/var/lib/kubelet/pods/8118db55-54ba-4cf2-b80d-266872f87896/volumes" Dec 02 09:53:35 crc kubenswrapper[4781]: I1202 09:53:35.512990 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae522302-ae88-485a-a767-d6a6c6bf5205" path="/var/lib/kubelet/pods/ae522302-ae88-485a-a767-d6a6c6bf5205/volumes" Dec 02 09:53:35 crc kubenswrapper[4781]: I1202 09:53:35.513550 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c457c92b-cc50-4e99-bf06-81d1b6cf3a7c" path="/var/lib/kubelet/pods/c457c92b-cc50-4e99-bf06-81d1b6cf3a7c/volumes" Dec 02 09:53:35 crc kubenswrapper[4781]: I1202 09:53:35.514761 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8b40bb3-6c5d-47d5-9e15-439119be130a" path="/var/lib/kubelet/pods/d8b40bb3-6c5d-47d5-9e15-439119be130a/volumes" Dec 02 09:53:35 crc kubenswrapper[4781]: I1202 09:53:35.515298 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7ff12ef-2f31-42e6-b5e0-8b3172ac738b" path="/var/lib/kubelet/pods/f7ff12ef-2f31-42e6-b5e0-8b3172ac738b/volumes" Dec 02 09:53:42 crc kubenswrapper[4781]: I1202 09:53:42.608053 4781 scope.go:117] "RemoveContainer" containerID="fa1c488c26c5961e60ec3cafe487ff7fdc7054a9dda307611a836a3ba3a14ad6" Dec 02 09:53:42 crc kubenswrapper[4781]: I1202 09:53:42.636417 4781 scope.go:117] "RemoveContainer" containerID="3baa9c693a15be661f8c3833e3d3e240d8ea0df16b0941e949351c88151ebb43" Dec 02 09:53:42 crc kubenswrapper[4781]: I1202 09:53:42.682620 4781 scope.go:117] "RemoveContainer" containerID="97b9ed60bb35c6c89587bac81f51c63e7a90c5f69f28dd0c0e327305537db1be" Dec 02 09:53:42 crc kubenswrapper[4781]: I1202 09:53:42.733231 4781 scope.go:117] "RemoveContainer" containerID="88094bbbfde71ba9ceaedf7924eccbdc0912eebcc7f6266f3042f790af408615" Dec 02 09:53:42 crc kubenswrapper[4781]: I1202 09:53:42.756590 4781 scope.go:117] "RemoveContainer" containerID="033c113454daf9fbb63a0f625966cc476d0d7c0541f94bd77b4ce6c94921b97c" Dec 02 09:53:42 crc kubenswrapper[4781]: I1202 09:53:42.800393 4781 scope.go:117] "RemoveContainer" containerID="a6c37db8446be81eea490f38cf259d36dfe3264288300d2f3007d323fd5cc8b5" Dec 02 09:53:42 crc kubenswrapper[4781]: I1202 09:53:42.852027 4781 scope.go:117] "RemoveContainer" containerID="9c5e32171b18f60b0a0543666ce99fdfae3aad608a03d90c9b2bf411e50e7169" Dec 02 09:53:42 crc kubenswrapper[4781]: I1202 09:53:42.902530 4781 scope.go:117] "RemoveContainer" 
containerID="83b98bad06a8544fdedb2b6c741c5a2e67611df56bd1ddea50be40fbf9211644" Dec 02 09:53:42 crc kubenswrapper[4781]: I1202 09:53:42.928602 4781 scope.go:117] "RemoveContainer" containerID="908f120c3659c413a9d78d9d012afda775dc13af53b354e176171cfbbf0b676d" Dec 02 09:53:42 crc kubenswrapper[4781]: I1202 09:53:42.967461 4781 scope.go:117] "RemoveContainer" containerID="562daa81b295a8b189f4611d5790fad69d2756ac33da3345e4cd29fdf13d4a85" Dec 02 09:53:42 crc kubenswrapper[4781]: I1202 09:53:42.988238 4781 scope.go:117] "RemoveContainer" containerID="de0d2ce2c109d3d641a30bcad85af2a13f3b19979f8ac1c0b0e6200e2cbcd6c6" Dec 02 09:53:43 crc kubenswrapper[4781]: I1202 09:53:43.011824 4781 scope.go:117] "RemoveContainer" containerID="e7a11ac23e9b0a8d1da3f8811f5b760fa5c3def063c432682a58c7a905f2911b" Dec 02 09:53:43 crc kubenswrapper[4781]: I1202 09:53:43.038450 4781 scope.go:117] "RemoveContainer" containerID="e0ea675c144f0a00e40bfdbd2b47f08de472673f29409e4eef5c3833990d0dfa" Dec 02 09:53:43 crc kubenswrapper[4781]: I1202 09:53:43.039408 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-5mr6j"] Dec 02 09:53:43 crc kubenswrapper[4781]: I1202 09:53:43.056147 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-5mr6j"] Dec 02 09:53:43 crc kubenswrapper[4781]: I1202 09:53:43.063977 4781 scope.go:117] "RemoveContainer" containerID="40a9d5823d43b57ee3824f88c2f5cbadff07261c55ee9724a63a6c826baef21f" Dec 02 09:53:43 crc kubenswrapper[4781]: I1202 09:53:43.087023 4781 scope.go:117] "RemoveContainer" containerID="4e640be2564cfee313d747374008eb2b2a9fb52ecb1c9eb6f1f7bc76090fe5cc" Dec 02 09:53:43 crc kubenswrapper[4781]: I1202 09:53:43.510445 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2075ad57-12d9-47c4-80ba-5bf9e1bce693" path="/var/lib/kubelet/pods/2075ad57-12d9-47c4-80ba-5bf9e1bce693/volumes" Dec 02 09:54:43 crc kubenswrapper[4781]: I1202 09:54:43.384680 4781 scope.go:117] "RemoveContainer" containerID="bd092bc04703af445a96d0dc6ffd7109012effad2faa8cdf4308be22814ad920" Dec 02 09:54:43 crc kubenswrapper[4781]: I1202 09:54:43.409677 4781 scope.go:117] "RemoveContainer" containerID="ef77710f70f761d7af1694d2e262bc01b7d43819a9686336064d20e7b0d27d2a" Dec 02 09:54:43 crc kubenswrapper[4781]: I1202 09:54:43.447289 4781 scope.go:117] "RemoveContainer" containerID="08a35e39a052bff33bcbe0911370ab32856466d1eb578f0adc35360e7e231e4d" Dec 02 09:54:43 crc kubenswrapper[4781]: I1202 09:54:43.938098 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rv6xb"] Dec 02 09:54:43 crc kubenswrapper[4781]: I1202 09:54:43.940424 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rv6xb" Dec 02 09:54:43 crc kubenswrapper[4781]: I1202 09:54:43.969347 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rv6xb"] Dec 02 09:54:44 crc kubenswrapper[4781]: I1202 09:54:44.120218 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10e7b109-cde2-4b4f-8f1c-2d7940a49e1c-catalog-content\") pod \"community-operators-rv6xb\" (UID: \"10e7b109-cde2-4b4f-8f1c-2d7940a49e1c\") " pod="openshift-marketplace/community-operators-rv6xb" Dec 02 09:54:44 crc kubenswrapper[4781]: I1202 09:54:44.120739 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10e7b109-cde2-4b4f-8f1c-2d7940a49e1c-utilities\") pod \"community-operators-rv6xb\" (UID: \"10e7b109-cde2-4b4f-8f1c-2d7940a49e1c\") " pod="openshift-marketplace/community-operators-rv6xb" Dec 02 09:54:44 crc kubenswrapper[4781]: I1202 09:54:44.120841 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjjmr\" (UniqueName: \"kubernetes.io/projected/10e7b109-cde2-4b4f-8f1c-2d7940a49e1c-kube-api-access-hjjmr\") pod \"community-operators-rv6xb\" (UID: \"10e7b109-cde2-4b4f-8f1c-2d7940a49e1c\") " pod="openshift-marketplace/community-operators-rv6xb" Dec 02 09:54:44 crc kubenswrapper[4781]: I1202 09:54:44.222873 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10e7b109-cde2-4b4f-8f1c-2d7940a49e1c-utilities\") pod \"community-operators-rv6xb\" (UID: \"10e7b109-cde2-4b4f-8f1c-2d7940a49e1c\") " pod="openshift-marketplace/community-operators-rv6xb" Dec 02 09:54:44 crc kubenswrapper[4781]: I1202 09:54:44.222960 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjjmr\" (UniqueName: \"kubernetes.io/projected/10e7b109-cde2-4b4f-8f1c-2d7940a49e1c-kube-api-access-hjjmr\") pod \"community-operators-rv6xb\" (UID: \"10e7b109-cde2-4b4f-8f1c-2d7940a49e1c\") " pod="openshift-marketplace/community-operators-rv6xb" Dec 02 09:54:44 crc kubenswrapper[4781]: I1202 09:54:44.223159 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10e7b109-cde2-4b4f-8f1c-2d7940a49e1c-catalog-content\") pod \"community-operators-rv6xb\" (UID: \"10e7b109-cde2-4b4f-8f1c-2d7940a49e1c\") " pod="openshift-marketplace/community-operators-rv6xb" Dec 02 09:54:44 crc kubenswrapper[4781]: I1202 09:54:44.223406 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10e7b109-cde2-4b4f-8f1c-2d7940a49e1c-utilities\") pod \"community-operators-rv6xb\" (UID: \"10e7b109-cde2-4b4f-8f1c-2d7940a49e1c\") " pod="openshift-marketplace/community-operators-rv6xb" Dec 02 09:54:44 crc kubenswrapper[4781]: I1202 09:54:44.223461 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10e7b109-cde2-4b4f-8f1c-2d7940a49e1c-catalog-content\") pod \"community-operators-rv6xb\" (UID: \"10e7b109-cde2-4b4f-8f1c-2d7940a49e1c\") " pod="openshift-marketplace/community-operators-rv6xb" Dec 02 09:54:44 crc kubenswrapper[4781]: I1202 09:54:44.249052 4781 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-hjjmr\" (UniqueName: \"kubernetes.io/projected/10e7b109-cde2-4b4f-8f1c-2d7940a49e1c-kube-api-access-hjjmr\") pod \"community-operators-rv6xb\" (UID: \"10e7b109-cde2-4b4f-8f1c-2d7940a49e1c\") " pod="openshift-marketplace/community-operators-rv6xb" Dec 02 09:54:44 crc kubenswrapper[4781]: I1202 09:54:44.269386 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rv6xb" Dec 02 09:54:44 crc kubenswrapper[4781]: I1202 09:54:44.806705 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rv6xb"] Dec 02 09:54:44 crc kubenswrapper[4781]: I1202 09:54:44.930182 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rv6xb" event={"ID":"10e7b109-cde2-4b4f-8f1c-2d7940a49e1c","Type":"ContainerStarted","Data":"232ab09e1558605df1f539293ab6d3ed4d048b694ee94a0601cdebf6bfb501be"} Dec 02 09:54:45 crc kubenswrapper[4781]: I1202 09:54:45.941326 4781 generic.go:334] "Generic (PLEG): container finished" podID="10e7b109-cde2-4b4f-8f1c-2d7940a49e1c" containerID="8e16294bef54078b5c4506f1d136872ae86aec485716359067e97408ff1ca779" exitCode=0 Dec 02 09:54:45 crc kubenswrapper[4781]: I1202 09:54:45.941364 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rv6xb" event={"ID":"10e7b109-cde2-4b4f-8f1c-2d7940a49e1c","Type":"ContainerDied","Data":"8e16294bef54078b5c4506f1d136872ae86aec485716359067e97408ff1ca779"} Dec 02 09:54:46 crc kubenswrapper[4781]: I1202 09:54:46.041115 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-sqn2b"] Dec 02 09:54:46 crc kubenswrapper[4781]: I1202 09:54:46.051645 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-t6bqc"] Dec 02 09:54:46 crc kubenswrapper[4781]: I1202 09:54:46.060043 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-sqn2b"] Dec 02 09:54:46 crc kubenswrapper[4781]: I1202 09:54:46.067941 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-t6bqc"] Dec 02 09:54:47 crc kubenswrapper[4781]: I1202 09:54:47.026414 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-m7kpn"] Dec 02 09:54:47 crc kubenswrapper[4781]: I1202 09:54:47.033635 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-m7kpn"] Dec 02 09:54:47 crc kubenswrapper[4781]: I1202 09:54:47.513707 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="855c82ea-ccc1-4981-b82d-b2d9aac387a6" path="/var/lib/kubelet/pods/855c82ea-ccc1-4981-b82d-b2d9aac387a6/volumes" Dec 02 09:54:47 crc kubenswrapper[4781]: I1202 09:54:47.515028 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af2e4c8d-f431-487f-8b70-a2b6e6ee6000" path="/var/lib/kubelet/pods/af2e4c8d-f431-487f-8b70-a2b6e6ee6000/volumes" Dec 02 09:54:47 crc kubenswrapper[4781]: I1202 09:54:47.516076 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea3e7049-869e-4822-a11e-9eb2df6e3eeb" path="/var/lib/kubelet/pods/ea3e7049-869e-4822-a11e-9eb2df6e3eeb/volumes" Dec 02 09:54:54 crc kubenswrapper[4781]: I1202 09:54:54.017320 4781 generic.go:334] "Generic (PLEG): container finished" podID="10e7b109-cde2-4b4f-8f1c-2d7940a49e1c" containerID="ef0c0860745181ab0e18d43ea6be195dedd7869647abffe81ec38c7a7ed833a5" exitCode=0 Dec 02 09:54:54 crc 
kubenswrapper[4781]: I1202 09:54:54.017359 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rv6xb" event={"ID":"10e7b109-cde2-4b4f-8f1c-2d7940a49e1c","Type":"ContainerDied","Data":"ef0c0860745181ab0e18d43ea6be195dedd7869647abffe81ec38c7a7ed833a5"} Dec 02 09:54:54 crc kubenswrapper[4781]: I1202 09:54:54.020375 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 09:54:55 crc kubenswrapper[4781]: I1202 09:54:55.028672 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rv6xb" event={"ID":"10e7b109-cde2-4b4f-8f1c-2d7940a49e1c","Type":"ContainerStarted","Data":"0009f551f4a5c73aadeab0ba660a0298b398efa0450f4ad38241806498155a81"} Dec 02 09:54:55 crc kubenswrapper[4781]: I1202 09:54:55.046940 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rv6xb" podStartSLOduration=3.475464261 podStartE2EDuration="12.046910438s" podCreationTimestamp="2025-12-02 09:54:43 +0000 UTC" firstStartedPulling="2025-12-02 09:54:45.943306915 +0000 UTC m=+2048.767180784" lastFinishedPulling="2025-12-02 09:54:54.514753082 +0000 UTC m=+2057.338626961" observedRunningTime="2025-12-02 09:54:55.045782497 +0000 UTC m=+2057.869656376" watchObservedRunningTime="2025-12-02 09:54:55.046910438 +0000 UTC m=+2057.870784327" Dec 02 09:55:04 crc kubenswrapper[4781]: I1202 09:55:04.270817 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rv6xb" Dec 02 09:55:04 crc kubenswrapper[4781]: I1202 09:55:04.272423 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rv6xb" Dec 02 09:55:04 crc kubenswrapper[4781]: I1202 09:55:04.317777 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rv6xb" Dec 02 09:55:05 crc kubenswrapper[4781]: I1202 09:55:05.164890 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rv6xb" Dec 02 09:55:05 crc kubenswrapper[4781]: I1202 09:55:05.229254 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rv6xb"] Dec 02 09:55:05 crc kubenswrapper[4781]: I1202 09:55:05.274827 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kbzzx"] Dec 02 09:55:05 crc kubenswrapper[4781]: I1202 09:55:05.275116 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kbzzx" podUID="2af2414f-0f9f-418b-b807-4362bf6ee700" containerName="registry-server" containerID="cri-o://86249df209579fc2a9dd3fe2274a019994e5f5df37fcb3429112b3fca07bd1b9" gracePeriod=2 Dec 02 09:55:05 crc kubenswrapper[4781]: I1202 09:55:05.774098 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kbzzx" Dec 02 09:55:05 crc kubenswrapper[4781]: I1202 09:55:05.860620 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2af2414f-0f9f-418b-b807-4362bf6ee700-catalog-content\") pod \"2af2414f-0f9f-418b-b807-4362bf6ee700\" (UID: \"2af2414f-0f9f-418b-b807-4362bf6ee700\") " Dec 02 09:55:05 crc kubenswrapper[4781]: I1202 09:55:05.860797 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtfnk\" (UniqueName: \"kubernetes.io/projected/2af2414f-0f9f-418b-b807-4362bf6ee700-kube-api-access-gtfnk\") pod \"2af2414f-0f9f-418b-b807-4362bf6ee700\" (UID: \"2af2414f-0f9f-418b-b807-4362bf6ee700\") " Dec 02 09:55:05 crc kubenswrapper[4781]: I1202 09:55:05.860855 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2af2414f-0f9f-418b-b807-4362bf6ee700-utilities\") pod \"2af2414f-0f9f-418b-b807-4362bf6ee700\" (UID: \"2af2414f-0f9f-418b-b807-4362bf6ee700\") " Dec 02 09:55:05 crc kubenswrapper[4781]: I1202 09:55:05.861497 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2af2414f-0f9f-418b-b807-4362bf6ee700-utilities" (OuterVolumeSpecName: "utilities") pod "2af2414f-0f9f-418b-b807-4362bf6ee700" (UID: "2af2414f-0f9f-418b-b807-4362bf6ee700"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:55:05 crc kubenswrapper[4781]: I1202 09:55:05.869090 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2af2414f-0f9f-418b-b807-4362bf6ee700-kube-api-access-gtfnk" (OuterVolumeSpecName: "kube-api-access-gtfnk") pod "2af2414f-0f9f-418b-b807-4362bf6ee700" (UID: "2af2414f-0f9f-418b-b807-4362bf6ee700"). InnerVolumeSpecName "kube-api-access-gtfnk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:55:05 crc kubenswrapper[4781]: I1202 09:55:05.915909 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2af2414f-0f9f-418b-b807-4362bf6ee700-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2af2414f-0f9f-418b-b807-4362bf6ee700" (UID: "2af2414f-0f9f-418b-b807-4362bf6ee700"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:55:05 crc kubenswrapper[4781]: I1202 09:55:05.963261 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2af2414f-0f9f-418b-b807-4362bf6ee700-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 09:55:05 crc kubenswrapper[4781]: I1202 09:55:05.963302 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtfnk\" (UniqueName: \"kubernetes.io/projected/2af2414f-0f9f-418b-b807-4362bf6ee700-kube-api-access-gtfnk\") on node \"crc\" DevicePath \"\"" Dec 02 09:55:05 crc kubenswrapper[4781]: I1202 09:55:05.963317 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2af2414f-0f9f-418b-b807-4362bf6ee700-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 09:55:06 crc kubenswrapper[4781]: I1202 09:55:06.128978 4781 generic.go:334] "Generic (PLEG): container finished" podID="2af2414f-0f9f-418b-b807-4362bf6ee700" containerID="86249df209579fc2a9dd3fe2274a019994e5f5df37fcb3429112b3fca07bd1b9" exitCode=0 Dec 02 09:55:06 crc kubenswrapper[4781]: I1202 09:55:06.129030 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kbzzx" Dec 02 09:55:06 crc kubenswrapper[4781]: I1202 09:55:06.129032 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kbzzx" event={"ID":"2af2414f-0f9f-418b-b807-4362bf6ee700","Type":"ContainerDied","Data":"86249df209579fc2a9dd3fe2274a019994e5f5df37fcb3429112b3fca07bd1b9"} Dec 02 09:55:06 crc kubenswrapper[4781]: I1202 09:55:06.129095 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kbzzx" event={"ID":"2af2414f-0f9f-418b-b807-4362bf6ee700","Type":"ContainerDied","Data":"15e36dc6a647f1b089d80cc23fe6b048035af785b3d001bf7895081916dc3b15"} Dec 02 09:55:06 crc kubenswrapper[4781]: I1202 09:55:06.129123 4781 scope.go:117] "RemoveContainer" containerID="86249df209579fc2a9dd3fe2274a019994e5f5df37fcb3429112b3fca07bd1b9" Dec 02 09:55:06 crc kubenswrapper[4781]: I1202 09:55:06.150679 4781 scope.go:117] "RemoveContainer" containerID="75c16bb8b86c860ba3a6dbda2c1f884b5dae83d5fcd264536c938155aa61b7a8" Dec 02 09:55:06 crc kubenswrapper[4781]: I1202 09:55:06.175280 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kbzzx"] Dec 02 09:55:06 crc kubenswrapper[4781]: I1202 09:55:06.184738 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kbzzx"] Dec 02 09:55:06 crc kubenswrapper[4781]: I1202 09:55:06.198138 4781 scope.go:117] "RemoveContainer" containerID="27d630f68eab5f05abb6a19fee812276154e64a860c5fb747d060e5b0a939587" Dec 02 09:55:06 crc kubenswrapper[4781]: I1202 09:55:06.225974 4781 scope.go:117] "RemoveContainer" containerID="86249df209579fc2a9dd3fe2274a019994e5f5df37fcb3429112b3fca07bd1b9" Dec 02 09:55:06 crc kubenswrapper[4781]: E1202 09:55:06.226465 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86249df209579fc2a9dd3fe2274a019994e5f5df37fcb3429112b3fca07bd1b9\": container with ID starting with 86249df209579fc2a9dd3fe2274a019994e5f5df37fcb3429112b3fca07bd1b9 not found: ID does not exist" containerID="86249df209579fc2a9dd3fe2274a019994e5f5df37fcb3429112b3fca07bd1b9" Dec 02 09:55:06 crc kubenswrapper[4781]: I1202 09:55:06.226512 
4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86249df209579fc2a9dd3fe2274a019994e5f5df37fcb3429112b3fca07bd1b9"} err="failed to get container status \"86249df209579fc2a9dd3fe2274a019994e5f5df37fcb3429112b3fca07bd1b9\": rpc error: code = NotFound desc = could not find container \"86249df209579fc2a9dd3fe2274a019994e5f5df37fcb3429112b3fca07bd1b9\": container with ID starting with 86249df209579fc2a9dd3fe2274a019994e5f5df37fcb3429112b3fca07bd1b9 not found: ID does not exist" Dec 02 09:55:06 crc kubenswrapper[4781]: I1202 09:55:06.226546 4781 scope.go:117] "RemoveContainer" containerID="75c16bb8b86c860ba3a6dbda2c1f884b5dae83d5fcd264536c938155aa61b7a8" Dec 02 09:55:06 crc kubenswrapper[4781]: E1202 09:55:06.227040 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75c16bb8b86c860ba3a6dbda2c1f884b5dae83d5fcd264536c938155aa61b7a8\": container with ID starting with 75c16bb8b86c860ba3a6dbda2c1f884b5dae83d5fcd264536c938155aa61b7a8 not found: ID does not exist" containerID="75c16bb8b86c860ba3a6dbda2c1f884b5dae83d5fcd264536c938155aa61b7a8" Dec 02 09:55:06 crc kubenswrapper[4781]: I1202 09:55:06.227071 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75c16bb8b86c860ba3a6dbda2c1f884b5dae83d5fcd264536c938155aa61b7a8"} err="failed to get container status \"75c16bb8b86c860ba3a6dbda2c1f884b5dae83d5fcd264536c938155aa61b7a8\": rpc error: code = NotFound desc = could not find container \"75c16bb8b86c860ba3a6dbda2c1f884b5dae83d5fcd264536c938155aa61b7a8\": container with ID starting with 75c16bb8b86c860ba3a6dbda2c1f884b5dae83d5fcd264536c938155aa61b7a8 not found: ID does not exist" Dec 02 09:55:06 crc kubenswrapper[4781]: I1202 09:55:06.227092 4781 scope.go:117] "RemoveContainer" containerID="27d630f68eab5f05abb6a19fee812276154e64a860c5fb747d060e5b0a939587" Dec 02 09:55:06 crc kubenswrapper[4781]: E1202 09:55:06.227321 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27d630f68eab5f05abb6a19fee812276154e64a860c5fb747d060e5b0a939587\": container with ID starting with 27d630f68eab5f05abb6a19fee812276154e64a860c5fb747d060e5b0a939587 not found: ID does not exist" containerID="27d630f68eab5f05abb6a19fee812276154e64a860c5fb747d060e5b0a939587" Dec 02 09:55:06 crc kubenswrapper[4781]: I1202 09:55:06.227343 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27d630f68eab5f05abb6a19fee812276154e64a860c5fb747d060e5b0a939587"} err="failed to get container status \"27d630f68eab5f05abb6a19fee812276154e64a860c5fb747d060e5b0a939587\": rpc error: code = NotFound desc = could not find container \"27d630f68eab5f05abb6a19fee812276154e64a860c5fb747d060e5b0a939587\": container with ID starting with 27d630f68eab5f05abb6a19fee812276154e64a860c5fb747d060e5b0a939587 not found: ID does not exist" Dec 02 09:55:07 crc kubenswrapper[4781]: I1202 09:55:07.511202 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2af2414f-0f9f-418b-b807-4362bf6ee700" path="/var/lib/kubelet/pods/2af2414f-0f9f-418b-b807-4362bf6ee700/volumes" Dec 02 09:55:17 crc kubenswrapper[4781]: I1202 09:55:17.223068 4781 generic.go:334] "Generic (PLEG): container finished" podID="8804a3f3-f23a-4a85-8e45-9f92f90c5e9b" containerID="b1443e03dcb72c5b40607b9cd99b32b21ac48a68dba31ff2666d8f5a1bfd24df" exitCode=0 Dec 02 09:55:17 crc kubenswrapper[4781]: 
I1202 09:55:17.223580 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wbgtf" event={"ID":"8804a3f3-f23a-4a85-8e45-9f92f90c5e9b","Type":"ContainerDied","Data":"b1443e03dcb72c5b40607b9cd99b32b21ac48a68dba31ff2666d8f5a1bfd24df"} Dec 02 09:55:18 crc kubenswrapper[4781]: I1202 09:55:18.645755 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wbgtf" Dec 02 09:55:18 crc kubenswrapper[4781]: I1202 09:55:18.784360 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tngs\" (UniqueName: \"kubernetes.io/projected/8804a3f3-f23a-4a85-8e45-9f92f90c5e9b-kube-api-access-7tngs\") pod \"8804a3f3-f23a-4a85-8e45-9f92f90c5e9b\" (UID: \"8804a3f3-f23a-4a85-8e45-9f92f90c5e9b\") " Dec 02 09:55:18 crc kubenswrapper[4781]: I1202 09:55:18.784425 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8804a3f3-f23a-4a85-8e45-9f92f90c5e9b-ssh-key\") pod \"8804a3f3-f23a-4a85-8e45-9f92f90c5e9b\" (UID: \"8804a3f3-f23a-4a85-8e45-9f92f90c5e9b\") " Dec 02 09:55:18 crc kubenswrapper[4781]: I1202 09:55:18.784472 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8804a3f3-f23a-4a85-8e45-9f92f90c5e9b-inventory\") pod \"8804a3f3-f23a-4a85-8e45-9f92f90c5e9b\" (UID: \"8804a3f3-f23a-4a85-8e45-9f92f90c5e9b\") " Dec 02 09:55:18 crc kubenswrapper[4781]: I1202 09:55:18.796966 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8804a3f3-f23a-4a85-8e45-9f92f90c5e9b-kube-api-access-7tngs" (OuterVolumeSpecName: "kube-api-access-7tngs") pod "8804a3f3-f23a-4a85-8e45-9f92f90c5e9b" (UID: "8804a3f3-f23a-4a85-8e45-9f92f90c5e9b"). InnerVolumeSpecName "kube-api-access-7tngs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:55:18 crc kubenswrapper[4781]: I1202 09:55:18.815121 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8804a3f3-f23a-4a85-8e45-9f92f90c5e9b-inventory" (OuterVolumeSpecName: "inventory") pod "8804a3f3-f23a-4a85-8e45-9f92f90c5e9b" (UID: "8804a3f3-f23a-4a85-8e45-9f92f90c5e9b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:55:18 crc kubenswrapper[4781]: I1202 09:55:18.815218 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8804a3f3-f23a-4a85-8e45-9f92f90c5e9b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8804a3f3-f23a-4a85-8e45-9f92f90c5e9b" (UID: "8804a3f3-f23a-4a85-8e45-9f92f90c5e9b"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:55:18 crc kubenswrapper[4781]: I1202 09:55:18.886159 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tngs\" (UniqueName: \"kubernetes.io/projected/8804a3f3-f23a-4a85-8e45-9f92f90c5e9b-kube-api-access-7tngs\") on node \"crc\" DevicePath \"\"" Dec 02 09:55:18 crc kubenswrapper[4781]: I1202 09:55:18.886199 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8804a3f3-f23a-4a85-8e45-9f92f90c5e9b-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 09:55:18 crc kubenswrapper[4781]: I1202 09:55:18.886209 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8804a3f3-f23a-4a85-8e45-9f92f90c5e9b-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 09:55:19 crc kubenswrapper[4781]: I1202 09:55:19.245790 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wbgtf" event={"ID":"8804a3f3-f23a-4a85-8e45-9f92f90c5e9b","Type":"ContainerDied","Data":"9a552e3d8f6d4e8d778ee226c88a6f1074347b59fbd61c3f1c819ead4f1cbe46"} Dec 02 09:55:19 crc kubenswrapper[4781]: I1202 09:55:19.246164 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a552e3d8f6d4e8d778ee226c88a6f1074347b59fbd61c3f1c819ead4f1cbe46" Dec 02 09:55:19 crc kubenswrapper[4781]: I1202 09:55:19.245872 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wbgtf" Dec 02 09:55:19 crc kubenswrapper[4781]: I1202 09:55:19.321400 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-95445"] Dec 02 09:55:19 crc kubenswrapper[4781]: E1202 09:55:19.321881 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2af2414f-0f9f-418b-b807-4362bf6ee700" containerName="registry-server" Dec 02 09:55:19 crc kubenswrapper[4781]: I1202 09:55:19.321904 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2af2414f-0f9f-418b-b807-4362bf6ee700" containerName="registry-server" Dec 02 09:55:19 crc kubenswrapper[4781]: E1202 09:55:19.321949 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2af2414f-0f9f-418b-b807-4362bf6ee700" containerName="extract-utilities" Dec 02 09:55:19 crc kubenswrapper[4781]: I1202 09:55:19.321958 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2af2414f-0f9f-418b-b807-4362bf6ee700" containerName="extract-utilities" Dec 02 09:55:19 crc kubenswrapper[4781]: E1202 09:55:19.321989 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8804a3f3-f23a-4a85-8e45-9f92f90c5e9b" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 02 09:55:19 crc kubenswrapper[4781]: I1202 09:55:19.322000 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="8804a3f3-f23a-4a85-8e45-9f92f90c5e9b" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 02 09:55:19 crc kubenswrapper[4781]: E1202 09:55:19.322027 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2af2414f-0f9f-418b-b807-4362bf6ee700" containerName="extract-content" Dec 02 09:55:19 crc kubenswrapper[4781]: I1202 09:55:19.322034 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2af2414f-0f9f-418b-b807-4362bf6ee700" containerName="extract-content" Dec 02 09:55:19 crc kubenswrapper[4781]: I1202 09:55:19.322246 4781 
memory_manager.go:354] "RemoveStaleState removing state" podUID="2af2414f-0f9f-418b-b807-4362bf6ee700" containerName="registry-server" Dec 02 09:55:19 crc kubenswrapper[4781]: I1202 09:55:19.322267 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="8804a3f3-f23a-4a85-8e45-9f92f90c5e9b" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 02 09:55:19 crc kubenswrapper[4781]: I1202 09:55:19.323177 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-95445" Dec 02 09:55:19 crc kubenswrapper[4781]: I1202 09:55:19.326163 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zl624" Dec 02 09:55:19 crc kubenswrapper[4781]: I1202 09:55:19.326315 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 09:55:19 crc kubenswrapper[4781]: I1202 09:55:19.326359 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 09:55:19 crc kubenswrapper[4781]: I1202 09:55:19.326700 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 09:55:19 crc kubenswrapper[4781]: I1202 09:55:19.332881 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-95445"] Dec 02 09:55:19 crc kubenswrapper[4781]: I1202 09:55:19.496838 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm9xj\" (UniqueName: \"kubernetes.io/projected/321f18fc-759a-4eb7-bbb0-f230b7002932-kube-api-access-zm9xj\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-95445\" (UID: \"321f18fc-759a-4eb7-bbb0-f230b7002932\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-95445" Dec 02 09:55:19 crc kubenswrapper[4781]: I1202 09:55:19.496907 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/321f18fc-759a-4eb7-bbb0-f230b7002932-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-95445\" (UID: \"321f18fc-759a-4eb7-bbb0-f230b7002932\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-95445" Dec 02 09:55:19 crc kubenswrapper[4781]: I1202 09:55:19.496986 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/321f18fc-759a-4eb7-bbb0-f230b7002932-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-95445\" (UID: \"321f18fc-759a-4eb7-bbb0-f230b7002932\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-95445" Dec 02 09:55:19 crc kubenswrapper[4781]: I1202 09:55:19.598965 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/321f18fc-759a-4eb7-bbb0-f230b7002932-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-95445\" (UID: \"321f18fc-759a-4eb7-bbb0-f230b7002932\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-95445" Dec 02 09:55:19 crc kubenswrapper[4781]: I1202 09:55:19.599034 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/321f18fc-759a-4eb7-bbb0-f230b7002932-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-95445\" (UID: \"321f18fc-759a-4eb7-bbb0-f230b7002932\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-95445" Dec 02 09:55:19 crc kubenswrapper[4781]: I1202 09:55:19.599262 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zm9xj\" (UniqueName: \"kubernetes.io/projected/321f18fc-759a-4eb7-bbb0-f230b7002932-kube-api-access-zm9xj\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-95445\" (UID: \"321f18fc-759a-4eb7-bbb0-f230b7002932\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-95445" Dec 02 09:55:19 crc kubenswrapper[4781]: I1202 09:55:19.602418 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/321f18fc-759a-4eb7-bbb0-f230b7002932-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-95445\" (UID: \"321f18fc-759a-4eb7-bbb0-f230b7002932\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-95445" Dec 02 09:55:19 crc kubenswrapper[4781]: I1202 09:55:19.604473 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/321f18fc-759a-4eb7-bbb0-f230b7002932-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-95445\" (UID: \"321f18fc-759a-4eb7-bbb0-f230b7002932\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-95445" Dec 02 09:55:19 crc kubenswrapper[4781]: I1202 09:55:19.622701 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm9xj\" (UniqueName: \"kubernetes.io/projected/321f18fc-759a-4eb7-bbb0-f230b7002932-kube-api-access-zm9xj\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-95445\" (UID: \"321f18fc-759a-4eb7-bbb0-f230b7002932\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-95445" Dec 02 09:55:19 crc kubenswrapper[4781]: I1202 09:55:19.639251 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-95445" Dec 02 09:55:20 crc kubenswrapper[4781]: I1202 09:55:20.129947 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-95445"] Dec 02 09:55:20 crc kubenswrapper[4781]: I1202 09:55:20.255704 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-95445" event={"ID":"321f18fc-759a-4eb7-bbb0-f230b7002932","Type":"ContainerStarted","Data":"a4f159c153f9e89b7cea78c4fb12978666f9bd1fa0acb5bb4d35b5bedf5ae735"} Dec 02 09:55:21 crc kubenswrapper[4781]: I1202 09:55:21.275319 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-95445" event={"ID":"321f18fc-759a-4eb7-bbb0-f230b7002932","Type":"ContainerStarted","Data":"e41186aeac62d5fe86c7946f1824103b35ab5e4d43bd0b913dcb42beb206e028"} Dec 02 09:55:21 crc kubenswrapper[4781]: I1202 09:55:21.293629 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-95445" podStartSLOduration=1.503879263 podStartE2EDuration="2.293613373s" podCreationTimestamp="2025-12-02 09:55:19 +0000 UTC" firstStartedPulling="2025-12-02 09:55:20.138032461 +0000 UTC m=+2082.961906330" lastFinishedPulling="2025-12-02 09:55:20.927766561 +0000 UTC m=+2083.751640440" observedRunningTime="2025-12-02 09:55:21.290064158 +0000 UTC m=+2084.113938057" watchObservedRunningTime="2025-12-02 09:55:21.293613373 +0000 UTC m=+2084.117487252" Dec 02 09:55:30 crc kubenswrapper[4781]: I1202 09:55:30.412384 4781 patch_prober.go:28] interesting pod/machine-config-daemon-pzntm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 09:55:30 crc kubenswrapper[4781]: I1202 09:55:30.412960 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 09:55:42 crc kubenswrapper[4781]: I1202 09:55:42.195731 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ps4r5"] Dec 02 09:55:42 crc kubenswrapper[4781]: I1202 09:55:42.198223 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ps4r5" Dec 02 09:55:42 crc kubenswrapper[4781]: I1202 09:55:42.207900 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ps4r5"] Dec 02 09:55:42 crc kubenswrapper[4781]: I1202 09:55:42.259478 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fc81462-191b-4388-b34c-d1c9ce5dd5b7-utilities\") pod \"redhat-marketplace-ps4r5\" (UID: \"5fc81462-191b-4388-b34c-d1c9ce5dd5b7\") " pod="openshift-marketplace/redhat-marketplace-ps4r5" Dec 02 09:55:42 crc kubenswrapper[4781]: I1202 09:55:42.259899 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fc81462-191b-4388-b34c-d1c9ce5dd5b7-catalog-content\") pod \"redhat-marketplace-ps4r5\" (UID: \"5fc81462-191b-4388-b34c-d1c9ce5dd5b7\") " pod="openshift-marketplace/redhat-marketplace-ps4r5" Dec 02 09:55:42 crc kubenswrapper[4781]: I1202 09:55:42.259957 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5cd9\" (UniqueName: \"kubernetes.io/projected/5fc81462-191b-4388-b34c-d1c9ce5dd5b7-kube-api-access-x5cd9\") pod \"redhat-marketplace-ps4r5\" (UID: \"5fc81462-191b-4388-b34c-d1c9ce5dd5b7\") " pod="openshift-marketplace/redhat-marketplace-ps4r5" Dec 02 09:55:42 crc kubenswrapper[4781]: I1202 09:55:42.362082 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fc81462-191b-4388-b34c-d1c9ce5dd5b7-catalog-content\") pod \"redhat-marketplace-ps4r5\" (UID: \"5fc81462-191b-4388-b34c-d1c9ce5dd5b7\") " pod="openshift-marketplace/redhat-marketplace-ps4r5" Dec 02 09:55:42 crc kubenswrapper[4781]: I1202 09:55:42.362132 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5cd9\" (UniqueName: \"kubernetes.io/projected/5fc81462-191b-4388-b34c-d1c9ce5dd5b7-kube-api-access-x5cd9\") pod \"redhat-marketplace-ps4r5\" (UID: \"5fc81462-191b-4388-b34c-d1c9ce5dd5b7\") " pod="openshift-marketplace/redhat-marketplace-ps4r5" Dec 02 09:55:42 crc kubenswrapper[4781]: I1202 09:55:42.362213 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fc81462-191b-4388-b34c-d1c9ce5dd5b7-utilities\") pod \"redhat-marketplace-ps4r5\" (UID: \"5fc81462-191b-4388-b34c-d1c9ce5dd5b7\") " pod="openshift-marketplace/redhat-marketplace-ps4r5" Dec 02 09:55:42 crc kubenswrapper[4781]: I1202 09:55:42.362648 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fc81462-191b-4388-b34c-d1c9ce5dd5b7-catalog-content\") pod \"redhat-marketplace-ps4r5\" (UID: \"5fc81462-191b-4388-b34c-d1c9ce5dd5b7\") " pod="openshift-marketplace/redhat-marketplace-ps4r5" Dec 02 09:55:42 crc kubenswrapper[4781]: I1202 09:55:42.362776 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fc81462-191b-4388-b34c-d1c9ce5dd5b7-utilities\") pod \"redhat-marketplace-ps4r5\" (UID: \"5fc81462-191b-4388-b34c-d1c9ce5dd5b7\") " pod="openshift-marketplace/redhat-marketplace-ps4r5" Dec 02 09:55:42 crc kubenswrapper[4781]: I1202 09:55:42.384870 4781 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-x5cd9\" (UniqueName: \"kubernetes.io/projected/5fc81462-191b-4388-b34c-d1c9ce5dd5b7-kube-api-access-x5cd9\") pod \"redhat-marketplace-ps4r5\" (UID: \"5fc81462-191b-4388-b34c-d1c9ce5dd5b7\") " pod="openshift-marketplace/redhat-marketplace-ps4r5" Dec 02 09:55:42 crc kubenswrapper[4781]: I1202 09:55:42.516160 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ps4r5" Dec 02 09:55:43 crc kubenswrapper[4781]: I1202 09:55:43.043253 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ps4r5"] Dec 02 09:55:43 crc kubenswrapper[4781]: W1202 09:55:43.048579 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fc81462_191b_4388_b34c_d1c9ce5dd5b7.slice/crio-316002bf36a8125a19b97db7d892a0d4e0f9ba624fde7389d68ef9a69a3a4e32 WatchSource:0}: Error finding container 316002bf36a8125a19b97db7d892a0d4e0f9ba624fde7389d68ef9a69a3a4e32: Status 404 returned error can't find the container with id 316002bf36a8125a19b97db7d892a0d4e0f9ba624fde7389d68ef9a69a3a4e32 Dec 02 09:55:43 crc kubenswrapper[4781]: I1202 09:55:43.472087 4781 generic.go:334] "Generic (PLEG): container finished" podID="5fc81462-191b-4388-b34c-d1c9ce5dd5b7" containerID="ce40f6bb85be225a7e37700aa6d0f05ead0b0f2cb0ee1c47b1b64fa690b0c28a" exitCode=0 Dec 02 09:55:43 crc kubenswrapper[4781]: I1202 09:55:43.472152 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ps4r5" event={"ID":"5fc81462-191b-4388-b34c-d1c9ce5dd5b7","Type":"ContainerDied","Data":"ce40f6bb85be225a7e37700aa6d0f05ead0b0f2cb0ee1c47b1b64fa690b0c28a"} Dec 02 09:55:43 crc kubenswrapper[4781]: I1202 09:55:43.472348 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ps4r5" event={"ID":"5fc81462-191b-4388-b34c-d1c9ce5dd5b7","Type":"ContainerStarted","Data":"316002bf36a8125a19b97db7d892a0d4e0f9ba624fde7389d68ef9a69a3a4e32"} Dec 02 09:55:43 crc kubenswrapper[4781]: I1202 09:55:43.534806 4781 scope.go:117] "RemoveContainer" containerID="4edc7af81473d46dc2e001614f6757ba9772e247b0e410a508c19e71f50f260e" Dec 02 09:55:43 crc kubenswrapper[4781]: I1202 09:55:43.575343 4781 scope.go:117] "RemoveContainer" containerID="c3513b69cc4c77c6c72076cad2760bf16cc2fb9525a398d12abefcdfdfd44aa7" Dec 02 09:55:43 crc kubenswrapper[4781]: I1202 09:55:43.625262 4781 scope.go:117] "RemoveContainer" containerID="544d424dc3bd43c5b3a55a41876f880fe1ebd6d93cf6b42073aad9a19a8c465a" Dec 02 09:55:45 crc kubenswrapper[4781]: I1202 09:55:45.489820 4781 generic.go:334] "Generic (PLEG): container finished" podID="5fc81462-191b-4388-b34c-d1c9ce5dd5b7" containerID="1a2c33c434a7d63128ad8034ce4a25a835f3323b46b4ce0a8855c2fb4d692a4e" exitCode=0 Dec 02 09:55:45 crc kubenswrapper[4781]: I1202 09:55:45.489880 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ps4r5" event={"ID":"5fc81462-191b-4388-b34c-d1c9ce5dd5b7","Type":"ContainerDied","Data":"1a2c33c434a7d63128ad8034ce4a25a835f3323b46b4ce0a8855c2fb4d692a4e"} Dec 02 09:55:46 crc kubenswrapper[4781]: I1202 09:55:46.501997 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ps4r5" event={"ID":"5fc81462-191b-4388-b34c-d1c9ce5dd5b7","Type":"ContainerStarted","Data":"f2afdd38f10a4aebf4c170ed3847ac5af68adf5d65896eceb7d2244be90898e7"} Dec 02 09:55:46 crc 
kubenswrapper[4781]: I1202 09:55:46.552687 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ps4r5" podStartSLOduration=1.98850282 podStartE2EDuration="4.552667564s" podCreationTimestamp="2025-12-02 09:55:42 +0000 UTC" firstStartedPulling="2025-12-02 09:55:43.47403218 +0000 UTC m=+2106.297906059" lastFinishedPulling="2025-12-02 09:55:46.038196914 +0000 UTC m=+2108.862070803" observedRunningTime="2025-12-02 09:55:46.539208371 +0000 UTC m=+2109.363082260" watchObservedRunningTime="2025-12-02 09:55:46.552667564 +0000 UTC m=+2109.376541443" Dec 02 09:55:50 crc kubenswrapper[4781]: I1202 09:55:50.041546 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-br5j5"] Dec 02 09:55:50 crc kubenswrapper[4781]: I1202 09:55:50.050909 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-br5j5"] Dec 02 09:55:51 crc kubenswrapper[4781]: I1202 09:55:51.036786 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-bq87j"] Dec 02 09:55:51 crc kubenswrapper[4781]: I1202 09:55:51.052417 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-bq87j"] Dec 02 09:55:51 crc kubenswrapper[4781]: I1202 09:55:51.514399 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51f721e2-e8fd-4ae1-89d9-fe7272e8246e" path="/var/lib/kubelet/pods/51f721e2-e8fd-4ae1-89d9-fe7272e8246e/volumes" Dec 02 09:55:51 crc kubenswrapper[4781]: I1202 09:55:51.515023 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9418aedd-84eb-4ff9-89f8-831695e5471e" path="/var/lib/kubelet/pods/9418aedd-84eb-4ff9-89f8-831695e5471e/volumes" Dec 02 09:55:52 crc kubenswrapper[4781]: I1202 09:55:52.025241 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-mfnct"] Dec 02 09:55:52 crc kubenswrapper[4781]: I1202 09:55:52.038660 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-mfnct"] Dec 02 09:55:52 crc kubenswrapper[4781]: I1202 09:55:52.516916 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ps4r5" Dec 02 09:55:52 crc kubenswrapper[4781]: I1202 09:55:52.516988 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ps4r5" Dec 02 09:55:52 crc kubenswrapper[4781]: I1202 09:55:52.574826 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ps4r5" Dec 02 09:55:52 crc kubenswrapper[4781]: I1202 09:55:52.637785 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ps4r5" Dec 02 09:55:52 crc kubenswrapper[4781]: I1202 09:55:52.810787 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ps4r5"] Dec 02 09:55:53 crc kubenswrapper[4781]: I1202 09:55:53.523430 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8e968b8-6eb0-41d7-beed-8b2bf7006359" path="/var/lib/kubelet/pods/c8e968b8-6eb0-41d7-beed-8b2bf7006359/volumes" Dec 02 09:55:54 crc kubenswrapper[4781]: I1202 09:55:54.573915 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ps4r5" podUID="5fc81462-191b-4388-b34c-d1c9ce5dd5b7" containerName="registry-server" 
containerID="cri-o://f2afdd38f10a4aebf4c170ed3847ac5af68adf5d65896eceb7d2244be90898e7" gracePeriod=2 Dec 02 09:55:55 crc kubenswrapper[4781]: I1202 09:55:55.005810 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ps4r5" Dec 02 09:55:55 crc kubenswrapper[4781]: I1202 09:55:55.102793 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fc81462-191b-4388-b34c-d1c9ce5dd5b7-catalog-content\") pod \"5fc81462-191b-4388-b34c-d1c9ce5dd5b7\" (UID: \"5fc81462-191b-4388-b34c-d1c9ce5dd5b7\") " Dec 02 09:55:55 crc kubenswrapper[4781]: I1202 09:55:55.110993 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5cd9\" (UniqueName: \"kubernetes.io/projected/5fc81462-191b-4388-b34c-d1c9ce5dd5b7-kube-api-access-x5cd9\") pod \"5fc81462-191b-4388-b34c-d1c9ce5dd5b7\" (UID: \"5fc81462-191b-4388-b34c-d1c9ce5dd5b7\") " Dec 02 09:55:55 crc kubenswrapper[4781]: I1202 09:55:55.111241 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fc81462-191b-4388-b34c-d1c9ce5dd5b7-utilities\") pod \"5fc81462-191b-4388-b34c-d1c9ce5dd5b7\" (UID: \"5fc81462-191b-4388-b34c-d1c9ce5dd5b7\") " Dec 02 09:55:55 crc kubenswrapper[4781]: I1202 09:55:55.113885 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fc81462-191b-4388-b34c-d1c9ce5dd5b7-utilities" (OuterVolumeSpecName: "utilities") pod "5fc81462-191b-4388-b34c-d1c9ce5dd5b7" (UID: "5fc81462-191b-4388-b34c-d1c9ce5dd5b7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:55:55 crc kubenswrapper[4781]: I1202 09:55:55.119902 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fc81462-191b-4388-b34c-d1c9ce5dd5b7-kube-api-access-x5cd9" (OuterVolumeSpecName: "kube-api-access-x5cd9") pod "5fc81462-191b-4388-b34c-d1c9ce5dd5b7" (UID: "5fc81462-191b-4388-b34c-d1c9ce5dd5b7"). InnerVolumeSpecName "kube-api-access-x5cd9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:55:55 crc kubenswrapper[4781]: I1202 09:55:55.124675 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fc81462-191b-4388-b34c-d1c9ce5dd5b7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5fc81462-191b-4388-b34c-d1c9ce5dd5b7" (UID: "5fc81462-191b-4388-b34c-d1c9ce5dd5b7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:55:55 crc kubenswrapper[4781]: I1202 09:55:55.213214 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fc81462-191b-4388-b34c-d1c9ce5dd5b7-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 09:55:55 crc kubenswrapper[4781]: I1202 09:55:55.213256 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fc81462-191b-4388-b34c-d1c9ce5dd5b7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 09:55:55 crc kubenswrapper[4781]: I1202 09:55:55.213267 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5cd9\" (UniqueName: \"kubernetes.io/projected/5fc81462-191b-4388-b34c-d1c9ce5dd5b7-kube-api-access-x5cd9\") on node \"crc\" DevicePath \"\"" Dec 02 09:55:55 crc kubenswrapper[4781]: I1202 09:55:55.587369 4781 generic.go:334] "Generic (PLEG): container finished" podID="5fc81462-191b-4388-b34c-d1c9ce5dd5b7" containerID="f2afdd38f10a4aebf4c170ed3847ac5af68adf5d65896eceb7d2244be90898e7" exitCode=0 Dec 02 09:55:55 crc kubenswrapper[4781]: I1202 09:55:55.587422 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ps4r5" Dec 02 09:55:55 crc kubenswrapper[4781]: I1202 09:55:55.587416 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ps4r5" event={"ID":"5fc81462-191b-4388-b34c-d1c9ce5dd5b7","Type":"ContainerDied","Data":"f2afdd38f10a4aebf4c170ed3847ac5af68adf5d65896eceb7d2244be90898e7"} Dec 02 09:55:55 crc kubenswrapper[4781]: I1202 09:55:55.587585 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ps4r5" event={"ID":"5fc81462-191b-4388-b34c-d1c9ce5dd5b7","Type":"ContainerDied","Data":"316002bf36a8125a19b97db7d892a0d4e0f9ba624fde7389d68ef9a69a3a4e32"} Dec 02 09:55:55 crc kubenswrapper[4781]: I1202 09:55:55.587611 4781 scope.go:117] "RemoveContainer" containerID="f2afdd38f10a4aebf4c170ed3847ac5af68adf5d65896eceb7d2244be90898e7" Dec 02 09:55:55 crc kubenswrapper[4781]: I1202 09:55:55.639984 4781 scope.go:117] "RemoveContainer" containerID="1a2c33c434a7d63128ad8034ce4a25a835f3323b46b4ce0a8855c2fb4d692a4e" Dec 02 09:55:55 crc kubenswrapper[4781]: I1202 09:55:55.643979 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ps4r5"] Dec 02 09:55:55 crc kubenswrapper[4781]: I1202 09:55:55.661847 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ps4r5"] Dec 02 09:55:55 crc kubenswrapper[4781]: I1202 09:55:55.692825 4781 scope.go:117] "RemoveContainer" containerID="ce40f6bb85be225a7e37700aa6d0f05ead0b0f2cb0ee1c47b1b64fa690b0c28a" Dec 02 09:55:55 crc kubenswrapper[4781]: I1202 09:55:55.742961 4781 scope.go:117] "RemoveContainer" containerID="f2afdd38f10a4aebf4c170ed3847ac5af68adf5d65896eceb7d2244be90898e7" Dec 02 09:55:55 crc kubenswrapper[4781]: E1202 09:55:55.743456 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2afdd38f10a4aebf4c170ed3847ac5af68adf5d65896eceb7d2244be90898e7\": container with ID starting with f2afdd38f10a4aebf4c170ed3847ac5af68adf5d65896eceb7d2244be90898e7 not found: ID does not exist" containerID="f2afdd38f10a4aebf4c170ed3847ac5af68adf5d65896eceb7d2244be90898e7" Dec 02 09:55:55 crc kubenswrapper[4781]: I1202 09:55:55.743487 4781 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2afdd38f10a4aebf4c170ed3847ac5af68adf5d65896eceb7d2244be90898e7"} err="failed to get container status \"f2afdd38f10a4aebf4c170ed3847ac5af68adf5d65896eceb7d2244be90898e7\": rpc error: code = NotFound desc = could not find container \"f2afdd38f10a4aebf4c170ed3847ac5af68adf5d65896eceb7d2244be90898e7\": container with ID starting with f2afdd38f10a4aebf4c170ed3847ac5af68adf5d65896eceb7d2244be90898e7 not found: ID does not exist" Dec 02 09:55:55 crc kubenswrapper[4781]: I1202 09:55:55.743513 4781 scope.go:117] "RemoveContainer" containerID="1a2c33c434a7d63128ad8034ce4a25a835f3323b46b4ce0a8855c2fb4d692a4e" Dec 02 09:55:55 crc kubenswrapper[4781]: E1202 09:55:55.743831 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a2c33c434a7d63128ad8034ce4a25a835f3323b46b4ce0a8855c2fb4d692a4e\": container with ID starting with 1a2c33c434a7d63128ad8034ce4a25a835f3323b46b4ce0a8855c2fb4d692a4e not found: ID does not exist" containerID="1a2c33c434a7d63128ad8034ce4a25a835f3323b46b4ce0a8855c2fb4d692a4e" Dec 02 09:55:55 crc kubenswrapper[4781]: I1202 09:55:55.743898 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a2c33c434a7d63128ad8034ce4a25a835f3323b46b4ce0a8855c2fb4d692a4e"} err="failed to get container status \"1a2c33c434a7d63128ad8034ce4a25a835f3323b46b4ce0a8855c2fb4d692a4e\": rpc error: code = NotFound desc = could not find container \"1a2c33c434a7d63128ad8034ce4a25a835f3323b46b4ce0a8855c2fb4d692a4e\": container with ID starting with 1a2c33c434a7d63128ad8034ce4a25a835f3323b46b4ce0a8855c2fb4d692a4e not found: ID does not exist" Dec 02 09:55:55 crc kubenswrapper[4781]: I1202 09:55:55.744021 4781 scope.go:117] "RemoveContainer" containerID="ce40f6bb85be225a7e37700aa6d0f05ead0b0f2cb0ee1c47b1b64fa690b0c28a" Dec 02 09:55:55 crc kubenswrapper[4781]: E1202 09:55:55.744335 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce40f6bb85be225a7e37700aa6d0f05ead0b0f2cb0ee1c47b1b64fa690b0c28a\": container with ID starting with ce40f6bb85be225a7e37700aa6d0f05ead0b0f2cb0ee1c47b1b64fa690b0c28a not found: ID does not exist" containerID="ce40f6bb85be225a7e37700aa6d0f05ead0b0f2cb0ee1c47b1b64fa690b0c28a" Dec 02 09:55:55 crc kubenswrapper[4781]: I1202 09:55:55.744357 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce40f6bb85be225a7e37700aa6d0f05ead0b0f2cb0ee1c47b1b64fa690b0c28a"} err="failed to get container status \"ce40f6bb85be225a7e37700aa6d0f05ead0b0f2cb0ee1c47b1b64fa690b0c28a\": rpc error: code = NotFound desc = could not find container \"ce40f6bb85be225a7e37700aa6d0f05ead0b0f2cb0ee1c47b1b64fa690b0c28a\": container with ID starting with ce40f6bb85be225a7e37700aa6d0f05ead0b0f2cb0ee1c47b1b64fa690b0c28a not found: ID does not exist" Dec 02 09:55:57 crc kubenswrapper[4781]: I1202 09:55:57.517301 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fc81462-191b-4388-b34c-d1c9ce5dd5b7" path="/var/lib/kubelet/pods/5fc81462-191b-4388-b34c-d1c9ce5dd5b7/volumes" Dec 02 09:56:00 crc kubenswrapper[4781]: I1202 09:56:00.412636 4781 patch_prober.go:28] interesting pod/machine-config-daemon-pzntm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 09:56:00 crc kubenswrapper[4781]: I1202 09:56:00.413062 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 09:56:02 crc kubenswrapper[4781]: I1202 09:56:02.054501 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-dd8rn"] Dec 02 09:56:02 crc kubenswrapper[4781]: I1202 09:56:02.061422 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-6q5kh"] Dec 02 09:56:02 crc kubenswrapper[4781]: I1202 09:56:02.070603 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-dd8rn"] Dec 02 09:56:02 crc kubenswrapper[4781]: I1202 09:56:02.082087 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-6q5kh"] Dec 02 09:56:03 crc kubenswrapper[4781]: I1202 09:56:03.510684 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6140504f-30c8-4ec8-9659-fc4f2795c6a2" path="/var/lib/kubelet/pods/6140504f-30c8-4ec8-9659-fc4f2795c6a2/volumes" Dec 02 09:56:03 crc kubenswrapper[4781]: I1202 09:56:03.511737 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2718c51-467e-407a-99b1-266956fcacfa" path="/var/lib/kubelet/pods/e2718c51-467e-407a-99b1-266956fcacfa/volumes" Dec 02 09:56:04 crc kubenswrapper[4781]: I1202 09:56:04.038599 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-7h6r2"] Dec 02 09:56:04 crc kubenswrapper[4781]: I1202 09:56:04.046431 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-8af6-account-create-update-8c6xl"] Dec 02 09:56:04 crc kubenswrapper[4781]: I1202 09:56:04.059626 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-1694-account-create-update-9dbhz"] Dec 02 09:56:04 crc kubenswrapper[4781]: I1202 09:56:04.068328 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-5a93-account-create-update-48m59"] Dec 02 09:56:04 crc kubenswrapper[4781]: I1202 09:56:04.076162 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-7h6r2"] Dec 02 09:56:04 crc kubenswrapper[4781]: I1202 09:56:04.083526 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-8af6-account-create-update-8c6xl"] Dec 02 09:56:04 crc kubenswrapper[4781]: I1202 09:56:04.091075 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-5a93-account-create-update-48m59"] Dec 02 09:56:04 crc kubenswrapper[4781]: I1202 09:56:04.098244 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-1694-account-create-update-9dbhz"] Dec 02 09:56:05 crc kubenswrapper[4781]: I1202 09:56:05.511660 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9416e5ad-d0d8-404a-8fe8-8a5648030542" path="/var/lib/kubelet/pods/9416e5ad-d0d8-404a-8fe8-8a5648030542/volumes" Dec 02 09:56:05 crc kubenswrapper[4781]: I1202 09:56:05.512493 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94a18aa9-0687-4427-942c-d343a3f0c5f4" path="/var/lib/kubelet/pods/94a18aa9-0687-4427-942c-d343a3f0c5f4/volumes" Dec 02 09:56:05 crc kubenswrapper[4781]: I1202 09:56:05.513019 
4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1bda83d-ff07-4b0c-a978-a22373590d0f" path="/var/lib/kubelet/pods/a1bda83d-ff07-4b0c-a978-a22373590d0f/volumes" Dec 02 09:56:05 crc kubenswrapper[4781]: I1202 09:56:05.513569 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a89db488-2448-4a65-82ee-08472553ad38" path="/var/lib/kubelet/pods/a89db488-2448-4a65-82ee-08472553ad38/volumes" Dec 02 09:56:30 crc kubenswrapper[4781]: I1202 09:56:30.412046 4781 patch_prober.go:28] interesting pod/machine-config-daemon-pzntm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 09:56:30 crc kubenswrapper[4781]: I1202 09:56:30.412621 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 09:56:30 crc kubenswrapper[4781]: I1202 09:56:30.412670 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" Dec 02 09:56:30 crc kubenswrapper[4781]: I1202 09:56:30.413446 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"05bb140e8bed83dcb63b9b78720e5b065dcdd65808d271a4b06533687d8cc9f3"} pod="openshift-machine-config-operator/machine-config-daemon-pzntm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 09:56:30 crc kubenswrapper[4781]: I1202 09:56:30.413496 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" containerID="cri-o://05bb140e8bed83dcb63b9b78720e5b065dcdd65808d271a4b06533687d8cc9f3" gracePeriod=600 Dec 02 09:56:30 crc kubenswrapper[4781]: I1202 09:56:30.913970 4781 generic.go:334] "Generic (PLEG): container finished" podID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerID="05bb140e8bed83dcb63b9b78720e5b065dcdd65808d271a4b06533687d8cc9f3" exitCode=0 Dec 02 09:56:30 crc kubenswrapper[4781]: I1202 09:56:30.914039 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" event={"ID":"e10258da-dad3-4df8-82c2-9d9438493a3d","Type":"ContainerDied","Data":"05bb140e8bed83dcb63b9b78720e5b065dcdd65808d271a4b06533687d8cc9f3"} Dec 02 09:56:30 crc kubenswrapper[4781]: I1202 09:56:30.914337 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" event={"ID":"e10258da-dad3-4df8-82c2-9d9438493a3d","Type":"ContainerStarted","Data":"910ba3e07eab53497d7747baa03c41ace2b0ce4a0abe46eb2c4aa513ae74972d"} Dec 02 09:56:30 crc kubenswrapper[4781]: I1202 09:56:30.915025 4781 scope.go:117] "RemoveContainer" containerID="311057b514d104f4f4ba1bad8b86e944a9708a86065bf40bd45c3f422ad024ed" Dec 02 09:56:36 crc kubenswrapper[4781]: I1202 09:56:36.975195 4781 generic.go:334] "Generic (PLEG): container finished" podID="321f18fc-759a-4eb7-bbb0-f230b7002932" 
containerID="e41186aeac62d5fe86c7946f1824103b35ab5e4d43bd0b913dcb42beb206e028" exitCode=0 Dec 02 09:56:36 crc kubenswrapper[4781]: I1202 09:56:36.975261 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-95445" event={"ID":"321f18fc-759a-4eb7-bbb0-f230b7002932","Type":"ContainerDied","Data":"e41186aeac62d5fe86c7946f1824103b35ab5e4d43bd0b913dcb42beb206e028"} Dec 02 09:56:38 crc kubenswrapper[4781]: I1202 09:56:38.389438 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-95445" Dec 02 09:56:38 crc kubenswrapper[4781]: I1202 09:56:38.496559 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/321f18fc-759a-4eb7-bbb0-f230b7002932-inventory\") pod \"321f18fc-759a-4eb7-bbb0-f230b7002932\" (UID: \"321f18fc-759a-4eb7-bbb0-f230b7002932\") " Dec 02 09:56:38 crc kubenswrapper[4781]: I1202 09:56:38.497089 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zm9xj\" (UniqueName: \"kubernetes.io/projected/321f18fc-759a-4eb7-bbb0-f230b7002932-kube-api-access-zm9xj\") pod \"321f18fc-759a-4eb7-bbb0-f230b7002932\" (UID: \"321f18fc-759a-4eb7-bbb0-f230b7002932\") " Dec 02 09:56:38 crc kubenswrapper[4781]: I1202 09:56:38.497130 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/321f18fc-759a-4eb7-bbb0-f230b7002932-ssh-key\") pod \"321f18fc-759a-4eb7-bbb0-f230b7002932\" (UID: \"321f18fc-759a-4eb7-bbb0-f230b7002932\") " Dec 02 09:56:38 crc kubenswrapper[4781]: I1202 09:56:38.503000 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/321f18fc-759a-4eb7-bbb0-f230b7002932-kube-api-access-zm9xj" (OuterVolumeSpecName: "kube-api-access-zm9xj") pod "321f18fc-759a-4eb7-bbb0-f230b7002932" (UID: "321f18fc-759a-4eb7-bbb0-f230b7002932"). InnerVolumeSpecName "kube-api-access-zm9xj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:56:38 crc kubenswrapper[4781]: I1202 09:56:38.527236 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/321f18fc-759a-4eb7-bbb0-f230b7002932-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "321f18fc-759a-4eb7-bbb0-f230b7002932" (UID: "321f18fc-759a-4eb7-bbb0-f230b7002932"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:56:38 crc kubenswrapper[4781]: I1202 09:56:38.528241 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/321f18fc-759a-4eb7-bbb0-f230b7002932-inventory" (OuterVolumeSpecName: "inventory") pod "321f18fc-759a-4eb7-bbb0-f230b7002932" (UID: "321f18fc-759a-4eb7-bbb0-f230b7002932"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:56:38 crc kubenswrapper[4781]: I1202 09:56:38.601228 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/321f18fc-759a-4eb7-bbb0-f230b7002932-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 09:56:38 crc kubenswrapper[4781]: I1202 09:56:38.601270 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zm9xj\" (UniqueName: \"kubernetes.io/projected/321f18fc-759a-4eb7-bbb0-f230b7002932-kube-api-access-zm9xj\") on node \"crc\" DevicePath \"\"" Dec 02 09:56:38 crc kubenswrapper[4781]: I1202 09:56:38.601285 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/321f18fc-759a-4eb7-bbb0-f230b7002932-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 09:56:38 crc kubenswrapper[4781]: I1202 09:56:38.992354 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-95445" event={"ID":"321f18fc-759a-4eb7-bbb0-f230b7002932","Type":"ContainerDied","Data":"a4f159c153f9e89b7cea78c4fb12978666f9bd1fa0acb5bb4d35b5bedf5ae735"} Dec 02 09:56:38 crc kubenswrapper[4781]: I1202 09:56:38.992398 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4f159c153f9e89b7cea78c4fb12978666f9bd1fa0acb5bb4d35b5bedf5ae735" Dec 02 09:56:38 crc kubenswrapper[4781]: I1202 09:56:38.992414 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-95445" Dec 02 09:56:39 crc kubenswrapper[4781]: I1202 09:56:39.070061 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gwpv6"] Dec 02 09:56:39 crc kubenswrapper[4781]: E1202 09:56:39.070433 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fc81462-191b-4388-b34c-d1c9ce5dd5b7" containerName="extract-content" Dec 02 09:56:39 crc kubenswrapper[4781]: I1202 09:56:39.070455 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fc81462-191b-4388-b34c-d1c9ce5dd5b7" containerName="extract-content" Dec 02 09:56:39 crc kubenswrapper[4781]: E1202 09:56:39.070472 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fc81462-191b-4388-b34c-d1c9ce5dd5b7" containerName="registry-server" Dec 02 09:56:39 crc kubenswrapper[4781]: I1202 09:56:39.070479 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fc81462-191b-4388-b34c-d1c9ce5dd5b7" containerName="registry-server" Dec 02 09:56:39 crc kubenswrapper[4781]: E1202 09:56:39.070491 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="321f18fc-759a-4eb7-bbb0-f230b7002932" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 02 09:56:39 crc kubenswrapper[4781]: I1202 09:56:39.070500 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="321f18fc-759a-4eb7-bbb0-f230b7002932" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 02 09:56:39 crc kubenswrapper[4781]: E1202 09:56:39.070543 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fc81462-191b-4388-b34c-d1c9ce5dd5b7" containerName="extract-utilities" Dec 02 09:56:39 crc kubenswrapper[4781]: I1202 09:56:39.070550 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fc81462-191b-4388-b34c-d1c9ce5dd5b7" containerName="extract-utilities" Dec 02 09:56:39 crc kubenswrapper[4781]: I1202 09:56:39.070745 4781 
memory_manager.go:354] "RemoveStaleState removing state" podUID="321f18fc-759a-4eb7-bbb0-f230b7002932" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 02 09:56:39 crc kubenswrapper[4781]: I1202 09:56:39.070765 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fc81462-191b-4388-b34c-d1c9ce5dd5b7" containerName="registry-server" Dec 02 09:56:39 crc kubenswrapper[4781]: I1202 09:56:39.071553 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gwpv6" Dec 02 09:56:39 crc kubenswrapper[4781]: I1202 09:56:39.074260 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zl624" Dec 02 09:56:39 crc kubenswrapper[4781]: I1202 09:56:39.074512 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 09:56:39 crc kubenswrapper[4781]: I1202 09:56:39.074683 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 09:56:39 crc kubenswrapper[4781]: I1202 09:56:39.077744 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 09:56:39 crc kubenswrapper[4781]: I1202 09:56:39.082426 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gwpv6"] Dec 02 09:56:39 crc kubenswrapper[4781]: I1202 09:56:39.109325 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd5hf\" (UniqueName: \"kubernetes.io/projected/1a663620-4120-46ee-9676-3eaac5534b99-kube-api-access-fd5hf\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gwpv6\" (UID: \"1a663620-4120-46ee-9676-3eaac5534b99\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gwpv6" Dec 02 09:56:39 crc kubenswrapper[4781]: I1202 09:56:39.109403 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1a663620-4120-46ee-9676-3eaac5534b99-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gwpv6\" (UID: \"1a663620-4120-46ee-9676-3eaac5534b99\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gwpv6" Dec 02 09:56:39 crc kubenswrapper[4781]: I1202 09:56:39.109430 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1a663620-4120-46ee-9676-3eaac5534b99-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gwpv6\" (UID: \"1a663620-4120-46ee-9676-3eaac5534b99\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gwpv6" Dec 02 09:56:39 crc kubenswrapper[4781]: I1202 09:56:39.211136 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fd5hf\" (UniqueName: \"kubernetes.io/projected/1a663620-4120-46ee-9676-3eaac5534b99-kube-api-access-fd5hf\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gwpv6\" (UID: \"1a663620-4120-46ee-9676-3eaac5534b99\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gwpv6" Dec 02 09:56:39 crc kubenswrapper[4781]: I1202 09:56:39.211200 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/1a663620-4120-46ee-9676-3eaac5534b99-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gwpv6\" (UID: \"1a663620-4120-46ee-9676-3eaac5534b99\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gwpv6" Dec 02 09:56:39 crc kubenswrapper[4781]: I1202 09:56:39.211235 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1a663620-4120-46ee-9676-3eaac5534b99-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gwpv6\" (UID: \"1a663620-4120-46ee-9676-3eaac5534b99\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gwpv6" Dec 02 09:56:39 crc kubenswrapper[4781]: I1202 09:56:39.218400 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1a663620-4120-46ee-9676-3eaac5534b99-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gwpv6\" (UID: \"1a663620-4120-46ee-9676-3eaac5534b99\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gwpv6" Dec 02 09:56:39 crc kubenswrapper[4781]: I1202 09:56:39.222137 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1a663620-4120-46ee-9676-3eaac5534b99-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gwpv6\" (UID: \"1a663620-4120-46ee-9676-3eaac5534b99\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gwpv6" Dec 02 09:56:39 crc kubenswrapper[4781]: I1202 09:56:39.228796 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd5hf\" (UniqueName: \"kubernetes.io/projected/1a663620-4120-46ee-9676-3eaac5534b99-kube-api-access-fd5hf\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gwpv6\" (UID: \"1a663620-4120-46ee-9676-3eaac5534b99\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gwpv6" Dec 02 09:56:39 crc kubenswrapper[4781]: I1202 09:56:39.389361 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gwpv6" Dec 02 09:56:39 crc kubenswrapper[4781]: I1202 09:56:39.762522 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gwpv6"] Dec 02 09:56:40 crc kubenswrapper[4781]: I1202 09:56:40.003620 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gwpv6" event={"ID":"1a663620-4120-46ee-9676-3eaac5534b99","Type":"ContainerStarted","Data":"4b68207a763b1dbdcc7753b89e239607d7ed21fd67dd9c27f0dcc91fba37a3d6"} Dec 02 09:56:41 crc kubenswrapper[4781]: I1202 09:56:41.013238 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gwpv6" event={"ID":"1a663620-4120-46ee-9676-3eaac5534b99","Type":"ContainerStarted","Data":"9effe69b64cda0905b1d126f93336deec4eca0e894d680f6264877051c1c7188"} Dec 02 09:56:41 crc kubenswrapper[4781]: I1202 09:56:41.030903 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gwpv6" podStartSLOduration=1.5650480500000001 podStartE2EDuration="2.030881892s" podCreationTimestamp="2025-12-02 09:56:39 +0000 UTC" firstStartedPulling="2025-12-02 09:56:39.770965459 +0000 UTC m=+2162.594839338" lastFinishedPulling="2025-12-02 09:56:40.236799301 +0000 UTC m=+2163.060673180" observedRunningTime="2025-12-02 09:56:41.02445628 +0000 UTC m=+2163.848330159" watchObservedRunningTime="2025-12-02 09:56:41.030881892 +0000 UTC m=+2163.854755771" Dec 02 09:56:43 crc kubenswrapper[4781]: I1202 09:56:43.749811 4781 scope.go:117] "RemoveContainer" containerID="31948cc1c7fdc3245b056d8db87415ad886b621d820d68ecbbf81b0725c2cd37" Dec 02 09:56:43 crc kubenswrapper[4781]: I1202 09:56:43.777064 4781 scope.go:117] "RemoveContainer" containerID="8bafab971d14d7d06b2d33414c80695d46381f1033247ba6a70d9c9f4e3f08d9" Dec 02 09:56:43 crc kubenswrapper[4781]: I1202 09:56:43.855342 4781 scope.go:117] "RemoveContainer" containerID="0e9d31f865f2281ef6607d410d491ab19970c696b96d06a61b5d1f515e6d68da" Dec 02 09:56:43 crc kubenswrapper[4781]: I1202 09:56:43.880494 4781 scope.go:117] "RemoveContainer" containerID="f98b7dc49839073c5670111481c853628d81568b06c050e36f0febeb821fa3ef" Dec 02 09:56:43 crc kubenswrapper[4781]: I1202 09:56:43.929819 4781 scope.go:117] "RemoveContainer" containerID="0c34967ec78eebb883230e5b26ffff48d377040317caae0756d0210120bfa3df" Dec 02 09:56:43 crc kubenswrapper[4781]: I1202 09:56:43.975047 4781 scope.go:117] "RemoveContainer" containerID="a07f2009cf58fc528ef5769303b895ed34faf022ea7551f54cb14cce5a190499" Dec 02 09:56:44 crc kubenswrapper[4781]: I1202 09:56:44.033302 4781 scope.go:117] "RemoveContainer" containerID="e8726982453fea62e8096c6eef8a1e6229c53551f4c77463f1662e601f13f38e" Dec 02 09:56:44 crc kubenswrapper[4781]: I1202 09:56:44.055463 4781 scope.go:117] "RemoveContainer" containerID="af8d6eab68d6b452cb1be0960bab4eb06d889f44fddc6f4cfbb8650f6a84a808" Dec 02 09:56:44 crc kubenswrapper[4781]: I1202 09:56:44.084880 4781 scope.go:117] "RemoveContainer" containerID="9dd76875a01226364271269d23abccb788961de68d9af76565e733a9a1496d2d" Dec 02 09:56:46 crc kubenswrapper[4781]: I1202 09:56:46.075458 4781 generic.go:334] "Generic (PLEG): container finished" podID="1a663620-4120-46ee-9676-3eaac5534b99" containerID="9effe69b64cda0905b1d126f93336deec4eca0e894d680f6264877051c1c7188" exitCode=0 Dec 02 09:56:46 crc kubenswrapper[4781]: 
Dec 02 09:56:43 crc kubenswrapper[4781]: I1202 09:56:43.749811 4781 scope.go:117] "RemoveContainer" containerID="31948cc1c7fdc3245b056d8db87415ad886b621d820d68ecbbf81b0725c2cd37"
Dec 02 09:56:43 crc kubenswrapper[4781]: I1202 09:56:43.777064 4781 scope.go:117] "RemoveContainer" containerID="8bafab971d14d7d06b2d33414c80695d46381f1033247ba6a70d9c9f4e3f08d9"
Dec 02 09:56:43 crc kubenswrapper[4781]: I1202 09:56:43.855342 4781 scope.go:117] "RemoveContainer" containerID="0e9d31f865f2281ef6607d410d491ab19970c696b96d06a61b5d1f515e6d68da"
Dec 02 09:56:43 crc kubenswrapper[4781]: I1202 09:56:43.880494 4781 scope.go:117] "RemoveContainer" containerID="f98b7dc49839073c5670111481c853628d81568b06c050e36f0febeb821fa3ef"
Dec 02 09:56:43 crc kubenswrapper[4781]: I1202 09:56:43.929819 4781 scope.go:117] "RemoveContainer" containerID="0c34967ec78eebb883230e5b26ffff48d377040317caae0756d0210120bfa3df"
Dec 02 09:56:43 crc kubenswrapper[4781]: I1202 09:56:43.975047 4781 scope.go:117] "RemoveContainer" containerID="a07f2009cf58fc528ef5769303b895ed34faf022ea7551f54cb14cce5a190499"
Dec 02 09:56:44 crc kubenswrapper[4781]: I1202 09:56:44.033302 4781 scope.go:117] "RemoveContainer" containerID="e8726982453fea62e8096c6eef8a1e6229c53551f4c77463f1662e601f13f38e"
Dec 02 09:56:44 crc kubenswrapper[4781]: I1202 09:56:44.055463 4781 scope.go:117] "RemoveContainer" containerID="af8d6eab68d6b452cb1be0960bab4eb06d889f44fddc6f4cfbb8650f6a84a808"
Dec 02 09:56:44 crc kubenswrapper[4781]: I1202 09:56:44.084880 4781 scope.go:117] "RemoveContainer" containerID="9dd76875a01226364271269d23abccb788961de68d9af76565e733a9a1496d2d"
Dec 02 09:56:46 crc kubenswrapper[4781]: I1202 09:56:46.075458 4781 generic.go:334] "Generic (PLEG): container finished" podID="1a663620-4120-46ee-9676-3eaac5534b99" containerID="9effe69b64cda0905b1d126f93336deec4eca0e894d680f6264877051c1c7188" exitCode=0
Dec 02 09:56:46 crc kubenswrapper[4781]: I1202 09:56:46.075540 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gwpv6" event={"ID":"1a663620-4120-46ee-9676-3eaac5534b99","Type":"ContainerDied","Data":"9effe69b64cda0905b1d126f93336deec4eca0e894d680f6264877051c1c7188"}
Dec 02 09:56:47 crc kubenswrapper[4781]: I1202 09:56:47.476983 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gwpv6"
Dec 02 09:56:47 crc kubenswrapper[4781]: I1202 09:56:47.573633 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1a663620-4120-46ee-9676-3eaac5534b99-inventory\") pod \"1a663620-4120-46ee-9676-3eaac5534b99\" (UID: \"1a663620-4120-46ee-9676-3eaac5534b99\") "
Dec 02 09:56:47 crc kubenswrapper[4781]: I1202 09:56:47.573723 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fd5hf\" (UniqueName: \"kubernetes.io/projected/1a663620-4120-46ee-9676-3eaac5534b99-kube-api-access-fd5hf\") pod \"1a663620-4120-46ee-9676-3eaac5534b99\" (UID: \"1a663620-4120-46ee-9676-3eaac5534b99\") "
Dec 02 09:56:47 crc kubenswrapper[4781]: I1202 09:56:47.573917 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1a663620-4120-46ee-9676-3eaac5534b99-ssh-key\") pod \"1a663620-4120-46ee-9676-3eaac5534b99\" (UID: \"1a663620-4120-46ee-9676-3eaac5534b99\") "
Dec 02 09:56:47 crc kubenswrapper[4781]: I1202 09:56:47.579576 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a663620-4120-46ee-9676-3eaac5534b99-kube-api-access-fd5hf" (OuterVolumeSpecName: "kube-api-access-fd5hf") pod "1a663620-4120-46ee-9676-3eaac5534b99" (UID: "1a663620-4120-46ee-9676-3eaac5534b99"). InnerVolumeSpecName "kube-api-access-fd5hf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 09:56:47 crc kubenswrapper[4781]: I1202 09:56:47.602702 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a663620-4120-46ee-9676-3eaac5534b99-inventory" (OuterVolumeSpecName: "inventory") pod "1a663620-4120-46ee-9676-3eaac5534b99" (UID: "1a663620-4120-46ee-9676-3eaac5534b99"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 09:56:47 crc kubenswrapper[4781]: I1202 09:56:47.607236 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a663620-4120-46ee-9676-3eaac5534b99-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1a663620-4120-46ee-9676-3eaac5534b99" (UID: "1a663620-4120-46ee-9676-3eaac5534b99"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 09:56:47 crc kubenswrapper[4781]: I1202 09:56:47.676471 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1a663620-4120-46ee-9676-3eaac5534b99-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 02 09:56:47 crc kubenswrapper[4781]: I1202 09:56:47.676500 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1a663620-4120-46ee-9676-3eaac5534b99-inventory\") on node \"crc\" DevicePath \"\""
Dec 02 09:56:47 crc kubenswrapper[4781]: I1202 09:56:47.676510 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fd5hf\" (UniqueName: \"kubernetes.io/projected/1a663620-4120-46ee-9676-3eaac5534b99-kube-api-access-fd5hf\") on node \"crc\" DevicePath \"\""
Dec 02 09:56:48 crc kubenswrapper[4781]: I1202 09:56:48.093629 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gwpv6" event={"ID":"1a663620-4120-46ee-9676-3eaac5534b99","Type":"ContainerDied","Data":"4b68207a763b1dbdcc7753b89e239607d7ed21fd67dd9c27f0dcc91fba37a3d6"}
Dec 02 09:56:48 crc kubenswrapper[4781]: I1202 09:56:48.093674 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b68207a763b1dbdcc7753b89e239607d7ed21fd67dd9c27f0dcc91fba37a3d6"
Dec 02 09:56:48 crc kubenswrapper[4781]: I1202 09:56:48.093733 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gwpv6"
Dec 02 09:56:48 crc kubenswrapper[4781]: I1202 09:56:48.204295 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-2p8f8"]
Dec 02 09:56:48 crc kubenswrapper[4781]: E1202 09:56:48.204811 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a663620-4120-46ee-9676-3eaac5534b99" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Dec 02 09:56:48 crc kubenswrapper[4781]: I1202 09:56:48.204834 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a663620-4120-46ee-9676-3eaac5534b99" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Dec 02 09:56:48 crc kubenswrapper[4781]: I1202 09:56:48.205119 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a663620-4120-46ee-9676-3eaac5534b99" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Dec 02 09:56:48 crc kubenswrapper[4781]: I1202 09:56:48.205880 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2p8f8"
Dec 02 09:56:48 crc kubenswrapper[4781]: I1202 09:56:48.209292 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 02 09:56:48 crc kubenswrapper[4781]: I1202 09:56:48.209496 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zl624"
Dec 02 09:56:48 crc kubenswrapper[4781]: I1202 09:56:48.209996 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 02 09:56:48 crc kubenswrapper[4781]: I1202 09:56:48.214341 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 02 09:56:48 crc kubenswrapper[4781]: I1202 09:56:48.225096 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-2p8f8"]
Dec 02 09:56:48 crc kubenswrapper[4781]: I1202 09:56:48.287742 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea0d0dc8-72c3-42de-92b5-a98ad0417f6d-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2p8f8\" (UID: \"ea0d0dc8-72c3-42de-92b5-a98ad0417f6d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2p8f8"
Dec 02 09:56:48 crc kubenswrapper[4781]: I1202 09:56:48.287864 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea0d0dc8-72c3-42de-92b5-a98ad0417f6d-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2p8f8\" (UID: \"ea0d0dc8-72c3-42de-92b5-a98ad0417f6d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2p8f8"
Dec 02 09:56:48 crc kubenswrapper[4781]: I1202 09:56:48.287951 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwl4h\" (UniqueName: \"kubernetes.io/projected/ea0d0dc8-72c3-42de-92b5-a98ad0417f6d-kube-api-access-nwl4h\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2p8f8\" (UID: \"ea0d0dc8-72c3-42de-92b5-a98ad0417f6d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2p8f8"
Dec 02 09:56:48 crc kubenswrapper[4781]: I1202 09:56:48.389744 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwl4h\" (UniqueName: \"kubernetes.io/projected/ea0d0dc8-72c3-42de-92b5-a98ad0417f6d-kube-api-access-nwl4h\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2p8f8\" (UID: \"ea0d0dc8-72c3-42de-92b5-a98ad0417f6d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2p8f8"
Dec 02 09:56:48 crc kubenswrapper[4781]: I1202 09:56:48.389858 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea0d0dc8-72c3-42de-92b5-a98ad0417f6d-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2p8f8\" (UID: \"ea0d0dc8-72c3-42de-92b5-a98ad0417f6d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2p8f8"
Dec 02 09:56:48 crc kubenswrapper[4781]: I1202 09:56:48.390027 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea0d0dc8-72c3-42de-92b5-a98ad0417f6d-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2p8f8\" (UID: \"ea0d0dc8-72c3-42de-92b5-a98ad0417f6d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2p8f8"
Dec 02 09:56:48 crc kubenswrapper[4781]: I1202 09:56:48.394892 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea0d0dc8-72c3-42de-92b5-a98ad0417f6d-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2p8f8\" (UID: \"ea0d0dc8-72c3-42de-92b5-a98ad0417f6d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2p8f8"
Dec 02 09:56:48 crc kubenswrapper[4781]: I1202 09:56:48.395184 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea0d0dc8-72c3-42de-92b5-a98ad0417f6d-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2p8f8\" (UID: \"ea0d0dc8-72c3-42de-92b5-a98ad0417f6d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2p8f8"
Dec 02 09:56:48 crc kubenswrapper[4781]: I1202 09:56:48.405757 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwl4h\" (UniqueName: \"kubernetes.io/projected/ea0d0dc8-72c3-42de-92b5-a98ad0417f6d-kube-api-access-nwl4h\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2p8f8\" (UID: \"ea0d0dc8-72c3-42de-92b5-a98ad0417f6d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2p8f8"
Dec 02 09:56:48 crc kubenswrapper[4781]: I1202 09:56:48.532251 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2p8f8"
Dec 02 09:56:49 crc kubenswrapper[4781]: I1202 09:56:49.016247 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-2p8f8"]
Dec 02 09:56:49 crc kubenswrapper[4781]: I1202 09:56:49.101599 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2p8f8" event={"ID":"ea0d0dc8-72c3-42de-92b5-a98ad0417f6d","Type":"ContainerStarted","Data":"6e174c0e20b88c11fdb5bc7f57e4f440d780ee339011af70ebfdfba0f06a77d1"}
Dec 02 09:56:51 crc kubenswrapper[4781]: I1202 09:56:51.131165 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2p8f8" event={"ID":"ea0d0dc8-72c3-42de-92b5-a98ad0417f6d","Type":"ContainerStarted","Data":"339e286f77fb6e3168b262eb39c1b918f61649256cd6a226148d593d5081b556"}
Dec 02 09:56:51 crc kubenswrapper[4781]: I1202 09:56:51.147948 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2p8f8" podStartSLOduration=2.114038254 podStartE2EDuration="3.14791788s" podCreationTimestamp="2025-12-02 09:56:48 +0000 UTC" firstStartedPulling="2025-12-02 09:56:49.02216007 +0000 UTC m=+2171.846033949" lastFinishedPulling="2025-12-02 09:56:50.056039696 +0000 UTC m=+2172.879913575" observedRunningTime="2025-12-02 09:56:51.147734705 +0000 UTC m=+2173.971608584" watchObservedRunningTime="2025-12-02 09:56:51.14791788 +0000 UTC m=+2173.971791759"
Dec 02 09:57:04 crc kubenswrapper[4781]: I1202 09:57:04.040807 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-972b4"]
Dec 02 09:57:04 crc kubenswrapper[4781]: I1202 09:57:04.048803 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-972b4"]
Dec 02 09:57:05 crc kubenswrapper[4781]: I1202 09:57:05.508563 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41de14f6-deed-478b-9d75-ad94ab88ee05" path="/var/lib/kubelet/pods/41de14f6-deed-478b-9d75-ad94ab88ee05/volumes"
orphaned pod volumes dir" podUID="41de14f6-deed-478b-9d75-ad94ab88ee05" path="/var/lib/kubelet/pods/41de14f6-deed-478b-9d75-ad94ab88ee05/volumes" Dec 02 09:57:28 crc kubenswrapper[4781]: I1202 09:57:28.053422 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-fwtp2"] Dec 02 09:57:28 crc kubenswrapper[4781]: I1202 09:57:28.063493 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-fwtp2"] Dec 02 09:57:29 crc kubenswrapper[4781]: I1202 09:57:29.477080 4781 generic.go:334] "Generic (PLEG): container finished" podID="ea0d0dc8-72c3-42de-92b5-a98ad0417f6d" containerID="339e286f77fb6e3168b262eb39c1b918f61649256cd6a226148d593d5081b556" exitCode=0 Dec 02 09:57:29 crc kubenswrapper[4781]: I1202 09:57:29.477160 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2p8f8" event={"ID":"ea0d0dc8-72c3-42de-92b5-a98ad0417f6d","Type":"ContainerDied","Data":"339e286f77fb6e3168b262eb39c1b918f61649256cd6a226148d593d5081b556"} Dec 02 09:57:29 crc kubenswrapper[4781]: I1202 09:57:29.515186 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5c50078-d02c-43a0-83b1-d92b6f5a6e0e" path="/var/lib/kubelet/pods/f5c50078-d02c-43a0-83b1-d92b6f5a6e0e/volumes" Dec 02 09:57:30 crc kubenswrapper[4781]: I1202 09:57:30.895585 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2p8f8" Dec 02 09:57:30 crc kubenswrapper[4781]: I1202 09:57:30.909494 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea0d0dc8-72c3-42de-92b5-a98ad0417f6d-ssh-key\") pod \"ea0d0dc8-72c3-42de-92b5-a98ad0417f6d\" (UID: \"ea0d0dc8-72c3-42de-92b5-a98ad0417f6d\") " Dec 02 09:57:30 crc kubenswrapper[4781]: I1202 09:57:30.909590 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwl4h\" (UniqueName: \"kubernetes.io/projected/ea0d0dc8-72c3-42de-92b5-a98ad0417f6d-kube-api-access-nwl4h\") pod \"ea0d0dc8-72c3-42de-92b5-a98ad0417f6d\" (UID: \"ea0d0dc8-72c3-42de-92b5-a98ad0417f6d\") " Dec 02 09:57:30 crc kubenswrapper[4781]: I1202 09:57:30.911078 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea0d0dc8-72c3-42de-92b5-a98ad0417f6d-inventory\") pod \"ea0d0dc8-72c3-42de-92b5-a98ad0417f6d\" (UID: \"ea0d0dc8-72c3-42de-92b5-a98ad0417f6d\") " Dec 02 09:57:30 crc kubenswrapper[4781]: I1202 09:57:30.947308 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea0d0dc8-72c3-42de-92b5-a98ad0417f6d-kube-api-access-nwl4h" (OuterVolumeSpecName: "kube-api-access-nwl4h") pod "ea0d0dc8-72c3-42de-92b5-a98ad0417f6d" (UID: "ea0d0dc8-72c3-42de-92b5-a98ad0417f6d"). InnerVolumeSpecName "kube-api-access-nwl4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:57:30 crc kubenswrapper[4781]: I1202 09:57:30.956026 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea0d0dc8-72c3-42de-92b5-a98ad0417f6d-inventory" (OuterVolumeSpecName: "inventory") pod "ea0d0dc8-72c3-42de-92b5-a98ad0417f6d" (UID: "ea0d0dc8-72c3-42de-92b5-a98ad0417f6d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:57:30 crc kubenswrapper[4781]: I1202 09:57:30.961047 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea0d0dc8-72c3-42de-92b5-a98ad0417f6d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ea0d0dc8-72c3-42de-92b5-a98ad0417f6d" (UID: "ea0d0dc8-72c3-42de-92b5-a98ad0417f6d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:57:31 crc kubenswrapper[4781]: I1202 09:57:31.013994 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea0d0dc8-72c3-42de-92b5-a98ad0417f6d-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 09:57:31 crc kubenswrapper[4781]: I1202 09:57:31.014038 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwl4h\" (UniqueName: \"kubernetes.io/projected/ea0d0dc8-72c3-42de-92b5-a98ad0417f6d-kube-api-access-nwl4h\") on node \"crc\" DevicePath \"\"" Dec 02 09:57:31 crc kubenswrapper[4781]: I1202 09:57:31.014052 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea0d0dc8-72c3-42de-92b5-a98ad0417f6d-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 09:57:31 crc kubenswrapper[4781]: I1202 09:57:31.495288 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2p8f8" event={"ID":"ea0d0dc8-72c3-42de-92b5-a98ad0417f6d","Type":"ContainerDied","Data":"6e174c0e20b88c11fdb5bc7f57e4f440d780ee339011af70ebfdfba0f06a77d1"} Dec 02 09:57:31 crc kubenswrapper[4781]: I1202 09:57:31.495587 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e174c0e20b88c11fdb5bc7f57e4f440d780ee339011af70ebfdfba0f06a77d1" Dec 02 09:57:31 crc kubenswrapper[4781]: I1202 09:57:31.495323 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2p8f8" Dec 02 09:57:31 crc kubenswrapper[4781]: I1202 09:57:31.579868 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p6f2z"] Dec 02 09:57:31 crc kubenswrapper[4781]: E1202 09:57:31.580763 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea0d0dc8-72c3-42de-92b5-a98ad0417f6d" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 02 09:57:31 crc kubenswrapper[4781]: I1202 09:57:31.580863 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea0d0dc8-72c3-42de-92b5-a98ad0417f6d" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 02 09:57:31 crc kubenswrapper[4781]: I1202 09:57:31.581224 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea0d0dc8-72c3-42de-92b5-a98ad0417f6d" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 02 09:57:31 crc kubenswrapper[4781]: I1202 09:57:31.582109 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p6f2z" Dec 02 09:57:31 crc kubenswrapper[4781]: I1202 09:57:31.584615 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 09:57:31 crc kubenswrapper[4781]: I1202 09:57:31.585439 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 09:57:31 crc kubenswrapper[4781]: I1202 09:57:31.585639 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zl624" Dec 02 09:57:31 crc kubenswrapper[4781]: I1202 09:57:31.585768 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 09:57:31 crc kubenswrapper[4781]: I1202 09:57:31.597234 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p6f2z"] Dec 02 09:57:31 crc kubenswrapper[4781]: I1202 09:57:31.624333 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0fcef48b-9cfd-4f26-9964-3a083b035119-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p6f2z\" (UID: \"0fcef48b-9cfd-4f26-9964-3a083b035119\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p6f2z" Dec 02 09:57:31 crc kubenswrapper[4781]: I1202 09:57:31.624468 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2x94\" (UniqueName: \"kubernetes.io/projected/0fcef48b-9cfd-4f26-9964-3a083b035119-kube-api-access-c2x94\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p6f2z\" (UID: \"0fcef48b-9cfd-4f26-9964-3a083b035119\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p6f2z" Dec 02 09:57:31 crc kubenswrapper[4781]: I1202 09:57:31.624508 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0fcef48b-9cfd-4f26-9964-3a083b035119-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p6f2z\" (UID: \"0fcef48b-9cfd-4f26-9964-3a083b035119\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p6f2z" Dec 02 09:57:31 crc kubenswrapper[4781]: I1202 09:57:31.725244 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2x94\" (UniqueName: \"kubernetes.io/projected/0fcef48b-9cfd-4f26-9964-3a083b035119-kube-api-access-c2x94\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p6f2z\" (UID: \"0fcef48b-9cfd-4f26-9964-3a083b035119\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p6f2z" Dec 02 09:57:31 crc kubenswrapper[4781]: I1202 09:57:31.725295 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0fcef48b-9cfd-4f26-9964-3a083b035119-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p6f2z\" (UID: \"0fcef48b-9cfd-4f26-9964-3a083b035119\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p6f2z" Dec 02 09:57:31 crc kubenswrapper[4781]: I1202 09:57:31.725384 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0fcef48b-9cfd-4f26-9964-3a083b035119-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p6f2z\" 
(UID: \"0fcef48b-9cfd-4f26-9964-3a083b035119\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p6f2z" Dec 02 09:57:31 crc kubenswrapper[4781]: I1202 09:57:31.729997 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0fcef48b-9cfd-4f26-9964-3a083b035119-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p6f2z\" (UID: \"0fcef48b-9cfd-4f26-9964-3a083b035119\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p6f2z" Dec 02 09:57:31 crc kubenswrapper[4781]: I1202 09:57:31.730243 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0fcef48b-9cfd-4f26-9964-3a083b035119-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p6f2z\" (UID: \"0fcef48b-9cfd-4f26-9964-3a083b035119\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p6f2z" Dec 02 09:57:31 crc kubenswrapper[4781]: I1202 09:57:31.744241 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2x94\" (UniqueName: \"kubernetes.io/projected/0fcef48b-9cfd-4f26-9964-3a083b035119-kube-api-access-c2x94\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p6f2z\" (UID: \"0fcef48b-9cfd-4f26-9964-3a083b035119\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p6f2z" Dec 02 09:57:31 crc kubenswrapper[4781]: I1202 09:57:31.906960 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p6f2z" Dec 02 09:57:32 crc kubenswrapper[4781]: I1202 09:57:32.467059 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p6f2z"] Dec 02 09:57:32 crc kubenswrapper[4781]: I1202 09:57:32.507164 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p6f2z" event={"ID":"0fcef48b-9cfd-4f26-9964-3a083b035119","Type":"ContainerStarted","Data":"77edeb72236596e1f3a768e9bb0e175a12cce68331c88a07fbfcfd0517bceddf"} Dec 02 09:57:33 crc kubenswrapper[4781]: I1202 09:57:33.522905 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p6f2z" event={"ID":"0fcef48b-9cfd-4f26-9964-3a083b035119","Type":"ContainerStarted","Data":"1cefba4bc2ef8a39b77c73d1a8b2ddc43360578c511cdca0abd8622d0b075e75"} Dec 02 09:57:33 crc kubenswrapper[4781]: I1202 09:57:33.539107 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p6f2z" podStartSLOduration=1.9838188460000001 podStartE2EDuration="2.539091728s" podCreationTimestamp="2025-12-02 09:57:31 +0000 UTC" firstStartedPulling="2025-12-02 09:57:32.468913033 +0000 UTC m=+2215.292786902" lastFinishedPulling="2025-12-02 09:57:33.024185895 +0000 UTC m=+2215.848059784" observedRunningTime="2025-12-02 09:57:33.538559324 +0000 UTC m=+2216.362433203" watchObservedRunningTime="2025-12-02 09:57:33.539091728 +0000 UTC m=+2216.362965607" Dec 02 09:57:35 crc kubenswrapper[4781]: I1202 09:57:35.032943 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qftr6"] Dec 02 09:57:35 crc kubenswrapper[4781]: I1202 09:57:35.050746 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qftr6"] Dec 02 09:57:35 crc kubenswrapper[4781]: I1202 09:57:35.510748 4781 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9215fee-a493-48f3-a67c-21fbb4256f55" path="/var/lib/kubelet/pods/a9215fee-a493-48f3-a67c-21fbb4256f55/volumes" Dec 02 09:57:44 crc kubenswrapper[4781]: I1202 09:57:44.291956 4781 scope.go:117] "RemoveContainer" containerID="65ce496472d4caa27e5c6be55c9ff97db07abac06c00e5140f86690226b00b55" Dec 02 09:57:44 crc kubenswrapper[4781]: I1202 09:57:44.346912 4781 scope.go:117] "RemoveContainer" containerID="45532ac872502b7cd761561dde6edf9b00f15e4588cc2b06e6896031fbbb55ce" Dec 02 09:57:44 crc kubenswrapper[4781]: I1202 09:57:44.389834 4781 scope.go:117] "RemoveContainer" containerID="c012f2716641543f3e3e57641a96191662a00eb552327ca589e567ba6003f551" Dec 02 09:57:52 crc kubenswrapper[4781]: I1202 09:57:52.744853 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2rz4l"] Dec 02 09:57:52 crc kubenswrapper[4781]: I1202 09:57:52.750554 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2rz4l" Dec 02 09:57:52 crc kubenswrapper[4781]: I1202 09:57:52.754076 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2rz4l"] Dec 02 09:57:52 crc kubenswrapper[4781]: I1202 09:57:52.804637 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpdrq\" (UniqueName: \"kubernetes.io/projected/bdef86d0-746d-496a-bb85-9f107917bdb2-kube-api-access-rpdrq\") pod \"redhat-operators-2rz4l\" (UID: \"bdef86d0-746d-496a-bb85-9f107917bdb2\") " pod="openshift-marketplace/redhat-operators-2rz4l" Dec 02 09:57:52 crc kubenswrapper[4781]: I1202 09:57:52.804720 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdef86d0-746d-496a-bb85-9f107917bdb2-utilities\") pod \"redhat-operators-2rz4l\" (UID: \"bdef86d0-746d-496a-bb85-9f107917bdb2\") " pod="openshift-marketplace/redhat-operators-2rz4l" Dec 02 09:57:52 crc kubenswrapper[4781]: I1202 09:57:52.805073 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdef86d0-746d-496a-bb85-9f107917bdb2-catalog-content\") pod \"redhat-operators-2rz4l\" (UID: \"bdef86d0-746d-496a-bb85-9f107917bdb2\") " pod="openshift-marketplace/redhat-operators-2rz4l" Dec 02 09:57:52 crc kubenswrapper[4781]: I1202 09:57:52.907146 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpdrq\" (UniqueName: \"kubernetes.io/projected/bdef86d0-746d-496a-bb85-9f107917bdb2-kube-api-access-rpdrq\") pod \"redhat-operators-2rz4l\" (UID: \"bdef86d0-746d-496a-bb85-9f107917bdb2\") " pod="openshift-marketplace/redhat-operators-2rz4l" Dec 02 09:57:52 crc kubenswrapper[4781]: I1202 09:57:52.907220 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdef86d0-746d-496a-bb85-9f107917bdb2-utilities\") pod \"redhat-operators-2rz4l\" (UID: \"bdef86d0-746d-496a-bb85-9f107917bdb2\") " pod="openshift-marketplace/redhat-operators-2rz4l" Dec 02 09:57:52 crc kubenswrapper[4781]: I1202 09:57:52.907279 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdef86d0-746d-496a-bb85-9f107917bdb2-catalog-content\") pod \"redhat-operators-2rz4l\" (UID: 
\"bdef86d0-746d-496a-bb85-9f107917bdb2\") " pod="openshift-marketplace/redhat-operators-2rz4l" Dec 02 09:57:52 crc kubenswrapper[4781]: I1202 09:57:52.908224 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdef86d0-746d-496a-bb85-9f107917bdb2-catalog-content\") pod \"redhat-operators-2rz4l\" (UID: \"bdef86d0-746d-496a-bb85-9f107917bdb2\") " pod="openshift-marketplace/redhat-operators-2rz4l" Dec 02 09:57:52 crc kubenswrapper[4781]: I1202 09:57:52.908365 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdef86d0-746d-496a-bb85-9f107917bdb2-utilities\") pod \"redhat-operators-2rz4l\" (UID: \"bdef86d0-746d-496a-bb85-9f107917bdb2\") " pod="openshift-marketplace/redhat-operators-2rz4l" Dec 02 09:57:52 crc kubenswrapper[4781]: I1202 09:57:52.932818 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpdrq\" (UniqueName: \"kubernetes.io/projected/bdef86d0-746d-496a-bb85-9f107917bdb2-kube-api-access-rpdrq\") pod \"redhat-operators-2rz4l\" (UID: \"bdef86d0-746d-496a-bb85-9f107917bdb2\") " pod="openshift-marketplace/redhat-operators-2rz4l" Dec 02 09:57:53 crc kubenswrapper[4781]: I1202 09:57:53.076670 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2rz4l" Dec 02 09:57:53 crc kubenswrapper[4781]: I1202 09:57:53.559230 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2rz4l"] Dec 02 09:57:53 crc kubenswrapper[4781]: W1202 09:57:53.563160 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdef86d0_746d_496a_bb85_9f107917bdb2.slice/crio-2b2f1e4f4d30cfcf756a3a73cc7e10d024e688b1291c6d355ba06e7296235716 WatchSource:0}: Error finding container 2b2f1e4f4d30cfcf756a3a73cc7e10d024e688b1291c6d355ba06e7296235716: Status 404 returned error can't find the container with id 2b2f1e4f4d30cfcf756a3a73cc7e10d024e688b1291c6d355ba06e7296235716 Dec 02 09:57:53 crc kubenswrapper[4781]: I1202 09:57:53.723508 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2rz4l" event={"ID":"bdef86d0-746d-496a-bb85-9f107917bdb2","Type":"ContainerStarted","Data":"2b2f1e4f4d30cfcf756a3a73cc7e10d024e688b1291c6d355ba06e7296235716"} Dec 02 09:57:54 crc kubenswrapper[4781]: I1202 09:57:54.735096 4781 generic.go:334] "Generic (PLEG): container finished" podID="bdef86d0-746d-496a-bb85-9f107917bdb2" containerID="6acacb9ec6f1d9bfa9e075a19cc6321bc8687cf3b08b9c5116e34eab9fb6d894" exitCode=0 Dec 02 09:57:54 crc kubenswrapper[4781]: I1202 09:57:54.735226 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2rz4l" event={"ID":"bdef86d0-746d-496a-bb85-9f107917bdb2","Type":"ContainerDied","Data":"6acacb9ec6f1d9bfa9e075a19cc6321bc8687cf3b08b9c5116e34eab9fb6d894"} Dec 02 09:57:54 crc kubenswrapper[4781]: I1202 09:57:54.741834 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hn9h4"] Dec 02 09:57:54 crc kubenswrapper[4781]: I1202 09:57:54.744428 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hn9h4" Dec 02 09:57:54 crc kubenswrapper[4781]: I1202 09:57:54.753506 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hn9h4"] Dec 02 09:57:54 crc kubenswrapper[4781]: I1202 09:57:54.942789 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jdhj\" (UniqueName: \"kubernetes.io/projected/305a2fb1-c716-4f4f-9168-03df5cf5ff6a-kube-api-access-5jdhj\") pod \"certified-operators-hn9h4\" (UID: \"305a2fb1-c716-4f4f-9168-03df5cf5ff6a\") " pod="openshift-marketplace/certified-operators-hn9h4" Dec 02 09:57:54 crc kubenswrapper[4781]: I1202 09:57:54.942834 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/305a2fb1-c716-4f4f-9168-03df5cf5ff6a-catalog-content\") pod \"certified-operators-hn9h4\" (UID: \"305a2fb1-c716-4f4f-9168-03df5cf5ff6a\") " pod="openshift-marketplace/certified-operators-hn9h4" Dec 02 09:57:54 crc kubenswrapper[4781]: I1202 09:57:54.942895 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/305a2fb1-c716-4f4f-9168-03df5cf5ff6a-utilities\") pod \"certified-operators-hn9h4\" (UID: \"305a2fb1-c716-4f4f-9168-03df5cf5ff6a\") " pod="openshift-marketplace/certified-operators-hn9h4" Dec 02 09:57:55 crc kubenswrapper[4781]: I1202 09:57:55.044009 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jdhj\" (UniqueName: \"kubernetes.io/projected/305a2fb1-c716-4f4f-9168-03df5cf5ff6a-kube-api-access-5jdhj\") pod \"certified-operators-hn9h4\" (UID: \"305a2fb1-c716-4f4f-9168-03df5cf5ff6a\") " pod="openshift-marketplace/certified-operators-hn9h4" Dec 02 09:57:55 crc kubenswrapper[4781]: I1202 09:57:55.044062 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/305a2fb1-c716-4f4f-9168-03df5cf5ff6a-catalog-content\") pod \"certified-operators-hn9h4\" (UID: \"305a2fb1-c716-4f4f-9168-03df5cf5ff6a\") " pod="openshift-marketplace/certified-operators-hn9h4" Dec 02 09:57:55 crc kubenswrapper[4781]: I1202 09:57:55.044135 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/305a2fb1-c716-4f4f-9168-03df5cf5ff6a-utilities\") pod \"certified-operators-hn9h4\" (UID: \"305a2fb1-c716-4f4f-9168-03df5cf5ff6a\") " pod="openshift-marketplace/certified-operators-hn9h4" Dec 02 09:57:55 crc kubenswrapper[4781]: I1202 09:57:55.044587 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/305a2fb1-c716-4f4f-9168-03df5cf5ff6a-utilities\") pod \"certified-operators-hn9h4\" (UID: \"305a2fb1-c716-4f4f-9168-03df5cf5ff6a\") " pod="openshift-marketplace/certified-operators-hn9h4" Dec 02 09:57:55 crc kubenswrapper[4781]: I1202 09:57:55.045187 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/305a2fb1-c716-4f4f-9168-03df5cf5ff6a-catalog-content\") pod \"certified-operators-hn9h4\" (UID: \"305a2fb1-c716-4f4f-9168-03df5cf5ff6a\") " pod="openshift-marketplace/certified-operators-hn9h4" Dec 02 09:57:55 crc kubenswrapper[4781]: I1202 09:57:55.070247 4781 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-5jdhj\" (UniqueName: \"kubernetes.io/projected/305a2fb1-c716-4f4f-9168-03df5cf5ff6a-kube-api-access-5jdhj\") pod \"certified-operators-hn9h4\" (UID: \"305a2fb1-c716-4f4f-9168-03df5cf5ff6a\") " pod="openshift-marketplace/certified-operators-hn9h4" Dec 02 09:57:55 crc kubenswrapper[4781]: I1202 09:57:55.073462 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hn9h4" Dec 02 09:57:55 crc kubenswrapper[4781]: I1202 09:57:55.638447 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hn9h4"] Dec 02 09:57:55 crc kubenswrapper[4781]: W1202 09:57:55.638883 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod305a2fb1_c716_4f4f_9168_03df5cf5ff6a.slice/crio-c0af9f29325f4a6b46fc4d866ff5416b48aea5fe3a1281863160c6745147811c WatchSource:0}: Error finding container c0af9f29325f4a6b46fc4d866ff5416b48aea5fe3a1281863160c6745147811c: Status 404 returned error can't find the container with id c0af9f29325f4a6b46fc4d866ff5416b48aea5fe3a1281863160c6745147811c Dec 02 09:57:55 crc kubenswrapper[4781]: I1202 09:57:55.744359 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hn9h4" event={"ID":"305a2fb1-c716-4f4f-9168-03df5cf5ff6a","Type":"ContainerStarted","Data":"c0af9f29325f4a6b46fc4d866ff5416b48aea5fe3a1281863160c6745147811c"} Dec 02 09:57:56 crc kubenswrapper[4781]: I1202 09:57:56.753638 4781 generic.go:334] "Generic (PLEG): container finished" podID="305a2fb1-c716-4f4f-9168-03df5cf5ff6a" containerID="f5c382d6189da44643e82dec7cdac70bab62b6478f998c4341235a34ca41857d" exitCode=0 Dec 02 09:57:56 crc kubenswrapper[4781]: I1202 09:57:56.753684 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hn9h4" event={"ID":"305a2fb1-c716-4f4f-9168-03df5cf5ff6a","Type":"ContainerDied","Data":"f5c382d6189da44643e82dec7cdac70bab62b6478f998c4341235a34ca41857d"} Dec 02 09:57:56 crc kubenswrapper[4781]: I1202 09:57:56.758240 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2rz4l" event={"ID":"bdef86d0-746d-496a-bb85-9f107917bdb2","Type":"ContainerStarted","Data":"f115627be11d9d8b0d39a606d981e7e7efe7a655fadb57a08f5db823f52d202e"} Dec 02 09:57:58 crc kubenswrapper[4781]: I1202 09:57:58.779032 4781 generic.go:334] "Generic (PLEG): container finished" podID="bdef86d0-746d-496a-bb85-9f107917bdb2" containerID="f115627be11d9d8b0d39a606d981e7e7efe7a655fadb57a08f5db823f52d202e" exitCode=0 Dec 02 09:57:58 crc kubenswrapper[4781]: I1202 09:57:58.779086 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2rz4l" event={"ID":"bdef86d0-746d-496a-bb85-9f107917bdb2","Type":"ContainerDied","Data":"f115627be11d9d8b0d39a606d981e7e7efe7a655fadb57a08f5db823f52d202e"} Dec 02 09:57:59 crc kubenswrapper[4781]: I1202 09:57:59.792960 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hn9h4" event={"ID":"305a2fb1-c716-4f4f-9168-03df5cf5ff6a","Type":"ContainerStarted","Data":"ddbe2573605c1cdb12ef8708bea1ab474d287742b08404244aa06a7b61cf289f"} Dec 02 09:58:00 crc kubenswrapper[4781]: I1202 09:58:00.815674 4781 generic.go:334] "Generic (PLEG): container finished" podID="305a2fb1-c716-4f4f-9168-03df5cf5ff6a" 
containerID="ddbe2573605c1cdb12ef8708bea1ab474d287742b08404244aa06a7b61cf289f" exitCode=0 Dec 02 09:58:00 crc kubenswrapper[4781]: I1202 09:58:00.815718 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hn9h4" event={"ID":"305a2fb1-c716-4f4f-9168-03df5cf5ff6a","Type":"ContainerDied","Data":"ddbe2573605c1cdb12ef8708bea1ab474d287742b08404244aa06a7b61cf289f"} Dec 02 09:58:00 crc kubenswrapper[4781]: I1202 09:58:00.824804 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2rz4l" event={"ID":"bdef86d0-746d-496a-bb85-9f107917bdb2","Type":"ContainerStarted","Data":"0bc307022a6aad52f346769553f57cf5d0c443b02607ca9ddd950e8097d843a5"} Dec 02 09:58:00 crc kubenswrapper[4781]: I1202 09:58:00.852833 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2rz4l" podStartSLOduration=3.864066142 podStartE2EDuration="8.852812683s" podCreationTimestamp="2025-12-02 09:57:52 +0000 UTC" firstStartedPulling="2025-12-02 09:57:54.737475251 +0000 UTC m=+2237.561349130" lastFinishedPulling="2025-12-02 09:57:59.726221792 +0000 UTC m=+2242.550095671" observedRunningTime="2025-12-02 09:58:00.848870178 +0000 UTC m=+2243.672744057" watchObservedRunningTime="2025-12-02 09:58:00.852812683 +0000 UTC m=+2243.676686562" Dec 02 09:58:02 crc kubenswrapper[4781]: I1202 09:58:02.842022 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hn9h4" event={"ID":"305a2fb1-c716-4f4f-9168-03df5cf5ff6a","Type":"ContainerStarted","Data":"811b7f0996ab9579be7dc6ad3b664c6509eaeabb73b52a7b096d86ec4c05d07e"} Dec 02 09:58:02 crc kubenswrapper[4781]: I1202 09:58:02.863651 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hn9h4" podStartSLOduration=3.904431591 podStartE2EDuration="8.863630733s" podCreationTimestamp="2025-12-02 09:57:54 +0000 UTC" firstStartedPulling="2025-12-02 09:57:56.755362789 +0000 UTC m=+2239.579236668" lastFinishedPulling="2025-12-02 09:58:01.714561931 +0000 UTC m=+2244.538435810" observedRunningTime="2025-12-02 09:58:02.862585845 +0000 UTC m=+2245.686459724" watchObservedRunningTime="2025-12-02 09:58:02.863630733 +0000 UTC m=+2245.687504612" Dec 02 09:58:03 crc kubenswrapper[4781]: I1202 09:58:03.077529 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2rz4l" Dec 02 09:58:03 crc kubenswrapper[4781]: I1202 09:58:03.077896 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2rz4l" Dec 02 09:58:04 crc kubenswrapper[4781]: I1202 09:58:04.140044 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2rz4l" podUID="bdef86d0-746d-496a-bb85-9f107917bdb2" containerName="registry-server" probeResult="failure" output=< Dec 02 09:58:04 crc kubenswrapper[4781]: timeout: failed to connect service ":50051" within 1s Dec 02 09:58:04 crc kubenswrapper[4781]: > Dec 02 09:58:05 crc kubenswrapper[4781]: I1202 09:58:05.074162 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hn9h4" Dec 02 09:58:05 crc kubenswrapper[4781]: I1202 09:58:05.074569 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hn9h4" Dec 02 09:58:05 crc kubenswrapper[4781]: I1202 09:58:05.127212 
4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hn9h4" Dec 02 09:58:13 crc kubenswrapper[4781]: I1202 09:58:13.122938 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2rz4l" Dec 02 09:58:13 crc kubenswrapper[4781]: I1202 09:58:13.171868 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2rz4l" Dec 02 09:58:13 crc kubenswrapper[4781]: I1202 09:58:13.354786 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2rz4l"] Dec 02 09:58:14 crc kubenswrapper[4781]: I1202 09:58:14.059547 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-mhsth"] Dec 02 09:58:14 crc kubenswrapper[4781]: I1202 09:58:14.068301 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-mhsth"] Dec 02 09:58:14 crc kubenswrapper[4781]: I1202 09:58:14.968995 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2rz4l" podUID="bdef86d0-746d-496a-bb85-9f107917bdb2" containerName="registry-server" containerID="cri-o://0bc307022a6aad52f346769553f57cf5d0c443b02607ca9ddd950e8097d843a5" gracePeriod=2 Dec 02 09:58:15 crc kubenswrapper[4781]: I1202 09:58:15.127820 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hn9h4" Dec 02 09:58:15 crc kubenswrapper[4781]: I1202 09:58:15.477810 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2rz4l" Dec 02 09:58:15 crc kubenswrapper[4781]: I1202 09:58:15.515960 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e7e36fe-b5d1-4dfa-827a-0bea1b098c07" path="/var/lib/kubelet/pods/7e7e36fe-b5d1-4dfa-827a-0bea1b098c07/volumes" Dec 02 09:58:15 crc kubenswrapper[4781]: I1202 09:58:15.551593 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpdrq\" (UniqueName: \"kubernetes.io/projected/bdef86d0-746d-496a-bb85-9f107917bdb2-kube-api-access-rpdrq\") pod \"bdef86d0-746d-496a-bb85-9f107917bdb2\" (UID: \"bdef86d0-746d-496a-bb85-9f107917bdb2\") " Dec 02 09:58:15 crc kubenswrapper[4781]: I1202 09:58:15.551669 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdef86d0-746d-496a-bb85-9f107917bdb2-utilities\") pod \"bdef86d0-746d-496a-bb85-9f107917bdb2\" (UID: \"bdef86d0-746d-496a-bb85-9f107917bdb2\") " Dec 02 09:58:15 crc kubenswrapper[4781]: I1202 09:58:15.551878 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdef86d0-746d-496a-bb85-9f107917bdb2-catalog-content\") pod \"bdef86d0-746d-496a-bb85-9f107917bdb2\" (UID: \"bdef86d0-746d-496a-bb85-9f107917bdb2\") " Dec 02 09:58:15 crc kubenswrapper[4781]: I1202 09:58:15.552978 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdef86d0-746d-496a-bb85-9f107917bdb2-utilities" (OuterVolumeSpecName: "utilities") pod "bdef86d0-746d-496a-bb85-9f107917bdb2" (UID: "bdef86d0-746d-496a-bb85-9f107917bdb2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:58:15 crc kubenswrapper[4781]: I1202 09:58:15.562202 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdef86d0-746d-496a-bb85-9f107917bdb2-kube-api-access-rpdrq" (OuterVolumeSpecName: "kube-api-access-rpdrq") pod "bdef86d0-746d-496a-bb85-9f107917bdb2" (UID: "bdef86d0-746d-496a-bb85-9f107917bdb2"). InnerVolumeSpecName "kube-api-access-rpdrq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:58:15 crc kubenswrapper[4781]: I1202 09:58:15.653850 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpdrq\" (UniqueName: \"kubernetes.io/projected/bdef86d0-746d-496a-bb85-9f107917bdb2-kube-api-access-rpdrq\") on node \"crc\" DevicePath \"\"" Dec 02 09:58:15 crc kubenswrapper[4781]: I1202 09:58:15.653886 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdef86d0-746d-496a-bb85-9f107917bdb2-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 09:58:15 crc kubenswrapper[4781]: I1202 09:58:15.675692 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdef86d0-746d-496a-bb85-9f107917bdb2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bdef86d0-746d-496a-bb85-9f107917bdb2" (UID: "bdef86d0-746d-496a-bb85-9f107917bdb2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:58:15 crc kubenswrapper[4781]: I1202 09:58:15.755315 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdef86d0-746d-496a-bb85-9f107917bdb2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 09:58:15 crc kubenswrapper[4781]: I1202 09:58:15.955510 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hn9h4"] Dec 02 09:58:15 crc kubenswrapper[4781]: I1202 09:58:15.978404 4781 generic.go:334] "Generic (PLEG): container finished" podID="bdef86d0-746d-496a-bb85-9f107917bdb2" containerID="0bc307022a6aad52f346769553f57cf5d0c443b02607ca9ddd950e8097d843a5" exitCode=0 Dec 02 09:58:15 crc kubenswrapper[4781]: I1202 09:58:15.978472 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2rz4l" event={"ID":"bdef86d0-746d-496a-bb85-9f107917bdb2","Type":"ContainerDied","Data":"0bc307022a6aad52f346769553f57cf5d0c443b02607ca9ddd950e8097d843a5"} Dec 02 09:58:15 crc kubenswrapper[4781]: I1202 09:58:15.978516 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2rz4l" event={"ID":"bdef86d0-746d-496a-bb85-9f107917bdb2","Type":"ContainerDied","Data":"2b2f1e4f4d30cfcf756a3a73cc7e10d024e688b1291c6d355ba06e7296235716"} Dec 02 09:58:15 crc kubenswrapper[4781]: I1202 09:58:15.978514 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2rz4l" Dec 02 09:58:15 crc kubenswrapper[4781]: I1202 09:58:15.978558 4781 scope.go:117] "RemoveContainer" containerID="0bc307022a6aad52f346769553f57cf5d0c443b02607ca9ddd950e8097d843a5" Dec 02 09:58:15 crc kubenswrapper[4781]: I1202 09:58:15.978623 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hn9h4" podUID="305a2fb1-c716-4f4f-9168-03df5cf5ff6a" containerName="registry-server" containerID="cri-o://811b7f0996ab9579be7dc6ad3b664c6509eaeabb73b52a7b096d86ec4c05d07e" gracePeriod=2 Dec 02 09:58:16 crc kubenswrapper[4781]: I1202 09:58:16.001658 4781 scope.go:117] "RemoveContainer" containerID="f115627be11d9d8b0d39a606d981e7e7efe7a655fadb57a08f5db823f52d202e" Dec 02 09:58:16 crc kubenswrapper[4781]: I1202 09:58:16.012000 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2rz4l"] Dec 02 09:58:16 crc kubenswrapper[4781]: I1202 09:58:16.019345 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2rz4l"] Dec 02 09:58:16 crc kubenswrapper[4781]: I1202 09:58:16.041207 4781 scope.go:117] "RemoveContainer" containerID="6acacb9ec6f1d9bfa9e075a19cc6321bc8687cf3b08b9c5116e34eab9fb6d894" Dec 02 09:58:16 crc kubenswrapper[4781]: I1202 09:58:16.190752 4781 scope.go:117] "RemoveContainer" containerID="0bc307022a6aad52f346769553f57cf5d0c443b02607ca9ddd950e8097d843a5" Dec 02 09:58:16 crc kubenswrapper[4781]: E1202 09:58:16.191333 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bc307022a6aad52f346769553f57cf5d0c443b02607ca9ddd950e8097d843a5\": container with ID starting with 0bc307022a6aad52f346769553f57cf5d0c443b02607ca9ddd950e8097d843a5 not found: ID does not exist" containerID="0bc307022a6aad52f346769553f57cf5d0c443b02607ca9ddd950e8097d843a5" Dec 02 09:58:16 crc kubenswrapper[4781]: I1202 09:58:16.191402 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bc307022a6aad52f346769553f57cf5d0c443b02607ca9ddd950e8097d843a5"} err="failed to get container status \"0bc307022a6aad52f346769553f57cf5d0c443b02607ca9ddd950e8097d843a5\": rpc error: code = NotFound desc = could not find container \"0bc307022a6aad52f346769553f57cf5d0c443b02607ca9ddd950e8097d843a5\": container with ID starting with 0bc307022a6aad52f346769553f57cf5d0c443b02607ca9ddd950e8097d843a5 not found: ID does not exist" Dec 02 09:58:16 crc kubenswrapper[4781]: I1202 09:58:16.191426 4781 scope.go:117] "RemoveContainer" containerID="f115627be11d9d8b0d39a606d981e7e7efe7a655fadb57a08f5db823f52d202e" Dec 02 09:58:16 crc kubenswrapper[4781]: E1202 09:58:16.191937 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f115627be11d9d8b0d39a606d981e7e7efe7a655fadb57a08f5db823f52d202e\": container with ID starting with f115627be11d9d8b0d39a606d981e7e7efe7a655fadb57a08f5db823f52d202e not found: ID does not exist" containerID="f115627be11d9d8b0d39a606d981e7e7efe7a655fadb57a08f5db823f52d202e" Dec 02 09:58:16 crc kubenswrapper[4781]: I1202 09:58:16.191968 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f115627be11d9d8b0d39a606d981e7e7efe7a655fadb57a08f5db823f52d202e"} err="failed to get container status \"f115627be11d9d8b0d39a606d981e7e7efe7a655fadb57a08f5db823f52d202e\": rpc error: 
code = NotFound desc = could not find container \"f115627be11d9d8b0d39a606d981e7e7efe7a655fadb57a08f5db823f52d202e\": container with ID starting with f115627be11d9d8b0d39a606d981e7e7efe7a655fadb57a08f5db823f52d202e not found: ID does not exist" Dec 02 09:58:16 crc kubenswrapper[4781]: I1202 09:58:16.191985 4781 scope.go:117] "RemoveContainer" containerID="6acacb9ec6f1d9bfa9e075a19cc6321bc8687cf3b08b9c5116e34eab9fb6d894" Dec 02 09:58:16 crc kubenswrapper[4781]: E1202 09:58:16.192306 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6acacb9ec6f1d9bfa9e075a19cc6321bc8687cf3b08b9c5116e34eab9fb6d894\": container with ID starting with 6acacb9ec6f1d9bfa9e075a19cc6321bc8687cf3b08b9c5116e34eab9fb6d894 not found: ID does not exist" containerID="6acacb9ec6f1d9bfa9e075a19cc6321bc8687cf3b08b9c5116e34eab9fb6d894" Dec 02 09:58:16 crc kubenswrapper[4781]: I1202 09:58:16.192364 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6acacb9ec6f1d9bfa9e075a19cc6321bc8687cf3b08b9c5116e34eab9fb6d894"} err="failed to get container status \"6acacb9ec6f1d9bfa9e075a19cc6321bc8687cf3b08b9c5116e34eab9fb6d894\": rpc error: code = NotFound desc = could not find container \"6acacb9ec6f1d9bfa9e075a19cc6321bc8687cf3b08b9c5116e34eab9fb6d894\": container with ID starting with 6acacb9ec6f1d9bfa9e075a19cc6321bc8687cf3b08b9c5116e34eab9fb6d894 not found: ID does not exist" Dec 02 09:58:16 crc kubenswrapper[4781]: I1202 09:58:16.430209 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hn9h4" Dec 02 09:58:16 crc kubenswrapper[4781]: I1202 09:58:16.467803 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/305a2fb1-c716-4f4f-9168-03df5cf5ff6a-utilities\") pod \"305a2fb1-c716-4f4f-9168-03df5cf5ff6a\" (UID: \"305a2fb1-c716-4f4f-9168-03df5cf5ff6a\") " Dec 02 09:58:16 crc kubenswrapper[4781]: I1202 09:58:16.467982 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jdhj\" (UniqueName: \"kubernetes.io/projected/305a2fb1-c716-4f4f-9168-03df5cf5ff6a-kube-api-access-5jdhj\") pod \"305a2fb1-c716-4f4f-9168-03df5cf5ff6a\" (UID: \"305a2fb1-c716-4f4f-9168-03df5cf5ff6a\") " Dec 02 09:58:16 crc kubenswrapper[4781]: I1202 09:58:16.468193 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/305a2fb1-c716-4f4f-9168-03df5cf5ff6a-catalog-content\") pod \"305a2fb1-c716-4f4f-9168-03df5cf5ff6a\" (UID: \"305a2fb1-c716-4f4f-9168-03df5cf5ff6a\") " Dec 02 09:58:16 crc kubenswrapper[4781]: I1202 09:58:16.468674 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/305a2fb1-c716-4f4f-9168-03df5cf5ff6a-utilities" (OuterVolumeSpecName: "utilities") pod "305a2fb1-c716-4f4f-9168-03df5cf5ff6a" (UID: "305a2fb1-c716-4f4f-9168-03df5cf5ff6a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:58:16 crc kubenswrapper[4781]: I1202 09:58:16.472065 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/305a2fb1-c716-4f4f-9168-03df5cf5ff6a-kube-api-access-5jdhj" (OuterVolumeSpecName: "kube-api-access-5jdhj") pod "305a2fb1-c716-4f4f-9168-03df5cf5ff6a" (UID: "305a2fb1-c716-4f4f-9168-03df5cf5ff6a"). 
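Each "ContainerStatus from runtime service failed ... NotFound" / "DeleteContainer returned error" pair above is a benign race, not a real failure: the container had already been purged, so the follow-up status lookup finds nothing and kubelet simply logs it and moves on. Callers of delete-style APIs typically treat NotFound as success; a sketch of that idempotent pattern (plain Python, with a hypothetical runtime client standing in for the CRI API):

    class NotFoundError(Exception):
        """Stand-in for the CRI's rpc error: code = NotFound."""

    def remove_container(runtime, container_id: str) -> None:
        # Idempotent delete: a container that is already gone is not a failure.
        try:
            runtime.remove_container(container_id)
        except NotFoundError:
            pass  # already removed, e.g. by an earlier RemoveContainer pass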
InnerVolumeSpecName "kube-api-access-5jdhj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:58:16 crc kubenswrapper[4781]: I1202 09:58:16.518411 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/305a2fb1-c716-4f4f-9168-03df5cf5ff6a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "305a2fb1-c716-4f4f-9168-03df5cf5ff6a" (UID: "305a2fb1-c716-4f4f-9168-03df5cf5ff6a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:58:16 crc kubenswrapper[4781]: I1202 09:58:16.570695 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jdhj\" (UniqueName: \"kubernetes.io/projected/305a2fb1-c716-4f4f-9168-03df5cf5ff6a-kube-api-access-5jdhj\") on node \"crc\" DevicePath \"\"" Dec 02 09:58:16 crc kubenswrapper[4781]: I1202 09:58:16.570731 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/305a2fb1-c716-4f4f-9168-03df5cf5ff6a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 09:58:16 crc kubenswrapper[4781]: I1202 09:58:16.570742 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/305a2fb1-c716-4f4f-9168-03df5cf5ff6a-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 09:58:16 crc kubenswrapper[4781]: I1202 09:58:16.987567 4781 generic.go:334] "Generic (PLEG): container finished" podID="305a2fb1-c716-4f4f-9168-03df5cf5ff6a" containerID="811b7f0996ab9579be7dc6ad3b664c6509eaeabb73b52a7b096d86ec4c05d07e" exitCode=0 Dec 02 09:58:16 crc kubenswrapper[4781]: I1202 09:58:16.987613 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hn9h4" event={"ID":"305a2fb1-c716-4f4f-9168-03df5cf5ff6a","Type":"ContainerDied","Data":"811b7f0996ab9579be7dc6ad3b664c6509eaeabb73b52a7b096d86ec4c05d07e"} Dec 02 09:58:16 crc kubenswrapper[4781]: I1202 09:58:16.987975 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hn9h4" event={"ID":"305a2fb1-c716-4f4f-9168-03df5cf5ff6a","Type":"ContainerDied","Data":"c0af9f29325f4a6b46fc4d866ff5416b48aea5fe3a1281863160c6745147811c"} Dec 02 09:58:16 crc kubenswrapper[4781]: I1202 09:58:16.987998 4781 scope.go:117] "RemoveContainer" containerID="811b7f0996ab9579be7dc6ad3b664c6509eaeabb73b52a7b096d86ec4c05d07e" Dec 02 09:58:16 crc kubenswrapper[4781]: I1202 09:58:16.987632 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hn9h4" Dec 02 09:58:17 crc kubenswrapper[4781]: I1202 09:58:17.022155 4781 scope.go:117] "RemoveContainer" containerID="ddbe2573605c1cdb12ef8708bea1ab474d287742b08404244aa06a7b61cf289f" Dec 02 09:58:17 crc kubenswrapper[4781]: I1202 09:58:17.041411 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hn9h4"] Dec 02 09:58:17 crc kubenswrapper[4781]: I1202 09:58:17.048429 4781 scope.go:117] "RemoveContainer" containerID="f5c382d6189da44643e82dec7cdac70bab62b6478f998c4341235a34ca41857d" Dec 02 09:58:17 crc kubenswrapper[4781]: I1202 09:58:17.049281 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hn9h4"] Dec 02 09:58:17 crc kubenswrapper[4781]: I1202 09:58:17.069546 4781 scope.go:117] "RemoveContainer" containerID="811b7f0996ab9579be7dc6ad3b664c6509eaeabb73b52a7b096d86ec4c05d07e" Dec 02 09:58:17 crc kubenswrapper[4781]: E1202 09:58:17.071353 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"811b7f0996ab9579be7dc6ad3b664c6509eaeabb73b52a7b096d86ec4c05d07e\": container with ID starting with 811b7f0996ab9579be7dc6ad3b664c6509eaeabb73b52a7b096d86ec4c05d07e not found: ID does not exist" containerID="811b7f0996ab9579be7dc6ad3b664c6509eaeabb73b52a7b096d86ec4c05d07e" Dec 02 09:58:17 crc kubenswrapper[4781]: I1202 09:58:17.071395 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"811b7f0996ab9579be7dc6ad3b664c6509eaeabb73b52a7b096d86ec4c05d07e"} err="failed to get container status \"811b7f0996ab9579be7dc6ad3b664c6509eaeabb73b52a7b096d86ec4c05d07e\": rpc error: code = NotFound desc = could not find container \"811b7f0996ab9579be7dc6ad3b664c6509eaeabb73b52a7b096d86ec4c05d07e\": container with ID starting with 811b7f0996ab9579be7dc6ad3b664c6509eaeabb73b52a7b096d86ec4c05d07e not found: ID does not exist" Dec 02 09:58:17 crc kubenswrapper[4781]: I1202 09:58:17.071423 4781 scope.go:117] "RemoveContainer" containerID="ddbe2573605c1cdb12ef8708bea1ab474d287742b08404244aa06a7b61cf289f" Dec 02 09:58:17 crc kubenswrapper[4781]: E1202 09:58:17.072047 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddbe2573605c1cdb12ef8708bea1ab474d287742b08404244aa06a7b61cf289f\": container with ID starting with ddbe2573605c1cdb12ef8708bea1ab474d287742b08404244aa06a7b61cf289f not found: ID does not exist" containerID="ddbe2573605c1cdb12ef8708bea1ab474d287742b08404244aa06a7b61cf289f" Dec 02 09:58:17 crc kubenswrapper[4781]: I1202 09:58:17.072079 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddbe2573605c1cdb12ef8708bea1ab474d287742b08404244aa06a7b61cf289f"} err="failed to get container status \"ddbe2573605c1cdb12ef8708bea1ab474d287742b08404244aa06a7b61cf289f\": rpc error: code = NotFound desc = could not find container \"ddbe2573605c1cdb12ef8708bea1ab474d287742b08404244aa06a7b61cf289f\": container with ID starting with ddbe2573605c1cdb12ef8708bea1ab474d287742b08404244aa06a7b61cf289f not found: ID does not exist" Dec 02 09:58:17 crc kubenswrapper[4781]: I1202 09:58:17.072098 4781 scope.go:117] "RemoveContainer" containerID="f5c382d6189da44643e82dec7cdac70bab62b6478f998c4341235a34ca41857d" Dec 02 09:58:17 crc kubenswrapper[4781]: E1202 09:58:17.072455 4781 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"f5c382d6189da44643e82dec7cdac70bab62b6478f998c4341235a34ca41857d\": container with ID starting with f5c382d6189da44643e82dec7cdac70bab62b6478f998c4341235a34ca41857d not found: ID does not exist" containerID="f5c382d6189da44643e82dec7cdac70bab62b6478f998c4341235a34ca41857d" Dec 02 09:58:17 crc kubenswrapper[4781]: I1202 09:58:17.072474 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5c382d6189da44643e82dec7cdac70bab62b6478f998c4341235a34ca41857d"} err="failed to get container status \"f5c382d6189da44643e82dec7cdac70bab62b6478f998c4341235a34ca41857d\": rpc error: code = NotFound desc = could not find container \"f5c382d6189da44643e82dec7cdac70bab62b6478f998c4341235a34ca41857d\": container with ID starting with f5c382d6189da44643e82dec7cdac70bab62b6478f998c4341235a34ca41857d not found: ID does not exist" Dec 02 09:58:17 crc kubenswrapper[4781]: I1202 09:58:17.532116 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="305a2fb1-c716-4f4f-9168-03df5cf5ff6a" path="/var/lib/kubelet/pods/305a2fb1-c716-4f4f-9168-03df5cf5ff6a/volumes" Dec 02 09:58:17 crc kubenswrapper[4781]: I1202 09:58:17.533019 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdef86d0-746d-496a-bb85-9f107917bdb2" path="/var/lib/kubelet/pods/bdef86d0-746d-496a-bb85-9f107917bdb2/volumes" Dec 02 09:58:25 crc kubenswrapper[4781]: I1202 09:58:25.107568 4781 generic.go:334] "Generic (PLEG): container finished" podID="0fcef48b-9cfd-4f26-9964-3a083b035119" containerID="1cefba4bc2ef8a39b77c73d1a8b2ddc43360578c511cdca0abd8622d0b075e75" exitCode=0 Dec 02 09:58:25 crc kubenswrapper[4781]: I1202 09:58:25.107639 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p6f2z" event={"ID":"0fcef48b-9cfd-4f26-9964-3a083b035119","Type":"ContainerDied","Data":"1cefba4bc2ef8a39b77c73d1a8b2ddc43360578c511cdca0abd8622d0b075e75"} Dec 02 09:58:26 crc kubenswrapper[4781]: I1202 09:58:26.546804 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p6f2z" Dec 02 09:58:26 crc kubenswrapper[4781]: I1202 09:58:26.730961 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2x94\" (UniqueName: \"kubernetes.io/projected/0fcef48b-9cfd-4f26-9964-3a083b035119-kube-api-access-c2x94\") pod \"0fcef48b-9cfd-4f26-9964-3a083b035119\" (UID: \"0fcef48b-9cfd-4f26-9964-3a083b035119\") " Dec 02 09:58:26 crc kubenswrapper[4781]: I1202 09:58:26.731619 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0fcef48b-9cfd-4f26-9964-3a083b035119-ssh-key\") pod \"0fcef48b-9cfd-4f26-9964-3a083b035119\" (UID: \"0fcef48b-9cfd-4f26-9964-3a083b035119\") " Dec 02 09:58:26 crc kubenswrapper[4781]: I1202 09:58:26.731706 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0fcef48b-9cfd-4f26-9964-3a083b035119-inventory\") pod \"0fcef48b-9cfd-4f26-9964-3a083b035119\" (UID: \"0fcef48b-9cfd-4f26-9964-3a083b035119\") " Dec 02 09:58:26 crc kubenswrapper[4781]: I1202 09:58:26.737709 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fcef48b-9cfd-4f26-9964-3a083b035119-kube-api-access-c2x94" (OuterVolumeSpecName: "kube-api-access-c2x94") pod "0fcef48b-9cfd-4f26-9964-3a083b035119" (UID: "0fcef48b-9cfd-4f26-9964-3a083b035119"). InnerVolumeSpecName "kube-api-access-c2x94". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:58:26 crc kubenswrapper[4781]: I1202 09:58:26.759808 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fcef48b-9cfd-4f26-9964-3a083b035119-inventory" (OuterVolumeSpecName: "inventory") pod "0fcef48b-9cfd-4f26-9964-3a083b035119" (UID: "0fcef48b-9cfd-4f26-9964-3a083b035119"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:58:26 crc kubenswrapper[4781]: I1202 09:58:26.780030 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fcef48b-9cfd-4f26-9964-3a083b035119-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0fcef48b-9cfd-4f26-9964-3a083b035119" (UID: "0fcef48b-9cfd-4f26-9964-3a083b035119"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:58:26 crc kubenswrapper[4781]: I1202 09:58:26.835053 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0fcef48b-9cfd-4f26-9964-3a083b035119-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 09:58:26 crc kubenswrapper[4781]: I1202 09:58:26.835288 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2x94\" (UniqueName: \"kubernetes.io/projected/0fcef48b-9cfd-4f26-9964-3a083b035119-kube-api-access-c2x94\") on node \"crc\" DevicePath \"\"" Dec 02 09:58:26 crc kubenswrapper[4781]: I1202 09:58:26.835422 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0fcef48b-9cfd-4f26-9964-3a083b035119-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 09:58:27 crc kubenswrapper[4781]: I1202 09:58:27.144386 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p6f2z" event={"ID":"0fcef48b-9cfd-4f26-9964-3a083b035119","Type":"ContainerDied","Data":"77edeb72236596e1f3a768e9bb0e175a12cce68331c88a07fbfcfd0517bceddf"} Dec 02 09:58:27 crc kubenswrapper[4781]: I1202 09:58:27.144445 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77edeb72236596e1f3a768e9bb0e175a12cce68331c88a07fbfcfd0517bceddf" Dec 02 09:58:27 crc kubenswrapper[4781]: I1202 09:58:27.144560 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p6f2z" Dec 02 09:58:27 crc kubenswrapper[4781]: I1202 09:58:27.209264 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-cxdx6"] Dec 02 09:58:27 crc kubenswrapper[4781]: E1202 09:58:27.209713 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdef86d0-746d-496a-bb85-9f107917bdb2" containerName="extract-content" Dec 02 09:58:27 crc kubenswrapper[4781]: I1202 09:58:27.209730 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdef86d0-746d-496a-bb85-9f107917bdb2" containerName="extract-content" Dec 02 09:58:27 crc kubenswrapper[4781]: E1202 09:58:27.209744 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdef86d0-746d-496a-bb85-9f107917bdb2" containerName="registry-server" Dec 02 09:58:27 crc kubenswrapper[4781]: I1202 09:58:27.209751 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdef86d0-746d-496a-bb85-9f107917bdb2" containerName="registry-server" Dec 02 09:58:27 crc kubenswrapper[4781]: E1202 09:58:27.209776 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="305a2fb1-c716-4f4f-9168-03df5cf5ff6a" containerName="registry-server" Dec 02 09:58:27 crc kubenswrapper[4781]: I1202 09:58:27.209783 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="305a2fb1-c716-4f4f-9168-03df5cf5ff6a" containerName="registry-server" Dec 02 09:58:27 crc kubenswrapper[4781]: E1202 09:58:27.209794 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="305a2fb1-c716-4f4f-9168-03df5cf5ff6a" containerName="extract-content" Dec 02 09:58:27 crc kubenswrapper[4781]: I1202 09:58:27.209800 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="305a2fb1-c716-4f4f-9168-03df5cf5ff6a" containerName="extract-content" Dec 02 09:58:27 crc kubenswrapper[4781]: E1202 09:58:27.209811 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fcef48b-9cfd-4f26-9964-3a083b035119" 
containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 02 09:58:27 crc kubenswrapper[4781]: I1202 09:58:27.209818 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fcef48b-9cfd-4f26-9964-3a083b035119" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 02 09:58:27 crc kubenswrapper[4781]: E1202 09:58:27.209827 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="305a2fb1-c716-4f4f-9168-03df5cf5ff6a" containerName="extract-utilities" Dec 02 09:58:27 crc kubenswrapper[4781]: I1202 09:58:27.209833 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="305a2fb1-c716-4f4f-9168-03df5cf5ff6a" containerName="extract-utilities" Dec 02 09:58:27 crc kubenswrapper[4781]: E1202 09:58:27.209845 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdef86d0-746d-496a-bb85-9f107917bdb2" containerName="extract-utilities" Dec 02 09:58:27 crc kubenswrapper[4781]: I1202 09:58:27.209852 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdef86d0-746d-496a-bb85-9f107917bdb2" containerName="extract-utilities" Dec 02 09:58:27 crc kubenswrapper[4781]: I1202 09:58:27.210032 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdef86d0-746d-496a-bb85-9f107917bdb2" containerName="registry-server" Dec 02 09:58:27 crc kubenswrapper[4781]: I1202 09:58:27.210044 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="305a2fb1-c716-4f4f-9168-03df5cf5ff6a" containerName="registry-server" Dec 02 09:58:27 crc kubenswrapper[4781]: I1202 09:58:27.210063 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fcef48b-9cfd-4f26-9964-3a083b035119" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 02 09:58:27 crc kubenswrapper[4781]: I1202 09:58:27.210683 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-cxdx6" Dec 02 09:58:27 crc kubenswrapper[4781]: I1202 09:58:27.213433 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 09:58:27 crc kubenswrapper[4781]: I1202 09:58:27.213640 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 09:58:27 crc kubenswrapper[4781]: I1202 09:58:27.213683 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 09:58:27 crc kubenswrapper[4781]: I1202 09:58:27.213816 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zl624" Dec 02 09:58:27 crc kubenswrapper[4781]: I1202 09:58:27.218209 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-cxdx6"] Dec 02 09:58:27 crc kubenswrapper[4781]: I1202 09:58:27.242807 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxzbc\" (UniqueName: \"kubernetes.io/projected/6fe7b21a-17d2-432b-9045-e64643581770-kube-api-access-dxzbc\") pod \"ssh-known-hosts-edpm-deployment-cxdx6\" (UID: \"6fe7b21a-17d2-432b-9045-e64643581770\") " pod="openstack/ssh-known-hosts-edpm-deployment-cxdx6" Dec 02 09:58:27 crc kubenswrapper[4781]: I1202 09:58:27.242875 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6fe7b21a-17d2-432b-9045-e64643581770-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-cxdx6\" (UID: \"6fe7b21a-17d2-432b-9045-e64643581770\") " pod="openstack/ssh-known-hosts-edpm-deployment-cxdx6" Dec 02 09:58:27 crc kubenswrapper[4781]: I1202 09:58:27.242900 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/6fe7b21a-17d2-432b-9045-e64643581770-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-cxdx6\" (UID: \"6fe7b21a-17d2-432b-9045-e64643581770\") " pod="openstack/ssh-known-hosts-edpm-deployment-cxdx6" Dec 02 09:58:27 crc kubenswrapper[4781]: I1202 09:58:27.344399 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxzbc\" (UniqueName: \"kubernetes.io/projected/6fe7b21a-17d2-432b-9045-e64643581770-kube-api-access-dxzbc\") pod \"ssh-known-hosts-edpm-deployment-cxdx6\" (UID: \"6fe7b21a-17d2-432b-9045-e64643581770\") " pod="openstack/ssh-known-hosts-edpm-deployment-cxdx6" Dec 02 09:58:27 crc kubenswrapper[4781]: I1202 09:58:27.344487 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6fe7b21a-17d2-432b-9045-e64643581770-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-cxdx6\" (UID: \"6fe7b21a-17d2-432b-9045-e64643581770\") " pod="openstack/ssh-known-hosts-edpm-deployment-cxdx6" Dec 02 09:58:27 crc kubenswrapper[4781]: I1202 09:58:27.344511 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/6fe7b21a-17d2-432b-9045-e64643581770-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-cxdx6\" (UID: \"6fe7b21a-17d2-432b-9045-e64643581770\") " pod="openstack/ssh-known-hosts-edpm-deployment-cxdx6" Dec 02 09:58:27 crc 
Dec 02 09:58:27 crc kubenswrapper[4781]: I1202 09:58:27.349104 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6fe7b21a-17d2-432b-9045-e64643581770-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-cxdx6\" (UID: \"6fe7b21a-17d2-432b-9045-e64643581770\") " pod="openstack/ssh-known-hosts-edpm-deployment-cxdx6"
Dec 02 09:58:27 crc kubenswrapper[4781]: I1202 09:58:27.362805 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxzbc\" (UniqueName: \"kubernetes.io/projected/6fe7b21a-17d2-432b-9045-e64643581770-kube-api-access-dxzbc\") pod \"ssh-known-hosts-edpm-deployment-cxdx6\" (UID: \"6fe7b21a-17d2-432b-9045-e64643581770\") " pod="openstack/ssh-known-hosts-edpm-deployment-cxdx6"
Dec 02 09:58:27 crc kubenswrapper[4781]: I1202 09:58:27.539352 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-cxdx6"
Dec 02 09:58:28 crc kubenswrapper[4781]: I1202 09:58:28.083352 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-cxdx6"]
Dec 02 09:58:28 crc kubenswrapper[4781]: I1202 09:58:28.153390 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-cxdx6" event={"ID":"6fe7b21a-17d2-432b-9045-e64643581770","Type":"ContainerStarted","Data":"251f45b000fe15c06e1f358c2bff0e3b34fee34ec14b35b1e06db34450311b14"}
Dec 02 09:58:29 crc kubenswrapper[4781]: I1202 09:58:29.163814 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-cxdx6" event={"ID":"6fe7b21a-17d2-432b-9045-e64643581770","Type":"ContainerStarted","Data":"c9b146ce8b19ef41ad97936bbb4f5f3f4fd439c481bf3e58b337c90b6a4e74fb"}
Dec 02 09:58:29 crc kubenswrapper[4781]: I1202 09:58:29.205592 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-cxdx6" podStartSLOduration=1.507687943 podStartE2EDuration="2.205565273s" podCreationTimestamp="2025-12-02 09:58:27 +0000 UTC" firstStartedPulling="2025-12-02 09:58:28.086235496 +0000 UTC m=+2270.910109375" lastFinishedPulling="2025-12-02 09:58:28.784112826 +0000 UTC m=+2271.607986705" observedRunningTime="2025-12-02 09:58:29.196590203 +0000 UTC m=+2272.020464082" watchObservedRunningTime="2025-12-02 09:58:29.205565273 +0000 UTC m=+2272.029439162"
Dec 02 09:58:30 crc kubenswrapper[4781]: I1202 09:58:30.412543 4781 patch_prober.go:28] interesting pod/machine-config-daemon-pzntm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 09:58:30 crc kubenswrapper[4781]: I1202 09:58:30.412611 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
tcp 127.0.0.1:8798: connect: connection refused" Dec 02 09:58:36 crc kubenswrapper[4781]: I1202 09:58:36.223759 4781 generic.go:334] "Generic (PLEG): container finished" podID="6fe7b21a-17d2-432b-9045-e64643581770" containerID="c9b146ce8b19ef41ad97936bbb4f5f3f4fd439c481bf3e58b337c90b6a4e74fb" exitCode=0 Dec 02 09:58:36 crc kubenswrapper[4781]: I1202 09:58:36.223833 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-cxdx6" event={"ID":"6fe7b21a-17d2-432b-9045-e64643581770","Type":"ContainerDied","Data":"c9b146ce8b19ef41ad97936bbb4f5f3f4fd439c481bf3e58b337c90b6a4e74fb"} Dec 02 09:58:37 crc kubenswrapper[4781]: I1202 09:58:37.663191 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-cxdx6" Dec 02 09:58:37 crc kubenswrapper[4781]: I1202 09:58:37.762492 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6fe7b21a-17d2-432b-9045-e64643581770-ssh-key-openstack-edpm-ipam\") pod \"6fe7b21a-17d2-432b-9045-e64643581770\" (UID: \"6fe7b21a-17d2-432b-9045-e64643581770\") " Dec 02 09:58:37 crc kubenswrapper[4781]: I1202 09:58:37.763471 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/6fe7b21a-17d2-432b-9045-e64643581770-inventory-0\") pod \"6fe7b21a-17d2-432b-9045-e64643581770\" (UID: \"6fe7b21a-17d2-432b-9045-e64643581770\") " Dec 02 09:58:37 crc kubenswrapper[4781]: I1202 09:58:37.763779 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxzbc\" (UniqueName: \"kubernetes.io/projected/6fe7b21a-17d2-432b-9045-e64643581770-kube-api-access-dxzbc\") pod \"6fe7b21a-17d2-432b-9045-e64643581770\" (UID: \"6fe7b21a-17d2-432b-9045-e64643581770\") " Dec 02 09:58:37 crc kubenswrapper[4781]: I1202 09:58:37.768743 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fe7b21a-17d2-432b-9045-e64643581770-kube-api-access-dxzbc" (OuterVolumeSpecName: "kube-api-access-dxzbc") pod "6fe7b21a-17d2-432b-9045-e64643581770" (UID: "6fe7b21a-17d2-432b-9045-e64643581770"). InnerVolumeSpecName "kube-api-access-dxzbc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:58:37 crc kubenswrapper[4781]: I1202 09:58:37.792904 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fe7b21a-17d2-432b-9045-e64643581770-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6fe7b21a-17d2-432b-9045-e64643581770" (UID: "6fe7b21a-17d2-432b-9045-e64643581770"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:58:37 crc kubenswrapper[4781]: I1202 09:58:37.793294 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fe7b21a-17d2-432b-9045-e64643581770-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "6fe7b21a-17d2-432b-9045-e64643581770" (UID: "6fe7b21a-17d2-432b-9045-e64643581770"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:58:37 crc kubenswrapper[4781]: I1202 09:58:37.866638 4781 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/6fe7b21a-17d2-432b-9045-e64643581770-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 02 09:58:37 crc kubenswrapper[4781]: I1202 09:58:37.866679 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxzbc\" (UniqueName: \"kubernetes.io/projected/6fe7b21a-17d2-432b-9045-e64643581770-kube-api-access-dxzbc\") on node \"crc\" DevicePath \"\"" Dec 02 09:58:37 crc kubenswrapper[4781]: I1202 09:58:37.866696 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6fe7b21a-17d2-432b-9045-e64643581770-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 02 09:58:38 crc kubenswrapper[4781]: I1202 09:58:38.243702 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-cxdx6" event={"ID":"6fe7b21a-17d2-432b-9045-e64643581770","Type":"ContainerDied","Data":"251f45b000fe15c06e1f358c2bff0e3b34fee34ec14b35b1e06db34450311b14"} Dec 02 09:58:38 crc kubenswrapper[4781]: I1202 09:58:38.244085 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="251f45b000fe15c06e1f358c2bff0e3b34fee34ec14b35b1e06db34450311b14" Dec 02 09:58:38 crc kubenswrapper[4781]: I1202 09:58:38.244053 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-cxdx6" Dec 02 09:58:38 crc kubenswrapper[4781]: I1202 09:58:38.320047 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-gbx4b"] Dec 02 09:58:38 crc kubenswrapper[4781]: E1202 09:58:38.320755 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fe7b21a-17d2-432b-9045-e64643581770" containerName="ssh-known-hosts-edpm-deployment" Dec 02 09:58:38 crc kubenswrapper[4781]: I1202 09:58:38.320830 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fe7b21a-17d2-432b-9045-e64643581770" containerName="ssh-known-hosts-edpm-deployment" Dec 02 09:58:38 crc kubenswrapper[4781]: I1202 09:58:38.321113 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fe7b21a-17d2-432b-9045-e64643581770" containerName="ssh-known-hosts-edpm-deployment" Dec 02 09:58:38 crc kubenswrapper[4781]: I1202 09:58:38.321889 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gbx4b" Dec 02 09:58:38 crc kubenswrapper[4781]: I1202 09:58:38.323842 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 09:58:38 crc kubenswrapper[4781]: I1202 09:58:38.324094 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 09:58:38 crc kubenswrapper[4781]: I1202 09:58:38.326230 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 09:58:38 crc kubenswrapper[4781]: I1202 09:58:38.326695 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zl624" Dec 02 09:58:38 crc kubenswrapper[4781]: I1202 09:58:38.328865 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-gbx4b"] Dec 02 09:58:38 crc kubenswrapper[4781]: I1202 09:58:38.374644 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7687840f-e133-4b18-b37c-74664863276b-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gbx4b\" (UID: \"7687840f-e133-4b18-b37c-74664863276b\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gbx4b" Dec 02 09:58:38 crc kubenswrapper[4781]: I1202 09:58:38.374978 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7687840f-e133-4b18-b37c-74664863276b-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gbx4b\" (UID: \"7687840f-e133-4b18-b37c-74664863276b\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gbx4b" Dec 02 09:58:38 crc kubenswrapper[4781]: I1202 09:58:38.375044 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4wlg\" (UniqueName: \"kubernetes.io/projected/7687840f-e133-4b18-b37c-74664863276b-kube-api-access-j4wlg\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gbx4b\" (UID: \"7687840f-e133-4b18-b37c-74664863276b\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gbx4b" Dec 02 09:58:38 crc kubenswrapper[4781]: I1202 09:58:38.476219 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4wlg\" (UniqueName: \"kubernetes.io/projected/7687840f-e133-4b18-b37c-74664863276b-kube-api-access-j4wlg\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gbx4b\" (UID: \"7687840f-e133-4b18-b37c-74664863276b\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gbx4b" Dec 02 09:58:38 crc kubenswrapper[4781]: I1202 09:58:38.476433 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7687840f-e133-4b18-b37c-74664863276b-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gbx4b\" (UID: \"7687840f-e133-4b18-b37c-74664863276b\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gbx4b" Dec 02 09:58:38 crc kubenswrapper[4781]: I1202 09:58:38.476472 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7687840f-e133-4b18-b37c-74664863276b-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gbx4b\" (UID: \"7687840f-e133-4b18-b37c-74664863276b\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gbx4b" Dec 02 09:58:38 crc kubenswrapper[4781]: I1202 09:58:38.481772 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7687840f-e133-4b18-b37c-74664863276b-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gbx4b\" (UID: \"7687840f-e133-4b18-b37c-74664863276b\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gbx4b" Dec 02 09:58:38 crc kubenswrapper[4781]: I1202 09:58:38.482173 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7687840f-e133-4b18-b37c-74664863276b-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gbx4b\" (UID: \"7687840f-e133-4b18-b37c-74664863276b\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gbx4b" Dec 02 09:58:38 crc kubenswrapper[4781]: I1202 09:58:38.494194 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4wlg\" (UniqueName: \"kubernetes.io/projected/7687840f-e133-4b18-b37c-74664863276b-kube-api-access-j4wlg\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gbx4b\" (UID: \"7687840f-e133-4b18-b37c-74664863276b\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gbx4b" Dec 02 09:58:38 crc kubenswrapper[4781]: I1202 09:58:38.647670 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gbx4b" Dec 02 09:58:39 crc kubenswrapper[4781]: I1202 09:58:39.136365 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-gbx4b"] Dec 02 09:58:39 crc kubenswrapper[4781]: I1202 09:58:39.260984 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gbx4b" event={"ID":"7687840f-e133-4b18-b37c-74664863276b","Type":"ContainerStarted","Data":"830d4b45d0a523744904247786ee5129cacddbf13098bed167f559bba9a8a14f"} Dec 02 09:58:40 crc kubenswrapper[4781]: I1202 09:58:40.277690 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gbx4b" event={"ID":"7687840f-e133-4b18-b37c-74664863276b","Type":"ContainerStarted","Data":"17ab28bb5ee2c1207c9d295179817157d9a9c4b30e1b08f6a82271bce9125f7a"} Dec 02 09:58:40 crc kubenswrapper[4781]: I1202 09:58:40.300327 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gbx4b" podStartSLOduration=1.7805418629999998 podStartE2EDuration="2.300306746s" podCreationTimestamp="2025-12-02 09:58:38 +0000 UTC" firstStartedPulling="2025-12-02 09:58:39.143548859 +0000 UTC m=+2281.967422738" lastFinishedPulling="2025-12-02 09:58:39.663313742 +0000 UTC m=+2282.487187621" observedRunningTime="2025-12-02 09:58:40.297370068 +0000 UTC m=+2283.121243947" watchObservedRunningTime="2025-12-02 09:58:40.300306746 +0000 UTC m=+2283.124180625" Dec 02 09:58:44 crc kubenswrapper[4781]: I1202 09:58:44.518602 4781 scope.go:117] "RemoveContainer" containerID="38c79d0e42445913d308d278577a2e36da560fc134a0ce61a7bacd91900b4887" Dec 02 09:58:48 crc kubenswrapper[4781]: I1202 09:58:48.346729 4781 generic.go:334] "Generic (PLEG): container finished" podID="7687840f-e133-4b18-b37c-74664863276b" containerID="17ab28bb5ee2c1207c9d295179817157d9a9c4b30e1b08f6a82271bce9125f7a" exitCode=0 Dec 02 09:58:48 crc kubenswrapper[4781]: I1202 09:58:48.346817 4781 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gbx4b" event={"ID":"7687840f-e133-4b18-b37c-74664863276b","Type":"ContainerDied","Data":"17ab28bb5ee2c1207c9d295179817157d9a9c4b30e1b08f6a82271bce9125f7a"} Dec 02 09:58:49 crc kubenswrapper[4781]: I1202 09:58:49.794247 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gbx4b" Dec 02 09:58:49 crc kubenswrapper[4781]: I1202 09:58:49.897062 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7687840f-e133-4b18-b37c-74664863276b-inventory\") pod \"7687840f-e133-4b18-b37c-74664863276b\" (UID: \"7687840f-e133-4b18-b37c-74664863276b\") " Dec 02 09:58:49 crc kubenswrapper[4781]: I1202 09:58:49.897125 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4wlg\" (UniqueName: \"kubernetes.io/projected/7687840f-e133-4b18-b37c-74664863276b-kube-api-access-j4wlg\") pod \"7687840f-e133-4b18-b37c-74664863276b\" (UID: \"7687840f-e133-4b18-b37c-74664863276b\") " Dec 02 09:58:49 crc kubenswrapper[4781]: I1202 09:58:49.897288 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7687840f-e133-4b18-b37c-74664863276b-ssh-key\") pod \"7687840f-e133-4b18-b37c-74664863276b\" (UID: \"7687840f-e133-4b18-b37c-74664863276b\") " Dec 02 09:58:49 crc kubenswrapper[4781]: I1202 09:58:49.903295 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7687840f-e133-4b18-b37c-74664863276b-kube-api-access-j4wlg" (OuterVolumeSpecName: "kube-api-access-j4wlg") pod "7687840f-e133-4b18-b37c-74664863276b" (UID: "7687840f-e133-4b18-b37c-74664863276b"). InnerVolumeSpecName "kube-api-access-j4wlg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:58:49 crc kubenswrapper[4781]: I1202 09:58:49.928627 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7687840f-e133-4b18-b37c-74664863276b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7687840f-e133-4b18-b37c-74664863276b" (UID: "7687840f-e133-4b18-b37c-74664863276b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:58:49 crc kubenswrapper[4781]: I1202 09:58:49.930336 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7687840f-e133-4b18-b37c-74664863276b-inventory" (OuterVolumeSpecName: "inventory") pod "7687840f-e133-4b18-b37c-74664863276b" (UID: "7687840f-e133-4b18-b37c-74664863276b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:58:50 crc kubenswrapper[4781]: I1202 09:58:50.000796 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7687840f-e133-4b18-b37c-74664863276b-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 09:58:50 crc kubenswrapper[4781]: I1202 09:58:50.000846 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7687840f-e133-4b18-b37c-74664863276b-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 09:58:50 crc kubenswrapper[4781]: I1202 09:58:50.000858 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4wlg\" (UniqueName: \"kubernetes.io/projected/7687840f-e133-4b18-b37c-74664863276b-kube-api-access-j4wlg\") on node \"crc\" DevicePath \"\"" Dec 02 09:58:50 crc kubenswrapper[4781]: I1202 09:58:50.368128 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gbx4b" event={"ID":"7687840f-e133-4b18-b37c-74664863276b","Type":"ContainerDied","Data":"830d4b45d0a523744904247786ee5129cacddbf13098bed167f559bba9a8a14f"} Dec 02 09:58:50 crc kubenswrapper[4781]: I1202 09:58:50.368174 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="830d4b45d0a523744904247786ee5129cacddbf13098bed167f559bba9a8a14f" Dec 02 09:58:50 crc kubenswrapper[4781]: I1202 09:58:50.368177 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gbx4b" Dec 02 09:58:50 crc kubenswrapper[4781]: I1202 09:58:50.445959 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pkdhd"] Dec 02 09:58:50 crc kubenswrapper[4781]: E1202 09:58:50.447397 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7687840f-e133-4b18-b37c-74664863276b" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 02 09:58:50 crc kubenswrapper[4781]: I1202 09:58:50.447422 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="7687840f-e133-4b18-b37c-74664863276b" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 02 09:58:50 crc kubenswrapper[4781]: I1202 09:58:50.447591 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="7687840f-e133-4b18-b37c-74664863276b" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 02 09:58:50 crc kubenswrapper[4781]: I1202 09:58:50.448290 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pkdhd" Dec 02 09:58:50 crc kubenswrapper[4781]: I1202 09:58:50.451385 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 09:58:50 crc kubenswrapper[4781]: I1202 09:58:50.451597 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zl624" Dec 02 09:58:50 crc kubenswrapper[4781]: I1202 09:58:50.451695 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 09:58:50 crc kubenswrapper[4781]: I1202 09:58:50.452117 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 09:58:50 crc kubenswrapper[4781]: I1202 09:58:50.457834 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pkdhd"] Dec 02 09:58:50 crc kubenswrapper[4781]: I1202 09:58:50.611443 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e76bbee2-a10a-45f8-9767-4018dfa3836e-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pkdhd\" (UID: \"e76bbee2-a10a-45f8-9767-4018dfa3836e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pkdhd" Dec 02 09:58:50 crc kubenswrapper[4781]: I1202 09:58:50.611510 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6v46\" (UniqueName: \"kubernetes.io/projected/e76bbee2-a10a-45f8-9767-4018dfa3836e-kube-api-access-n6v46\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pkdhd\" (UID: \"e76bbee2-a10a-45f8-9767-4018dfa3836e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pkdhd" Dec 02 09:58:50 crc kubenswrapper[4781]: I1202 09:58:50.611580 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e76bbee2-a10a-45f8-9767-4018dfa3836e-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pkdhd\" (UID: \"e76bbee2-a10a-45f8-9767-4018dfa3836e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pkdhd" Dec 02 09:58:50 crc kubenswrapper[4781]: I1202 09:58:50.713315 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e76bbee2-a10a-45f8-9767-4018dfa3836e-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pkdhd\" (UID: \"e76bbee2-a10a-45f8-9767-4018dfa3836e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pkdhd" Dec 02 09:58:50 crc kubenswrapper[4781]: I1202 09:58:50.713629 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6v46\" (UniqueName: \"kubernetes.io/projected/e76bbee2-a10a-45f8-9767-4018dfa3836e-kube-api-access-n6v46\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pkdhd\" (UID: \"e76bbee2-a10a-45f8-9767-4018dfa3836e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pkdhd" Dec 02 09:58:50 crc kubenswrapper[4781]: I1202 09:58:50.713683 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e76bbee2-a10a-45f8-9767-4018dfa3836e-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pkdhd\" (UID: 
\"e76bbee2-a10a-45f8-9767-4018dfa3836e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pkdhd" Dec 02 09:58:50 crc kubenswrapper[4781]: I1202 09:58:50.717460 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e76bbee2-a10a-45f8-9767-4018dfa3836e-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pkdhd\" (UID: \"e76bbee2-a10a-45f8-9767-4018dfa3836e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pkdhd" Dec 02 09:58:50 crc kubenswrapper[4781]: I1202 09:58:50.717626 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e76bbee2-a10a-45f8-9767-4018dfa3836e-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pkdhd\" (UID: \"e76bbee2-a10a-45f8-9767-4018dfa3836e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pkdhd" Dec 02 09:58:50 crc kubenswrapper[4781]: I1202 09:58:50.731973 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6v46\" (UniqueName: \"kubernetes.io/projected/e76bbee2-a10a-45f8-9767-4018dfa3836e-kube-api-access-n6v46\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pkdhd\" (UID: \"e76bbee2-a10a-45f8-9767-4018dfa3836e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pkdhd" Dec 02 09:58:50 crc kubenswrapper[4781]: I1202 09:58:50.767092 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pkdhd" Dec 02 09:58:51 crc kubenswrapper[4781]: I1202 09:58:51.248760 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pkdhd"] Dec 02 09:58:51 crc kubenswrapper[4781]: I1202 09:58:51.378621 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pkdhd" event={"ID":"e76bbee2-a10a-45f8-9767-4018dfa3836e","Type":"ContainerStarted","Data":"9719207c880ff3d3a76a57f7243b2fa9900acc1040de14412674fbbf2b746a6e"} Dec 02 09:58:52 crc kubenswrapper[4781]: I1202 09:58:52.389432 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pkdhd" event={"ID":"e76bbee2-a10a-45f8-9767-4018dfa3836e","Type":"ContainerStarted","Data":"13fecc5feed1c18f5285628b8edac4efc0d8c938f444fd788b8769ba914ac441"} Dec 02 09:58:52 crc kubenswrapper[4781]: I1202 09:58:52.412752 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pkdhd" podStartSLOduration=1.565206983 podStartE2EDuration="2.412729191s" podCreationTimestamp="2025-12-02 09:58:50 +0000 UTC" firstStartedPulling="2025-12-02 09:58:51.257857394 +0000 UTC m=+2294.081731273" lastFinishedPulling="2025-12-02 09:58:52.105379602 +0000 UTC m=+2294.929253481" observedRunningTime="2025-12-02 09:58:52.403258728 +0000 UTC m=+2295.227132617" watchObservedRunningTime="2025-12-02 09:58:52.412729191 +0000 UTC m=+2295.236603070" Dec 02 09:59:00 crc kubenswrapper[4781]: I1202 09:59:00.413132 4781 patch_prober.go:28] interesting pod/machine-config-daemon-pzntm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 09:59:00 crc kubenswrapper[4781]: I1202 09:59:00.414365 4781 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 09:59:02 crc kubenswrapper[4781]: I1202 09:59:02.479312 4781 generic.go:334] "Generic (PLEG): container finished" podID="e76bbee2-a10a-45f8-9767-4018dfa3836e" containerID="13fecc5feed1c18f5285628b8edac4efc0d8c938f444fd788b8769ba914ac441" exitCode=0 Dec 02 09:59:02 crc kubenswrapper[4781]: I1202 09:59:02.479485 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pkdhd" event={"ID":"e76bbee2-a10a-45f8-9767-4018dfa3836e","Type":"ContainerDied","Data":"13fecc5feed1c18f5285628b8edac4efc0d8c938f444fd788b8769ba914ac441"} Dec 02 09:59:03 crc kubenswrapper[4781]: I1202 09:59:03.919053 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pkdhd" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.058230 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e76bbee2-a10a-45f8-9767-4018dfa3836e-ssh-key\") pod \"e76bbee2-a10a-45f8-9767-4018dfa3836e\" (UID: \"e76bbee2-a10a-45f8-9767-4018dfa3836e\") " Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.058317 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e76bbee2-a10a-45f8-9767-4018dfa3836e-inventory\") pod \"e76bbee2-a10a-45f8-9767-4018dfa3836e\" (UID: \"e76bbee2-a10a-45f8-9767-4018dfa3836e\") " Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.058419 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6v46\" (UniqueName: \"kubernetes.io/projected/e76bbee2-a10a-45f8-9767-4018dfa3836e-kube-api-access-n6v46\") pod \"e76bbee2-a10a-45f8-9767-4018dfa3836e\" (UID: \"e76bbee2-a10a-45f8-9767-4018dfa3836e\") " Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.067308 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e76bbee2-a10a-45f8-9767-4018dfa3836e-kube-api-access-n6v46" (OuterVolumeSpecName: "kube-api-access-n6v46") pod "e76bbee2-a10a-45f8-9767-4018dfa3836e" (UID: "e76bbee2-a10a-45f8-9767-4018dfa3836e"). InnerVolumeSpecName "kube-api-access-n6v46". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.084331 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e76bbee2-a10a-45f8-9767-4018dfa3836e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e76bbee2-a10a-45f8-9767-4018dfa3836e" (UID: "e76bbee2-a10a-45f8-9767-4018dfa3836e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.084595 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e76bbee2-a10a-45f8-9767-4018dfa3836e-inventory" (OuterVolumeSpecName: "inventory") pod "e76bbee2-a10a-45f8-9767-4018dfa3836e" (UID: "e76bbee2-a10a-45f8-9767-4018dfa3836e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.160458 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e76bbee2-a10a-45f8-9767-4018dfa3836e-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.160493 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e76bbee2-a10a-45f8-9767-4018dfa3836e-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.160506 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6v46\" (UniqueName: \"kubernetes.io/projected/e76bbee2-a10a-45f8-9767-4018dfa3836e-kube-api-access-n6v46\") on node \"crc\" DevicePath \"\"" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.503520 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pkdhd" event={"ID":"e76bbee2-a10a-45f8-9767-4018dfa3836e","Type":"ContainerDied","Data":"9719207c880ff3d3a76a57f7243b2fa9900acc1040de14412674fbbf2b746a6e"} Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.504584 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9719207c880ff3d3a76a57f7243b2fa9900acc1040de14412674fbbf2b746a6e" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.503578 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pkdhd" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.579824 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j"] Dec 02 09:59:04 crc kubenswrapper[4781]: E1202 09:59:04.580326 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e76bbee2-a10a-45f8-9767-4018dfa3836e" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.580351 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="e76bbee2-a10a-45f8-9767-4018dfa3836e" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.580593 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="e76bbee2-a10a-45f8-9767-4018dfa3836e" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.581375 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.584160 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zl624" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.584226 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.584995 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.585339 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.585469 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.585572 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.586021 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.587701 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.598810 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j"] Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.669369 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.669618 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.669726 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.669840 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-libvirt-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.669982 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.670162 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.670239 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.670458 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzwjz\" (UniqueName: \"kubernetes.io/projected/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-kube-api-access-fzwjz\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.670542 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.670659 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.670697 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.670737 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.670833 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.671062 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.773196 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.773251 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.773277 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.773303 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.773340 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" 
(UniqueName: \"kubernetes.io/projected/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.773382 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.773399 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.773425 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.773451 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.773487 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzwjz\" (UniqueName: \"kubernetes.io/projected/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-kube-api-access-fzwjz\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.773511 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.773542 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.773560 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.773582 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.779191 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.779707 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.779786 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.780020 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.780196 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.780812 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.780969 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.780978 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.781710 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.782052 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.782401 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.783960 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.784515 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" Dec 02 09:59:04 crc 
kubenswrapper[4781]: I1202 09:59:04.795543 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzwjz\" (UniqueName: \"kubernetes.io/projected/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-kube-api-access-fzwjz\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" Dec 02 09:59:04 crc kubenswrapper[4781]: I1202 09:59:04.898816 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" Dec 02 09:59:05 crc kubenswrapper[4781]: I1202 09:59:05.485503 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j"] Dec 02 09:59:05 crc kubenswrapper[4781]: I1202 09:59:05.518078 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" event={"ID":"82e18c11-6a85-45d3-8794-c7d7d02aaa2d","Type":"ContainerStarted","Data":"d592a0696380285d46c95db8cc700baa90cebcf5f0369c24e0afe812d9b8cdfb"} Dec 02 09:59:07 crc kubenswrapper[4781]: I1202 09:59:07.533020 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" event={"ID":"82e18c11-6a85-45d3-8794-c7d7d02aaa2d","Type":"ContainerStarted","Data":"b90ecb124361d79df5553c3d93aff48e90fcdae20cbc6f73503561fad13ea458"} Dec 02 09:59:07 crc kubenswrapper[4781]: I1202 09:59:07.562268 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" podStartSLOduration=2.960466744 podStartE2EDuration="3.562244018s" podCreationTimestamp="2025-12-02 09:59:04 +0000 UTC" firstStartedPulling="2025-12-02 09:59:05.495766132 +0000 UTC m=+2308.319640011" lastFinishedPulling="2025-12-02 09:59:06.097543396 +0000 UTC m=+2308.921417285" observedRunningTime="2025-12-02 09:59:07.55855675 +0000 UTC m=+2310.382430619" watchObservedRunningTime="2025-12-02 09:59:07.562244018 +0000 UTC m=+2310.386117897" Dec 02 09:59:30 crc kubenswrapper[4781]: I1202 09:59:30.411788 4781 patch_prober.go:28] interesting pod/machine-config-daemon-pzntm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 09:59:30 crc kubenswrapper[4781]: I1202 09:59:30.412335 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 09:59:30 crc kubenswrapper[4781]: I1202 09:59:30.412383 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" Dec 02 09:59:30 crc kubenswrapper[4781]: I1202 09:59:30.413225 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"910ba3e07eab53497d7747baa03c41ace2b0ce4a0abe46eb2c4aa513ae74972d"} pod="openshift-machine-config-operator/machine-config-daemon-pzntm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" 
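Note on the probe failure just logged: the kubelet's HTTP liveness check got "connection refused" on http://127.0.0.1:8798/health, which marks machine-config-daemon unhealthy and drives the "Killing container with a grace period" (gracePeriod=600) entry that follows. Below is a minimal Go sketch of such a check, assuming nothing beyond the URL taken from the log line; the file, function, and parameter names are illustrative, not kubelet source.

// probecheck.go: a hedged sketch of an HTTP liveness check, mirroring
// the semantics the prober reports above (transport error or bad status
// means failure). Not kubelet code.
package main

import (
	"fmt"
	"net/http"
	"time"
)

// probe performs one HTTP GET. Any transport error, such as
// "dial tcp 127.0.0.1:8798: connect: connection refused" when nothing
// is listening, or any status outside 200-399, counts as a failure.
func probe(url string, timeout time.Duration) error {
	client := &http.Client{Timeout: timeout}
	resp, err := client.Get(url)
	if err != nil {
		return fmt.Errorf("probe failed: %w", err)
	}
	defer resp.Body.Close()
	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
		return fmt.Errorf("probe failed: unexpected status %d", resp.StatusCode)
	}
	return nil
}

func main() {
	// Same endpoint the kubelet probed for machine-config-daemon.
	if err := probe("http://127.0.0.1:8798/health", 1*time.Second); err != nil {
		fmt.Println(err) // repeated failures are what trigger the restart
	} else {
		fmt.Println("probe succeeded")
	}
}

Run against a port with no listener, this prints essentially the same dial error recorded in the Liveness probe status line above.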
Dec 02 09:59:30 crc kubenswrapper[4781]: I1202 09:59:30.413291 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" containerID="cri-o://910ba3e07eab53497d7747baa03c41ace2b0ce4a0abe46eb2c4aa513ae74972d" gracePeriod=600 Dec 02 09:59:30 crc kubenswrapper[4781]: E1202 09:59:30.682948 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 09:59:30 crc kubenswrapper[4781]: I1202 09:59:30.724223 4781 generic.go:334] "Generic (PLEG): container finished" podID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerID="910ba3e07eab53497d7747baa03c41ace2b0ce4a0abe46eb2c4aa513ae74972d" exitCode=0 Dec 02 09:59:30 crc kubenswrapper[4781]: I1202 09:59:30.724297 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" event={"ID":"e10258da-dad3-4df8-82c2-9d9438493a3d","Type":"ContainerDied","Data":"910ba3e07eab53497d7747baa03c41ace2b0ce4a0abe46eb2c4aa513ae74972d"} Dec 02 09:59:30 crc kubenswrapper[4781]: I1202 09:59:30.724386 4781 scope.go:117] "RemoveContainer" containerID="05bb140e8bed83dcb63b9b78720e5b065dcdd65808d271a4b06533687d8cc9f3" Dec 02 09:59:30 crc kubenswrapper[4781]: I1202 09:59:30.725200 4781 scope.go:117] "RemoveContainer" containerID="910ba3e07eab53497d7747baa03c41ace2b0ce4a0abe46eb2c4aa513ae74972d" Dec 02 09:59:30 crc kubenswrapper[4781]: E1202 09:59:30.725486 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 09:59:43 crc kubenswrapper[4781]: I1202 09:59:43.500134 4781 scope.go:117] "RemoveContainer" containerID="910ba3e07eab53497d7747baa03c41ace2b0ce4a0abe46eb2c4aa513ae74972d" Dec 02 09:59:43 crc kubenswrapper[4781]: E1202 09:59:43.501017 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 09:59:44 crc kubenswrapper[4781]: I1202 09:59:44.844832 4781 generic.go:334] "Generic (PLEG): container finished" podID="82e18c11-6a85-45d3-8794-c7d7d02aaa2d" containerID="b90ecb124361d79df5553c3d93aff48e90fcdae20cbc6f73503561fad13ea458" exitCode=0 Dec 02 09:59:44 crc kubenswrapper[4781]: I1202 09:59:44.845030 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" 
event={"ID":"82e18c11-6a85-45d3-8794-c7d7d02aaa2d","Type":"ContainerDied","Data":"b90ecb124361d79df5553c3d93aff48e90fcdae20cbc6f73503561fad13ea458"} Dec 02 09:59:46 crc kubenswrapper[4781]: I1202 09:59:46.256279 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" Dec 02 09:59:46 crc kubenswrapper[4781]: I1202 09:59:46.399740 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " Dec 02 09:59:46 crc kubenswrapper[4781]: I1202 09:59:46.399804 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-bootstrap-combined-ca-bundle\") pod \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " Dec 02 09:59:46 crc kubenswrapper[4781]: I1202 09:59:46.399891 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " Dec 02 09:59:46 crc kubenswrapper[4781]: I1202 09:59:46.399954 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-openstack-edpm-ipam-ovn-default-certs-0\") pod \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " Dec 02 09:59:46 crc kubenswrapper[4781]: I1202 09:59:46.399990 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-libvirt-combined-ca-bundle\") pod \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " Dec 02 09:59:46 crc kubenswrapper[4781]: I1202 09:59:46.400052 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-telemetry-combined-ca-bundle\") pod \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " Dec 02 09:59:46 crc kubenswrapper[4781]: I1202 09:59:46.400081 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-ssh-key\") pod \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " Dec 02 09:59:46 crc kubenswrapper[4781]: I1202 09:59:46.400107 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzwjz\" (UniqueName: \"kubernetes.io/projected/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-kube-api-access-fzwjz\") pod \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " Dec 02 09:59:46 crc kubenswrapper[4781]: I1202 09:59:46.400141 4781 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-nova-combined-ca-bundle\") pod \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " Dec 02 09:59:46 crc kubenswrapper[4781]: I1202 09:59:46.400168 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-ovn-combined-ca-bundle\") pod \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " Dec 02 09:59:46 crc kubenswrapper[4781]: I1202 09:59:46.400201 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-inventory\") pod \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " Dec 02 09:59:46 crc kubenswrapper[4781]: I1202 09:59:46.400240 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-neutron-metadata-combined-ca-bundle\") pod \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " Dec 02 09:59:46 crc kubenswrapper[4781]: I1202 09:59:46.400289 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " Dec 02 09:59:46 crc kubenswrapper[4781]: I1202 09:59:46.400342 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-repo-setup-combined-ca-bundle\") pod \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\" (UID: \"82e18c11-6a85-45d3-8794-c7d7d02aaa2d\") " Dec 02 09:59:46 crc kubenswrapper[4781]: I1202 09:59:46.408407 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "82e18c11-6a85-45d3-8794-c7d7d02aaa2d" (UID: "82e18c11-6a85-45d3-8794-c7d7d02aaa2d"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:59:46 crc kubenswrapper[4781]: I1202 09:59:46.409437 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "82e18c11-6a85-45d3-8794-c7d7d02aaa2d" (UID: "82e18c11-6a85-45d3-8794-c7d7d02aaa2d"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:59:46 crc kubenswrapper[4781]: I1202 09:59:46.409613 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "82e18c11-6a85-45d3-8794-c7d7d02aaa2d" (UID: "82e18c11-6a85-45d3-8794-c7d7d02aaa2d"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:59:46 crc kubenswrapper[4781]: I1202 09:59:46.410366 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "82e18c11-6a85-45d3-8794-c7d7d02aaa2d" (UID: "82e18c11-6a85-45d3-8794-c7d7d02aaa2d"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:59:46 crc kubenswrapper[4781]: I1202 09:59:46.410413 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "82e18c11-6a85-45d3-8794-c7d7d02aaa2d" (UID: "82e18c11-6a85-45d3-8794-c7d7d02aaa2d"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:59:46 crc kubenswrapper[4781]: I1202 09:59:46.410586 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-kube-api-access-fzwjz" (OuterVolumeSpecName: "kube-api-access-fzwjz") pod "82e18c11-6a85-45d3-8794-c7d7d02aaa2d" (UID: "82e18c11-6a85-45d3-8794-c7d7d02aaa2d"). InnerVolumeSpecName "kube-api-access-fzwjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:59:46 crc kubenswrapper[4781]: I1202 09:59:46.410698 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "82e18c11-6a85-45d3-8794-c7d7d02aaa2d" (UID: "82e18c11-6a85-45d3-8794-c7d7d02aaa2d"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:59:46 crc kubenswrapper[4781]: I1202 09:59:46.411898 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "82e18c11-6a85-45d3-8794-c7d7d02aaa2d" (UID: "82e18c11-6a85-45d3-8794-c7d7d02aaa2d"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:59:46 crc kubenswrapper[4781]: I1202 09:59:46.412504 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "82e18c11-6a85-45d3-8794-c7d7d02aaa2d" (UID: "82e18c11-6a85-45d3-8794-c7d7d02aaa2d"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:59:46 crc kubenswrapper[4781]: I1202 09:59:46.415438 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "82e18c11-6a85-45d3-8794-c7d7d02aaa2d" (UID: "82e18c11-6a85-45d3-8794-c7d7d02aaa2d"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:59:46 crc kubenswrapper[4781]: I1202 09:59:46.422024 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "82e18c11-6a85-45d3-8794-c7d7d02aaa2d" (UID: "82e18c11-6a85-45d3-8794-c7d7d02aaa2d"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:59:46 crc kubenswrapper[4781]: I1202 09:59:46.427951 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "82e18c11-6a85-45d3-8794-c7d7d02aaa2d" (UID: "82e18c11-6a85-45d3-8794-c7d7d02aaa2d"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:59:46 crc kubenswrapper[4781]: I1202 09:59:46.437509 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-inventory" (OuterVolumeSpecName: "inventory") pod "82e18c11-6a85-45d3-8794-c7d7d02aaa2d" (UID: "82e18c11-6a85-45d3-8794-c7d7d02aaa2d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:59:46 crc kubenswrapper[4781]: I1202 09:59:46.438850 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "82e18c11-6a85-45d3-8794-c7d7d02aaa2d" (UID: "82e18c11-6a85-45d3-8794-c7d7d02aaa2d"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:59:46 crc kubenswrapper[4781]: I1202 09:59:46.503362 4781 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 02 09:59:46 crc kubenswrapper[4781]: I1202 09:59:46.504279 4781 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:59:46 crc kubenswrapper[4781]: I1202 09:59:46.504328 4781 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 02 09:59:46 crc kubenswrapper[4781]: I1202 09:59:46.504340 4781 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:59:46 crc kubenswrapper[4781]: I1202 09:59:46.504352 4781 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 02 09:59:46 crc kubenswrapper[4781]: I1202 09:59:46.504367 4781 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 02 09:59:46 crc kubenswrapper[4781]: I1202 09:59:46.504377 4781 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:59:46 crc kubenswrapper[4781]: I1202 09:59:46.504390 4781 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:59:46 crc kubenswrapper[4781]: I1202 09:59:46.504399 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 09:59:46 crc kubenswrapper[4781]: I1202 09:59:46.504408 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzwjz\" (UniqueName: \"kubernetes.io/projected/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-kube-api-access-fzwjz\") on node \"crc\" DevicePath \"\"" Dec 02 09:59:46 crc kubenswrapper[4781]: I1202 09:59:46.504417 4781 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:59:46 crc kubenswrapper[4781]: I1202 09:59:46.504428 4781 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:59:46 crc kubenswrapper[4781]: I1202 09:59:46.504436 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 09:59:46 crc kubenswrapper[4781]: I1202 09:59:46.504451 4781 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e18c11-6a85-45d3-8794-c7d7d02aaa2d-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:59:46 crc kubenswrapper[4781]: I1202 09:59:46.864551 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" event={"ID":"82e18c11-6a85-45d3-8794-c7d7d02aaa2d","Type":"ContainerDied","Data":"d592a0696380285d46c95db8cc700baa90cebcf5f0369c24e0afe812d9b8cdfb"} Dec 02 09:59:46 crc kubenswrapper[4781]: I1202 09:59:46.864594 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d592a0696380285d46c95db8cc700baa90cebcf5f0369c24e0afe812d9b8cdfb" Dec 02 09:59:46 crc kubenswrapper[4781]: I1202 09:59:46.864972 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j" Dec 02 09:59:47 crc kubenswrapper[4781]: I1202 09:59:47.020562 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-nf4nh"] Dec 02 09:59:47 crc kubenswrapper[4781]: E1202 09:59:47.021012 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82e18c11-6a85-45d3-8794-c7d7d02aaa2d" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 02 09:59:47 crc kubenswrapper[4781]: I1202 09:59:47.021033 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="82e18c11-6a85-45d3-8794-c7d7d02aaa2d" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 02 09:59:47 crc kubenswrapper[4781]: I1202 09:59:47.021267 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="82e18c11-6a85-45d3-8794-c7d7d02aaa2d" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 02 09:59:47 crc kubenswrapper[4781]: I1202 09:59:47.021889 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nf4nh" Dec 02 09:59:47 crc kubenswrapper[4781]: I1202 09:59:47.024886 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 09:59:47 crc kubenswrapper[4781]: I1202 09:59:47.025235 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 09:59:47 crc kubenswrapper[4781]: I1202 09:59:47.025315 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 09:59:47 crc kubenswrapper[4781]: I1202 09:59:47.025361 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 02 09:59:47 crc kubenswrapper[4781]: I1202 09:59:47.025448 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zl624" Dec 02 09:59:47 crc kubenswrapper[4781]: I1202 09:59:47.035485 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-nf4nh"] Dec 02 09:59:47 crc kubenswrapper[4781]: I1202 09:59:47.216988 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/809035c6-50b2-4492-898e-1f2917e62a5c-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nf4nh\" (UID: \"809035c6-50b2-4492-898e-1f2917e62a5c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nf4nh" Dec 02 09:59:47 crc kubenswrapper[4781]: I1202 09:59:47.217763 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h567h\" (UniqueName: \"kubernetes.io/projected/809035c6-50b2-4492-898e-1f2917e62a5c-kube-api-access-h567h\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nf4nh\" (UID: \"809035c6-50b2-4492-898e-1f2917e62a5c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nf4nh" Dec 02 09:59:47 crc kubenswrapper[4781]: I1202 09:59:47.218202 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/809035c6-50b2-4492-898e-1f2917e62a5c-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nf4nh\" (UID: \"809035c6-50b2-4492-898e-1f2917e62a5c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nf4nh" Dec 02 09:59:47 crc kubenswrapper[4781]: I1202 09:59:47.218307 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/809035c6-50b2-4492-898e-1f2917e62a5c-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nf4nh\" (UID: \"809035c6-50b2-4492-898e-1f2917e62a5c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nf4nh" Dec 02 09:59:47 crc kubenswrapper[4781]: I1202 09:59:47.218452 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/809035c6-50b2-4492-898e-1f2917e62a5c-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nf4nh\" (UID: \"809035c6-50b2-4492-898e-1f2917e62a5c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nf4nh" Dec 02 09:59:47 crc kubenswrapper[4781]: I1202 09:59:47.320322 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/809035c6-50b2-4492-898e-1f2917e62a5c-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nf4nh\" (UID: \"809035c6-50b2-4492-898e-1f2917e62a5c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nf4nh" Dec 02 09:59:47 crc kubenswrapper[4781]: I1202 09:59:47.320413 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/809035c6-50b2-4492-898e-1f2917e62a5c-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nf4nh\" (UID: \"809035c6-50b2-4492-898e-1f2917e62a5c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nf4nh" Dec 02 09:59:47 crc kubenswrapper[4781]: I1202 09:59:47.320840 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/809035c6-50b2-4492-898e-1f2917e62a5c-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nf4nh\" (UID: \"809035c6-50b2-4492-898e-1f2917e62a5c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nf4nh" Dec 02 09:59:47 crc kubenswrapper[4781]: I1202 09:59:47.321067 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/809035c6-50b2-4492-898e-1f2917e62a5c-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nf4nh\" (UID: \"809035c6-50b2-4492-898e-1f2917e62a5c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nf4nh" Dec 02 09:59:47 crc kubenswrapper[4781]: I1202 09:59:47.321283 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h567h\" (UniqueName: \"kubernetes.io/projected/809035c6-50b2-4492-898e-1f2917e62a5c-kube-api-access-h567h\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nf4nh\" (UID: \"809035c6-50b2-4492-898e-1f2917e62a5c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nf4nh" Dec 02 09:59:47 crc kubenswrapper[4781]: I1202 09:59:47.321604 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/809035c6-50b2-4492-898e-1f2917e62a5c-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nf4nh\" (UID: \"809035c6-50b2-4492-898e-1f2917e62a5c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nf4nh" Dec 02 09:59:47 crc kubenswrapper[4781]: I1202 09:59:47.325113 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/809035c6-50b2-4492-898e-1f2917e62a5c-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nf4nh\" (UID: \"809035c6-50b2-4492-898e-1f2917e62a5c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nf4nh" Dec 02 09:59:47 crc kubenswrapper[4781]: I1202 09:59:47.325208 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/809035c6-50b2-4492-898e-1f2917e62a5c-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nf4nh\" (UID: \"809035c6-50b2-4492-898e-1f2917e62a5c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nf4nh" Dec 02 09:59:47 crc kubenswrapper[4781]: I1202 09:59:47.326272 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/809035c6-50b2-4492-898e-1f2917e62a5c-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nf4nh\" (UID: \"809035c6-50b2-4492-898e-1f2917e62a5c\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nf4nh" Dec 02 09:59:47 crc kubenswrapper[4781]: I1202 09:59:47.341263 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h567h\" (UniqueName: \"kubernetes.io/projected/809035c6-50b2-4492-898e-1f2917e62a5c-kube-api-access-h567h\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nf4nh\" (UID: \"809035c6-50b2-4492-898e-1f2917e62a5c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nf4nh" Dec 02 09:59:47 crc kubenswrapper[4781]: I1202 09:59:47.638621 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nf4nh" Dec 02 09:59:48 crc kubenswrapper[4781]: I1202 09:59:48.240598 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-nf4nh"] Dec 02 09:59:48 crc kubenswrapper[4781]: I1202 09:59:48.890343 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nf4nh" event={"ID":"809035c6-50b2-4492-898e-1f2917e62a5c","Type":"ContainerStarted","Data":"e2423a676c8c099e6bfa9a507cdafab6c698237d6da6cc6384de45bb4aa41f29"} Dec 02 09:59:49 crc kubenswrapper[4781]: I1202 09:59:49.900917 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nf4nh" event={"ID":"809035c6-50b2-4492-898e-1f2917e62a5c","Type":"ContainerStarted","Data":"717276523da092da9052ad37700b45bb0c06f1867d1c0ba21c7498e3f5c4012c"} Dec 02 09:59:49 crc kubenswrapper[4781]: I1202 09:59:49.925648 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nf4nh" podStartSLOduration=2.282176578 podStartE2EDuration="2.925630046s" podCreationTimestamp="2025-12-02 09:59:47 +0000 UTC" firstStartedPulling="2025-12-02 09:59:48.246200277 +0000 UTC m=+2351.070074156" lastFinishedPulling="2025-12-02 09:59:48.889653745 +0000 UTC m=+2351.713527624" observedRunningTime="2025-12-02 09:59:49.921065403 +0000 UTC m=+2352.744939292" watchObservedRunningTime="2025-12-02 09:59:49.925630046 +0000 UTC m=+2352.749503925" Dec 02 09:59:55 crc kubenswrapper[4781]: I1202 09:59:55.499877 4781 scope.go:117] "RemoveContainer" containerID="910ba3e07eab53497d7747baa03c41ace2b0ce4a0abe46eb2c4aa513ae74972d" Dec 02 09:59:55 crc kubenswrapper[4781]: E1202 09:59:55.501810 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:00:00 crc kubenswrapper[4781]: I1202 10:00:00.140074 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411160-f4vqc"] Dec 02 10:00:00 crc kubenswrapper[4781]: I1202 10:00:00.142970 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411160-f4vqc" Dec 02 10:00:00 crc kubenswrapper[4781]: I1202 10:00:00.145245 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 10:00:00 crc kubenswrapper[4781]: I1202 10:00:00.145732 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 10:00:00 crc kubenswrapper[4781]: I1202 10:00:00.149505 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411160-f4vqc"] Dec 02 10:00:00 crc kubenswrapper[4781]: I1202 10:00:00.182723 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/745379ef-4ee9-48a9-bf62-f2e381441ba0-secret-volume\") pod \"collect-profiles-29411160-f4vqc\" (UID: \"745379ef-4ee9-48a9-bf62-f2e381441ba0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411160-f4vqc" Dec 02 10:00:00 crc kubenswrapper[4781]: I1202 10:00:00.182861 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2l8m\" (UniqueName: \"kubernetes.io/projected/745379ef-4ee9-48a9-bf62-f2e381441ba0-kube-api-access-c2l8m\") pod \"collect-profiles-29411160-f4vqc\" (UID: \"745379ef-4ee9-48a9-bf62-f2e381441ba0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411160-f4vqc" Dec 02 10:00:00 crc kubenswrapper[4781]: I1202 10:00:00.182993 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/745379ef-4ee9-48a9-bf62-f2e381441ba0-config-volume\") pod \"collect-profiles-29411160-f4vqc\" (UID: \"745379ef-4ee9-48a9-bf62-f2e381441ba0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411160-f4vqc" Dec 02 10:00:00 crc kubenswrapper[4781]: I1202 10:00:00.285023 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/745379ef-4ee9-48a9-bf62-f2e381441ba0-secret-volume\") pod \"collect-profiles-29411160-f4vqc\" (UID: \"745379ef-4ee9-48a9-bf62-f2e381441ba0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411160-f4vqc" Dec 02 10:00:00 crc kubenswrapper[4781]: I1202 10:00:00.285111 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2l8m\" (UniqueName: \"kubernetes.io/projected/745379ef-4ee9-48a9-bf62-f2e381441ba0-kube-api-access-c2l8m\") pod \"collect-profiles-29411160-f4vqc\" (UID: \"745379ef-4ee9-48a9-bf62-f2e381441ba0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411160-f4vqc" Dec 02 10:00:00 crc kubenswrapper[4781]: I1202 10:00:00.285199 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/745379ef-4ee9-48a9-bf62-f2e381441ba0-config-volume\") pod \"collect-profiles-29411160-f4vqc\" (UID: \"745379ef-4ee9-48a9-bf62-f2e381441ba0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411160-f4vqc" Dec 02 10:00:00 crc kubenswrapper[4781]: I1202 10:00:00.286032 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/745379ef-4ee9-48a9-bf62-f2e381441ba0-config-volume\") pod 
\"collect-profiles-29411160-f4vqc\" (UID: \"745379ef-4ee9-48a9-bf62-f2e381441ba0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411160-f4vqc" Dec 02 10:00:00 crc kubenswrapper[4781]: I1202 10:00:00.292080 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/745379ef-4ee9-48a9-bf62-f2e381441ba0-secret-volume\") pod \"collect-profiles-29411160-f4vqc\" (UID: \"745379ef-4ee9-48a9-bf62-f2e381441ba0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411160-f4vqc" Dec 02 10:00:00 crc kubenswrapper[4781]: I1202 10:00:00.304015 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2l8m\" (UniqueName: \"kubernetes.io/projected/745379ef-4ee9-48a9-bf62-f2e381441ba0-kube-api-access-c2l8m\") pod \"collect-profiles-29411160-f4vqc\" (UID: \"745379ef-4ee9-48a9-bf62-f2e381441ba0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411160-f4vqc" Dec 02 10:00:00 crc kubenswrapper[4781]: I1202 10:00:00.473313 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411160-f4vqc" Dec 02 10:00:00 crc kubenswrapper[4781]: I1202 10:00:00.888422 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411160-f4vqc"] Dec 02 10:00:01 crc kubenswrapper[4781]: I1202 10:00:01.006143 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411160-f4vqc" event={"ID":"745379ef-4ee9-48a9-bf62-f2e381441ba0","Type":"ContainerStarted","Data":"d24ac52b8dbda1cb4746774986fdf7187d3c3faaffe0662f2976651c3317d74a"} Dec 02 10:00:02 crc kubenswrapper[4781]: I1202 10:00:02.015804 4781 generic.go:334] "Generic (PLEG): container finished" podID="745379ef-4ee9-48a9-bf62-f2e381441ba0" containerID="e4868290bed0c5dc8e19a3c92c4da50f527c1352e38b2b7746ed4b24943ff3f0" exitCode=0 Dec 02 10:00:02 crc kubenswrapper[4781]: I1202 10:00:02.015874 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411160-f4vqc" event={"ID":"745379ef-4ee9-48a9-bf62-f2e381441ba0","Type":"ContainerDied","Data":"e4868290bed0c5dc8e19a3c92c4da50f527c1352e38b2b7746ed4b24943ff3f0"} Dec 02 10:00:03 crc kubenswrapper[4781]: I1202 10:00:03.382053 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411160-f4vqc" Dec 02 10:00:03 crc kubenswrapper[4781]: I1202 10:00:03.444550 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/745379ef-4ee9-48a9-bf62-f2e381441ba0-config-volume\") pod \"745379ef-4ee9-48a9-bf62-f2e381441ba0\" (UID: \"745379ef-4ee9-48a9-bf62-f2e381441ba0\") " Dec 02 10:00:03 crc kubenswrapper[4781]: I1202 10:00:03.444604 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2l8m\" (UniqueName: \"kubernetes.io/projected/745379ef-4ee9-48a9-bf62-f2e381441ba0-kube-api-access-c2l8m\") pod \"745379ef-4ee9-48a9-bf62-f2e381441ba0\" (UID: \"745379ef-4ee9-48a9-bf62-f2e381441ba0\") " Dec 02 10:00:03 crc kubenswrapper[4781]: I1202 10:00:03.444643 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/745379ef-4ee9-48a9-bf62-f2e381441ba0-secret-volume\") pod \"745379ef-4ee9-48a9-bf62-f2e381441ba0\" (UID: \"745379ef-4ee9-48a9-bf62-f2e381441ba0\") " Dec 02 10:00:03 crc kubenswrapper[4781]: I1202 10:00:03.445418 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/745379ef-4ee9-48a9-bf62-f2e381441ba0-config-volume" (OuterVolumeSpecName: "config-volume") pod "745379ef-4ee9-48a9-bf62-f2e381441ba0" (UID: "745379ef-4ee9-48a9-bf62-f2e381441ba0"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 10:00:03 crc kubenswrapper[4781]: I1202 10:00:03.452153 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/745379ef-4ee9-48a9-bf62-f2e381441ba0-kube-api-access-c2l8m" (OuterVolumeSpecName: "kube-api-access-c2l8m") pod "745379ef-4ee9-48a9-bf62-f2e381441ba0" (UID: "745379ef-4ee9-48a9-bf62-f2e381441ba0"). InnerVolumeSpecName "kube-api-access-c2l8m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 10:00:03 crc kubenswrapper[4781]: I1202 10:00:03.454261 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/745379ef-4ee9-48a9-bf62-f2e381441ba0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "745379ef-4ee9-48a9-bf62-f2e381441ba0" (UID: "745379ef-4ee9-48a9-bf62-f2e381441ba0"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 10:00:03 crc kubenswrapper[4781]: I1202 10:00:03.547363 4781 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/745379ef-4ee9-48a9-bf62-f2e381441ba0-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 10:00:03 crc kubenswrapper[4781]: I1202 10:00:03.552851 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2l8m\" (UniqueName: \"kubernetes.io/projected/745379ef-4ee9-48a9-bf62-f2e381441ba0-kube-api-access-c2l8m\") on node \"crc\" DevicePath \"\"" Dec 02 10:00:03 crc kubenswrapper[4781]: I1202 10:00:03.553113 4781 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/745379ef-4ee9-48a9-bf62-f2e381441ba0-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 02 10:00:04 crc kubenswrapper[4781]: I1202 10:00:04.034833 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411160-f4vqc" event={"ID":"745379ef-4ee9-48a9-bf62-f2e381441ba0","Type":"ContainerDied","Data":"d24ac52b8dbda1cb4746774986fdf7187d3c3faaffe0662f2976651c3317d74a"} Dec 02 10:00:04 crc kubenswrapper[4781]: I1202 10:00:04.034879 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d24ac52b8dbda1cb4746774986fdf7187d3c3faaffe0662f2976651c3317d74a" Dec 02 10:00:04 crc kubenswrapper[4781]: I1202 10:00:04.035236 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411160-f4vqc" Dec 02 10:00:04 crc kubenswrapper[4781]: I1202 10:00:04.457030 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411115-gzjfg"] Dec 02 10:00:04 crc kubenswrapper[4781]: I1202 10:00:04.464595 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411115-gzjfg"] Dec 02 10:00:05 crc kubenswrapper[4781]: I1202 10:00:05.511532 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70c23dd9-b165-4f35-9230-da18a16f48be" path="/var/lib/kubelet/pods/70c23dd9-b165-4f35-9230-da18a16f48be/volumes" Dec 02 10:00:09 crc kubenswrapper[4781]: I1202 10:00:09.499267 4781 scope.go:117] "RemoveContainer" containerID="910ba3e07eab53497d7747baa03c41ace2b0ce4a0abe46eb2c4aa513ae74972d" Dec 02 10:00:09 crc kubenswrapper[4781]: E1202 10:00:09.501097 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:00:21 crc kubenswrapper[4781]: I1202 10:00:21.500396 4781 scope.go:117] "RemoveContainer" containerID="910ba3e07eab53497d7747baa03c41ace2b0ce4a0abe46eb2c4aa513ae74972d" Dec 02 10:00:21 crc kubenswrapper[4781]: E1202 10:00:21.501226 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:00:36 crc kubenswrapper[4781]: I1202 10:00:36.500504 4781 scope.go:117] "RemoveContainer" containerID="910ba3e07eab53497d7747baa03c41ace2b0ce4a0abe46eb2c4aa513ae74972d" Dec 02 10:00:36 crc kubenswrapper[4781]: E1202 10:00:36.501301 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:00:44 crc kubenswrapper[4781]: I1202 10:00:44.703789 4781 scope.go:117] "RemoveContainer" containerID="ce3239c217f4a3357f252e1204730b8c33889a9b2c8970a6d82b0804b93be9de" Dec 02 10:00:47 crc kubenswrapper[4781]: I1202 10:00:47.502654 4781 scope.go:117] "RemoveContainer" containerID="910ba3e07eab53497d7747baa03c41ace2b0ce4a0abe46eb2c4aa513ae74972d" Dec 02 10:00:47 crc kubenswrapper[4781]: E1202 10:00:47.503524 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:00:52 crc kubenswrapper[4781]: I1202 10:00:52.437077 4781 generic.go:334] "Generic (PLEG): container finished" podID="809035c6-50b2-4492-898e-1f2917e62a5c" containerID="717276523da092da9052ad37700b45bb0c06f1867d1c0ba21c7498e3f5c4012c" exitCode=0 Dec 02 10:00:52 crc kubenswrapper[4781]: I1202 10:00:52.437131 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nf4nh" event={"ID":"809035c6-50b2-4492-898e-1f2917e62a5c","Type":"ContainerDied","Data":"717276523da092da9052ad37700b45bb0c06f1867d1c0ba21c7498e3f5c4012c"} Dec 02 10:00:53 crc kubenswrapper[4781]: I1202 10:00:53.846617 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nf4nh" Dec 02 10:00:53 crc kubenswrapper[4781]: I1202 10:00:53.955181 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h567h\" (UniqueName: \"kubernetes.io/projected/809035c6-50b2-4492-898e-1f2917e62a5c-kube-api-access-h567h\") pod \"809035c6-50b2-4492-898e-1f2917e62a5c\" (UID: \"809035c6-50b2-4492-898e-1f2917e62a5c\") " Dec 02 10:00:53 crc kubenswrapper[4781]: I1202 10:00:53.955241 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/809035c6-50b2-4492-898e-1f2917e62a5c-inventory\") pod \"809035c6-50b2-4492-898e-1f2917e62a5c\" (UID: \"809035c6-50b2-4492-898e-1f2917e62a5c\") " Dec 02 10:00:53 crc kubenswrapper[4781]: I1202 10:00:53.955287 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/809035c6-50b2-4492-898e-1f2917e62a5c-ovncontroller-config-0\") pod \"809035c6-50b2-4492-898e-1f2917e62a5c\" (UID: \"809035c6-50b2-4492-898e-1f2917e62a5c\") " Dec 02 10:00:53 crc kubenswrapper[4781]: I1202 10:00:53.955363 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/809035c6-50b2-4492-898e-1f2917e62a5c-ovn-combined-ca-bundle\") pod \"809035c6-50b2-4492-898e-1f2917e62a5c\" (UID: \"809035c6-50b2-4492-898e-1f2917e62a5c\") " Dec 02 10:00:53 crc kubenswrapper[4781]: I1202 10:00:53.955496 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/809035c6-50b2-4492-898e-1f2917e62a5c-ssh-key\") pod \"809035c6-50b2-4492-898e-1f2917e62a5c\" (UID: \"809035c6-50b2-4492-898e-1f2917e62a5c\") " Dec 02 10:00:53 crc kubenswrapper[4781]: I1202 10:00:53.961331 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/809035c6-50b2-4492-898e-1f2917e62a5c-kube-api-access-h567h" (OuterVolumeSpecName: "kube-api-access-h567h") pod "809035c6-50b2-4492-898e-1f2917e62a5c" (UID: "809035c6-50b2-4492-898e-1f2917e62a5c"). InnerVolumeSpecName "kube-api-access-h567h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 10:00:53 crc kubenswrapper[4781]: I1202 10:00:53.962168 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/809035c6-50b2-4492-898e-1f2917e62a5c-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "809035c6-50b2-4492-898e-1f2917e62a5c" (UID: "809035c6-50b2-4492-898e-1f2917e62a5c"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 10:00:53 crc kubenswrapper[4781]: I1202 10:00:53.985740 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/809035c6-50b2-4492-898e-1f2917e62a5c-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "809035c6-50b2-4492-898e-1f2917e62a5c" (UID: "809035c6-50b2-4492-898e-1f2917e62a5c"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 10:00:53 crc kubenswrapper[4781]: I1202 10:00:53.987978 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/809035c6-50b2-4492-898e-1f2917e62a5c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "809035c6-50b2-4492-898e-1f2917e62a5c" (UID: "809035c6-50b2-4492-898e-1f2917e62a5c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 10:00:53 crc kubenswrapper[4781]: I1202 10:00:53.988215 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/809035c6-50b2-4492-898e-1f2917e62a5c-inventory" (OuterVolumeSpecName: "inventory") pod "809035c6-50b2-4492-898e-1f2917e62a5c" (UID: "809035c6-50b2-4492-898e-1f2917e62a5c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 10:00:54 crc kubenswrapper[4781]: I1202 10:00:54.057955 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/809035c6-50b2-4492-898e-1f2917e62a5c-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 10:00:54 crc kubenswrapper[4781]: I1202 10:00:54.057986 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h567h\" (UniqueName: \"kubernetes.io/projected/809035c6-50b2-4492-898e-1f2917e62a5c-kube-api-access-h567h\") on node \"crc\" DevicePath \"\"" Dec 02 10:00:54 crc kubenswrapper[4781]: I1202 10:00:54.057999 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/809035c6-50b2-4492-898e-1f2917e62a5c-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 10:00:54 crc kubenswrapper[4781]: I1202 10:00:54.058008 4781 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/809035c6-50b2-4492-898e-1f2917e62a5c-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 02 10:00:54 crc kubenswrapper[4781]: I1202 10:00:54.058019 4781 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/809035c6-50b2-4492-898e-1f2917e62a5c-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 10:00:54 crc kubenswrapper[4781]: I1202 10:00:54.460441 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nf4nh" event={"ID":"809035c6-50b2-4492-898e-1f2917e62a5c","Type":"ContainerDied","Data":"e2423a676c8c099e6bfa9a507cdafab6c698237d6da6cc6384de45bb4aa41f29"} Dec 02 10:00:54 crc kubenswrapper[4781]: I1202 10:00:54.460496 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2423a676c8c099e6bfa9a507cdafab6c698237d6da6cc6384de45bb4aa41f29" Dec 02 10:00:54 crc kubenswrapper[4781]: I1202 10:00:54.460518 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nf4nh" Dec 02 10:00:54 crc kubenswrapper[4781]: I1202 10:00:54.572694 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5ktkd"] Dec 02 10:00:54 crc kubenswrapper[4781]: E1202 10:00:54.573198 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="809035c6-50b2-4492-898e-1f2917e62a5c" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 02 10:00:54 crc kubenswrapper[4781]: I1202 10:00:54.573218 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="809035c6-50b2-4492-898e-1f2917e62a5c" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 02 10:00:54 crc kubenswrapper[4781]: E1202 10:00:54.573247 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="745379ef-4ee9-48a9-bf62-f2e381441ba0" containerName="collect-profiles" Dec 02 10:00:54 crc kubenswrapper[4781]: I1202 10:00:54.573253 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="745379ef-4ee9-48a9-bf62-f2e381441ba0" containerName="collect-profiles" Dec 02 10:00:54 crc kubenswrapper[4781]: I1202 10:00:54.573449 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="809035c6-50b2-4492-898e-1f2917e62a5c" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 02 10:00:54 crc kubenswrapper[4781]: I1202 10:00:54.573482 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="745379ef-4ee9-48a9-bf62-f2e381441ba0" containerName="collect-profiles" Dec 02 10:00:54 crc kubenswrapper[4781]: I1202 10:00:54.574197 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5ktkd" Dec 02 10:00:54 crc kubenswrapper[4781]: I1202 10:00:54.576593 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zl624" Dec 02 10:00:54 crc kubenswrapper[4781]: I1202 10:00:54.576900 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 02 10:00:54 crc kubenswrapper[4781]: I1202 10:00:54.577009 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 10:00:54 crc kubenswrapper[4781]: I1202 10:00:54.577227 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 10:00:54 crc kubenswrapper[4781]: I1202 10:00:54.578278 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 02 10:00:54 crc kubenswrapper[4781]: I1202 10:00:54.578446 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 10:00:54 crc kubenswrapper[4781]: I1202 10:00:54.588613 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5ktkd"] Dec 02 10:00:54 crc kubenswrapper[4781]: I1202 10:00:54.673228 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c2171979-7791-4850-a4cf-99ac7e62d054-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5ktkd\" (UID: \"c2171979-7791-4850-a4cf-99ac7e62d054\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5ktkd" Dec 02 10:00:54 crc kubenswrapper[4781]: I1202 
10:00:54.673465 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2171979-7791-4850-a4cf-99ac7e62d054-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5ktkd\" (UID: \"c2171979-7791-4850-a4cf-99ac7e62d054\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5ktkd" Dec 02 10:00:54 crc kubenswrapper[4781]: I1202 10:00:54.673531 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c2171979-7791-4850-a4cf-99ac7e62d054-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5ktkd\" (UID: \"c2171979-7791-4850-a4cf-99ac7e62d054\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5ktkd" Dec 02 10:00:54 crc kubenswrapper[4781]: I1202 10:00:54.673737 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh5mv\" (UniqueName: \"kubernetes.io/projected/c2171979-7791-4850-a4cf-99ac7e62d054-kube-api-access-zh5mv\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5ktkd\" (UID: \"c2171979-7791-4850-a4cf-99ac7e62d054\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5ktkd" Dec 02 10:00:54 crc kubenswrapper[4781]: I1202 10:00:54.673806 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c2171979-7791-4850-a4cf-99ac7e62d054-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5ktkd\" (UID: \"c2171979-7791-4850-a4cf-99ac7e62d054\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5ktkd" Dec 02 10:00:54 crc kubenswrapper[4781]: I1202 10:00:54.673944 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c2171979-7791-4850-a4cf-99ac7e62d054-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5ktkd\" (UID: \"c2171979-7791-4850-a4cf-99ac7e62d054\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5ktkd" Dec 02 10:00:54 crc kubenswrapper[4781]: I1202 10:00:54.775070 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c2171979-7791-4850-a4cf-99ac7e62d054-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5ktkd\" (UID: \"c2171979-7791-4850-a4cf-99ac7e62d054\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5ktkd" Dec 02 10:00:54 crc kubenswrapper[4781]: I1202 10:00:54.775139 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2171979-7791-4850-a4cf-99ac7e62d054-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5ktkd\" (UID: \"c2171979-7791-4850-a4cf-99ac7e62d054\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5ktkd" Dec 02 10:00:54 crc kubenswrapper[4781]: I1202 10:00:54.775170 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/c2171979-7791-4850-a4cf-99ac7e62d054-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5ktkd\" (UID: \"c2171979-7791-4850-a4cf-99ac7e62d054\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5ktkd" Dec 02 10:00:54 crc kubenswrapper[4781]: I1202 10:00:54.775205 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh5mv\" (UniqueName: \"kubernetes.io/projected/c2171979-7791-4850-a4cf-99ac7e62d054-kube-api-access-zh5mv\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5ktkd\" (UID: \"c2171979-7791-4850-a4cf-99ac7e62d054\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5ktkd" Dec 02 10:00:54 crc kubenswrapper[4781]: I1202 10:00:54.775236 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c2171979-7791-4850-a4cf-99ac7e62d054-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5ktkd\" (UID: \"c2171979-7791-4850-a4cf-99ac7e62d054\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5ktkd" Dec 02 10:00:54 crc kubenswrapper[4781]: I1202 10:00:54.775280 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c2171979-7791-4850-a4cf-99ac7e62d054-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5ktkd\" (UID: \"c2171979-7791-4850-a4cf-99ac7e62d054\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5ktkd" Dec 02 10:00:54 crc kubenswrapper[4781]: I1202 10:00:54.779563 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c2171979-7791-4850-a4cf-99ac7e62d054-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5ktkd\" (UID: \"c2171979-7791-4850-a4cf-99ac7e62d054\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5ktkd" Dec 02 10:00:54 crc kubenswrapper[4781]: I1202 10:00:54.779595 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c2171979-7791-4850-a4cf-99ac7e62d054-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5ktkd\" (UID: \"c2171979-7791-4850-a4cf-99ac7e62d054\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5ktkd" Dec 02 10:00:54 crc kubenswrapper[4781]: I1202 10:00:54.780413 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c2171979-7791-4850-a4cf-99ac7e62d054-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5ktkd\" (UID: \"c2171979-7791-4850-a4cf-99ac7e62d054\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5ktkd" Dec 02 10:00:54 crc kubenswrapper[4781]: I1202 10:00:54.781424 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c2171979-7791-4850-a4cf-99ac7e62d054-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5ktkd\" (UID: \"c2171979-7791-4850-a4cf-99ac7e62d054\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5ktkd" Dec 02 10:00:54 crc kubenswrapper[4781]: I1202 10:00:54.783572 
4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2171979-7791-4850-a4cf-99ac7e62d054-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5ktkd\" (UID: \"c2171979-7791-4850-a4cf-99ac7e62d054\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5ktkd" Dec 02 10:00:54 crc kubenswrapper[4781]: I1202 10:00:54.807157 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh5mv\" (UniqueName: \"kubernetes.io/projected/c2171979-7791-4850-a4cf-99ac7e62d054-kube-api-access-zh5mv\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5ktkd\" (UID: \"c2171979-7791-4850-a4cf-99ac7e62d054\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5ktkd" Dec 02 10:00:54 crc kubenswrapper[4781]: I1202 10:00:54.900164 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5ktkd" Dec 02 10:00:55 crc kubenswrapper[4781]: I1202 10:00:55.438353 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5ktkd"] Dec 02 10:00:55 crc kubenswrapper[4781]: I1202 10:00:55.442897 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 10:00:55 crc kubenswrapper[4781]: I1202 10:00:55.469101 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5ktkd" event={"ID":"c2171979-7791-4850-a4cf-99ac7e62d054","Type":"ContainerStarted","Data":"75dbaa09020f952ae23bd73d90a7fcbcf35038de65685bcd09a2515dfb260757"} Dec 02 10:00:56 crc kubenswrapper[4781]: I1202 10:00:56.480876 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5ktkd" event={"ID":"c2171979-7791-4850-a4cf-99ac7e62d054","Type":"ContainerStarted","Data":"f8bdd905f6a4480a9d96de0922763ab40aeab5d1d285a7f6e0da75f20d6f0dbd"} Dec 02 10:00:56 crc kubenswrapper[4781]: I1202 10:00:56.503277 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5ktkd" podStartSLOduration=1.89357233 podStartE2EDuration="2.503254418s" podCreationTimestamp="2025-12-02 10:00:54 +0000 UTC" firstStartedPulling="2025-12-02 10:00:55.442690775 +0000 UTC m=+2418.266564644" lastFinishedPulling="2025-12-02 10:00:56.052372853 +0000 UTC m=+2418.876246732" observedRunningTime="2025-12-02 10:00:56.497223469 +0000 UTC m=+2419.321097348" watchObservedRunningTime="2025-12-02 10:00:56.503254418 +0000 UTC m=+2419.327128307" Dec 02 10:00:59 crc kubenswrapper[4781]: I1202 10:00:59.499740 4781 scope.go:117] "RemoveContainer" containerID="910ba3e07eab53497d7747baa03c41ace2b0ce4a0abe46eb2c4aa513ae74972d" Dec 02 10:00:59 crc kubenswrapper[4781]: E1202 10:00:59.500659 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:01:00 crc kubenswrapper[4781]: I1202 10:01:00.137812 4781 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29411161-pl6b8"] Dec 02 10:01:00 crc kubenswrapper[4781]: I1202 10:01:00.142511 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29411161-pl6b8" Dec 02 10:01:00 crc kubenswrapper[4781]: I1202 10:01:00.149029 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29411161-pl6b8"] Dec 02 10:01:00 crc kubenswrapper[4781]: I1202 10:01:00.181821 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da3b7144-110c-45af-a358-804809a89670-combined-ca-bundle\") pod \"keystone-cron-29411161-pl6b8\" (UID: \"da3b7144-110c-45af-a358-804809a89670\") " pod="openstack/keystone-cron-29411161-pl6b8" Dec 02 10:01:00 crc kubenswrapper[4781]: I1202 10:01:00.181876 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/da3b7144-110c-45af-a358-804809a89670-fernet-keys\") pod \"keystone-cron-29411161-pl6b8\" (UID: \"da3b7144-110c-45af-a358-804809a89670\") " pod="openstack/keystone-cron-29411161-pl6b8" Dec 02 10:01:00 crc kubenswrapper[4781]: I1202 10:01:00.182040 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da3b7144-110c-45af-a358-804809a89670-config-data\") pod \"keystone-cron-29411161-pl6b8\" (UID: \"da3b7144-110c-45af-a358-804809a89670\") " pod="openstack/keystone-cron-29411161-pl6b8" Dec 02 10:01:00 crc kubenswrapper[4781]: I1202 10:01:00.182107 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94q4j\" (UniqueName: \"kubernetes.io/projected/da3b7144-110c-45af-a358-804809a89670-kube-api-access-94q4j\") pod \"keystone-cron-29411161-pl6b8\" (UID: \"da3b7144-110c-45af-a358-804809a89670\") " pod="openstack/keystone-cron-29411161-pl6b8" Dec 02 10:01:00 crc kubenswrapper[4781]: I1202 10:01:00.284497 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da3b7144-110c-45af-a358-804809a89670-combined-ca-bundle\") pod \"keystone-cron-29411161-pl6b8\" (UID: \"da3b7144-110c-45af-a358-804809a89670\") " pod="openstack/keystone-cron-29411161-pl6b8" Dec 02 10:01:00 crc kubenswrapper[4781]: I1202 10:01:00.284549 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/da3b7144-110c-45af-a358-804809a89670-fernet-keys\") pod \"keystone-cron-29411161-pl6b8\" (UID: \"da3b7144-110c-45af-a358-804809a89670\") " pod="openstack/keystone-cron-29411161-pl6b8" Dec 02 10:01:00 crc kubenswrapper[4781]: I1202 10:01:00.284643 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da3b7144-110c-45af-a358-804809a89670-config-data\") pod \"keystone-cron-29411161-pl6b8\" (UID: \"da3b7144-110c-45af-a358-804809a89670\") " pod="openstack/keystone-cron-29411161-pl6b8" Dec 02 10:01:00 crc kubenswrapper[4781]: I1202 10:01:00.284703 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94q4j\" (UniqueName: \"kubernetes.io/projected/da3b7144-110c-45af-a358-804809a89670-kube-api-access-94q4j\") pod \"keystone-cron-29411161-pl6b8\" (UID: 
\"da3b7144-110c-45af-a358-804809a89670\") " pod="openstack/keystone-cron-29411161-pl6b8" Dec 02 10:01:00 crc kubenswrapper[4781]: I1202 10:01:00.290736 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/da3b7144-110c-45af-a358-804809a89670-fernet-keys\") pod \"keystone-cron-29411161-pl6b8\" (UID: \"da3b7144-110c-45af-a358-804809a89670\") " pod="openstack/keystone-cron-29411161-pl6b8" Dec 02 10:01:00 crc kubenswrapper[4781]: I1202 10:01:00.292887 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da3b7144-110c-45af-a358-804809a89670-config-data\") pod \"keystone-cron-29411161-pl6b8\" (UID: \"da3b7144-110c-45af-a358-804809a89670\") " pod="openstack/keystone-cron-29411161-pl6b8" Dec 02 10:01:00 crc kubenswrapper[4781]: I1202 10:01:00.295034 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da3b7144-110c-45af-a358-804809a89670-combined-ca-bundle\") pod \"keystone-cron-29411161-pl6b8\" (UID: \"da3b7144-110c-45af-a358-804809a89670\") " pod="openstack/keystone-cron-29411161-pl6b8" Dec 02 10:01:00 crc kubenswrapper[4781]: I1202 10:01:00.303700 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94q4j\" (UniqueName: \"kubernetes.io/projected/da3b7144-110c-45af-a358-804809a89670-kube-api-access-94q4j\") pod \"keystone-cron-29411161-pl6b8\" (UID: \"da3b7144-110c-45af-a358-804809a89670\") " pod="openstack/keystone-cron-29411161-pl6b8" Dec 02 10:01:00 crc kubenswrapper[4781]: I1202 10:01:00.466442 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29411161-pl6b8" Dec 02 10:01:00 crc kubenswrapper[4781]: I1202 10:01:00.889045 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29411161-pl6b8"] Dec 02 10:01:01 crc kubenswrapper[4781]: I1202 10:01:01.534370 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29411161-pl6b8" event={"ID":"da3b7144-110c-45af-a358-804809a89670","Type":"ContainerStarted","Data":"7426c295028270c9fae5faf5574806c6b94ba333f2f2aa3fafed90627192e755"} Dec 02 10:01:01 crc kubenswrapper[4781]: I1202 10:01:01.534676 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29411161-pl6b8" event={"ID":"da3b7144-110c-45af-a358-804809a89670","Type":"ContainerStarted","Data":"8731fa9dc6fe0ce52c2d57cf6896964cc2386c067ce321a3aae5730811e118c0"} Dec 02 10:01:01 crc kubenswrapper[4781]: I1202 10:01:01.554110 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29411161-pl6b8" podStartSLOduration=1.5540877709999998 podStartE2EDuration="1.554087771s" podCreationTimestamp="2025-12-02 10:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 10:01:01.550437094 +0000 UTC m=+2424.374310973" watchObservedRunningTime="2025-12-02 10:01:01.554087771 +0000 UTC m=+2424.377961650" Dec 02 10:01:03 crc kubenswrapper[4781]: I1202 10:01:03.556101 4781 generic.go:334] "Generic (PLEG): container finished" podID="da3b7144-110c-45af-a358-804809a89670" containerID="7426c295028270c9fae5faf5574806c6b94ba333f2f2aa3fafed90627192e755" exitCode=0 Dec 02 10:01:03 crc kubenswrapper[4781]: I1202 10:01:03.556213 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-cron-29411161-pl6b8" event={"ID":"da3b7144-110c-45af-a358-804809a89670","Type":"ContainerDied","Data":"7426c295028270c9fae5faf5574806c6b94ba333f2f2aa3fafed90627192e755"} Dec 02 10:01:04 crc kubenswrapper[4781]: I1202 10:01:04.975983 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29411161-pl6b8" Dec 02 10:01:05 crc kubenswrapper[4781]: I1202 10:01:05.090688 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94q4j\" (UniqueName: \"kubernetes.io/projected/da3b7144-110c-45af-a358-804809a89670-kube-api-access-94q4j\") pod \"da3b7144-110c-45af-a358-804809a89670\" (UID: \"da3b7144-110c-45af-a358-804809a89670\") " Dec 02 10:01:05 crc kubenswrapper[4781]: I1202 10:01:05.090733 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da3b7144-110c-45af-a358-804809a89670-config-data\") pod \"da3b7144-110c-45af-a358-804809a89670\" (UID: \"da3b7144-110c-45af-a358-804809a89670\") " Dec 02 10:01:05 crc kubenswrapper[4781]: I1202 10:01:05.090783 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da3b7144-110c-45af-a358-804809a89670-combined-ca-bundle\") pod \"da3b7144-110c-45af-a358-804809a89670\" (UID: \"da3b7144-110c-45af-a358-804809a89670\") " Dec 02 10:01:05 crc kubenswrapper[4781]: I1202 10:01:05.090865 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/da3b7144-110c-45af-a358-804809a89670-fernet-keys\") pod \"da3b7144-110c-45af-a358-804809a89670\" (UID: \"da3b7144-110c-45af-a358-804809a89670\") " Dec 02 10:01:05 crc kubenswrapper[4781]: I1202 10:01:05.097048 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da3b7144-110c-45af-a358-804809a89670-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "da3b7144-110c-45af-a358-804809a89670" (UID: "da3b7144-110c-45af-a358-804809a89670"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 10:01:05 crc kubenswrapper[4781]: I1202 10:01:05.109417 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da3b7144-110c-45af-a358-804809a89670-kube-api-access-94q4j" (OuterVolumeSpecName: "kube-api-access-94q4j") pod "da3b7144-110c-45af-a358-804809a89670" (UID: "da3b7144-110c-45af-a358-804809a89670"). InnerVolumeSpecName "kube-api-access-94q4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 10:01:05 crc kubenswrapper[4781]: I1202 10:01:05.123450 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da3b7144-110c-45af-a358-804809a89670-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da3b7144-110c-45af-a358-804809a89670" (UID: "da3b7144-110c-45af-a358-804809a89670"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 10:01:05 crc kubenswrapper[4781]: I1202 10:01:05.150788 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da3b7144-110c-45af-a358-804809a89670-config-data" (OuterVolumeSpecName: "config-data") pod "da3b7144-110c-45af-a358-804809a89670" (UID: "da3b7144-110c-45af-a358-804809a89670"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 10:01:05 crc kubenswrapper[4781]: I1202 10:01:05.192624 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94q4j\" (UniqueName: \"kubernetes.io/projected/da3b7144-110c-45af-a358-804809a89670-kube-api-access-94q4j\") on node \"crc\" DevicePath \"\"" Dec 02 10:01:05 crc kubenswrapper[4781]: I1202 10:01:05.192658 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da3b7144-110c-45af-a358-804809a89670-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 10:01:05 crc kubenswrapper[4781]: I1202 10:01:05.192668 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da3b7144-110c-45af-a358-804809a89670-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 10:01:05 crc kubenswrapper[4781]: I1202 10:01:05.192677 4781 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/da3b7144-110c-45af-a358-804809a89670-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 02 10:01:05 crc kubenswrapper[4781]: I1202 10:01:05.571455 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29411161-pl6b8" event={"ID":"da3b7144-110c-45af-a358-804809a89670","Type":"ContainerDied","Data":"8731fa9dc6fe0ce52c2d57cf6896964cc2386c067ce321a3aae5730811e118c0"} Dec 02 10:01:05 crc kubenswrapper[4781]: I1202 10:01:05.571756 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8731fa9dc6fe0ce52c2d57cf6896964cc2386c067ce321a3aae5730811e118c0" Dec 02 10:01:05 crc kubenswrapper[4781]: I1202 10:01:05.571503 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29411161-pl6b8" Dec 02 10:01:10 crc kubenswrapper[4781]: I1202 10:01:10.500066 4781 scope.go:117] "RemoveContainer" containerID="910ba3e07eab53497d7747baa03c41ace2b0ce4a0abe46eb2c4aa513ae74972d" Dec 02 10:01:10 crc kubenswrapper[4781]: E1202 10:01:10.501181 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:01:21 crc kubenswrapper[4781]: I1202 10:01:21.500182 4781 scope.go:117] "RemoveContainer" containerID="910ba3e07eab53497d7747baa03c41ace2b0ce4a0abe46eb2c4aa513ae74972d" Dec 02 10:01:21 crc kubenswrapper[4781]: E1202 10:01:21.500939 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:01:34 crc kubenswrapper[4781]: I1202 10:01:34.499684 4781 scope.go:117] "RemoveContainer" containerID="910ba3e07eab53497d7747baa03c41ace2b0ce4a0abe46eb2c4aa513ae74972d" Dec 02 10:01:34 crc kubenswrapper[4781]: E1202 10:01:34.501073 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:01:43 crc kubenswrapper[4781]: I1202 10:01:43.011155 4781 generic.go:334] "Generic (PLEG): container finished" podID="c2171979-7791-4850-a4cf-99ac7e62d054" containerID="f8bdd905f6a4480a9d96de0922763ab40aeab5d1d285a7f6e0da75f20d6f0dbd" exitCode=0 Dec 02 10:01:43 crc kubenswrapper[4781]: I1202 10:01:43.011240 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5ktkd" event={"ID":"c2171979-7791-4850-a4cf-99ac7e62d054","Type":"ContainerDied","Data":"f8bdd905f6a4480a9d96de0922763ab40aeab5d1d285a7f6e0da75f20d6f0dbd"} Dec 02 10:01:44 crc kubenswrapper[4781]: I1202 10:01:44.415245 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5ktkd" Dec 02 10:01:44 crc kubenswrapper[4781]: I1202 10:01:44.608875 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c2171979-7791-4850-a4cf-99ac7e62d054-neutron-ovn-metadata-agent-neutron-config-0\") pod \"c2171979-7791-4850-a4cf-99ac7e62d054\" (UID: \"c2171979-7791-4850-a4cf-99ac7e62d054\") " Dec 02 10:01:44 crc kubenswrapper[4781]: I1202 10:01:44.608946 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zh5mv\" (UniqueName: \"kubernetes.io/projected/c2171979-7791-4850-a4cf-99ac7e62d054-kube-api-access-zh5mv\") pod \"c2171979-7791-4850-a4cf-99ac7e62d054\" (UID: \"c2171979-7791-4850-a4cf-99ac7e62d054\") " Dec 02 10:01:44 crc kubenswrapper[4781]: I1202 10:01:44.609012 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c2171979-7791-4850-a4cf-99ac7e62d054-ssh-key\") pod \"c2171979-7791-4850-a4cf-99ac7e62d054\" (UID: \"c2171979-7791-4850-a4cf-99ac7e62d054\") " Dec 02 10:01:44 crc kubenswrapper[4781]: I1202 10:01:44.609036 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2171979-7791-4850-a4cf-99ac7e62d054-neutron-metadata-combined-ca-bundle\") pod \"c2171979-7791-4850-a4cf-99ac7e62d054\" (UID: \"c2171979-7791-4850-a4cf-99ac7e62d054\") " Dec 02 10:01:44 crc kubenswrapper[4781]: I1202 10:01:44.609231 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c2171979-7791-4850-a4cf-99ac7e62d054-inventory\") pod \"c2171979-7791-4850-a4cf-99ac7e62d054\" (UID: \"c2171979-7791-4850-a4cf-99ac7e62d054\") " Dec 02 10:01:44 crc kubenswrapper[4781]: I1202 10:01:44.609259 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c2171979-7791-4850-a4cf-99ac7e62d054-nova-metadata-neutron-config-0\") pod \"c2171979-7791-4850-a4cf-99ac7e62d054\" (UID: \"c2171979-7791-4850-a4cf-99ac7e62d054\") " Dec 02 10:01:44 crc kubenswrapper[4781]: I1202 10:01:44.620640 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/c2171979-7791-4850-a4cf-99ac7e62d054-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "c2171979-7791-4850-a4cf-99ac7e62d054" (UID: "c2171979-7791-4850-a4cf-99ac7e62d054"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 10:01:44 crc kubenswrapper[4781]: I1202 10:01:44.620772 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2171979-7791-4850-a4cf-99ac7e62d054-kube-api-access-zh5mv" (OuterVolumeSpecName: "kube-api-access-zh5mv") pod "c2171979-7791-4850-a4cf-99ac7e62d054" (UID: "c2171979-7791-4850-a4cf-99ac7e62d054"). InnerVolumeSpecName "kube-api-access-zh5mv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 10:01:44 crc kubenswrapper[4781]: I1202 10:01:44.637470 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2171979-7791-4850-a4cf-99ac7e62d054-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "c2171979-7791-4850-a4cf-99ac7e62d054" (UID: "c2171979-7791-4850-a4cf-99ac7e62d054"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 10:01:44 crc kubenswrapper[4781]: I1202 10:01:44.638568 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2171979-7791-4850-a4cf-99ac7e62d054-inventory" (OuterVolumeSpecName: "inventory") pod "c2171979-7791-4850-a4cf-99ac7e62d054" (UID: "c2171979-7791-4850-a4cf-99ac7e62d054"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 10:01:44 crc kubenswrapper[4781]: I1202 10:01:44.638994 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2171979-7791-4850-a4cf-99ac7e62d054-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c2171979-7791-4850-a4cf-99ac7e62d054" (UID: "c2171979-7791-4850-a4cf-99ac7e62d054"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 10:01:44 crc kubenswrapper[4781]: I1202 10:01:44.640837 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2171979-7791-4850-a4cf-99ac7e62d054-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "c2171979-7791-4850-a4cf-99ac7e62d054" (UID: "c2171979-7791-4850-a4cf-99ac7e62d054"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 10:01:44 crc kubenswrapper[4781]: I1202 10:01:44.715177 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c2171979-7791-4850-a4cf-99ac7e62d054-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 10:01:44 crc kubenswrapper[4781]: I1202 10:01:44.715221 4781 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2171979-7791-4850-a4cf-99ac7e62d054-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 10:01:44 crc kubenswrapper[4781]: I1202 10:01:44.715238 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c2171979-7791-4850-a4cf-99ac7e62d054-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 10:01:44 crc kubenswrapper[4781]: I1202 10:01:44.715250 4781 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c2171979-7791-4850-a4cf-99ac7e62d054-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 02 10:01:44 crc kubenswrapper[4781]: I1202 10:01:44.715264 4781 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c2171979-7791-4850-a4cf-99ac7e62d054-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 02 10:01:44 crc kubenswrapper[4781]: I1202 10:01:44.715277 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zh5mv\" (UniqueName: \"kubernetes.io/projected/c2171979-7791-4850-a4cf-99ac7e62d054-kube-api-access-zh5mv\") on node \"crc\" DevicePath \"\"" Dec 02 10:01:45 crc kubenswrapper[4781]: I1202 10:01:45.029235 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5ktkd" event={"ID":"c2171979-7791-4850-a4cf-99ac7e62d054","Type":"ContainerDied","Data":"75dbaa09020f952ae23bd73d90a7fcbcf35038de65685bcd09a2515dfb260757"} Dec 02 10:01:45 crc kubenswrapper[4781]: I1202 10:01:45.029283 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75dbaa09020f952ae23bd73d90a7fcbcf35038de65685bcd09a2515dfb260757" Dec 02 10:01:45 crc kubenswrapper[4781]: I1202 10:01:45.029334 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5ktkd" Dec 02 10:01:45 crc kubenswrapper[4781]: I1202 10:01:45.120236 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s8tx9"] Dec 02 10:01:45 crc kubenswrapper[4781]: E1202 10:01:45.120714 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da3b7144-110c-45af-a358-804809a89670" containerName="keystone-cron" Dec 02 10:01:45 crc kubenswrapper[4781]: I1202 10:01:45.120735 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="da3b7144-110c-45af-a358-804809a89670" containerName="keystone-cron" Dec 02 10:01:45 crc kubenswrapper[4781]: E1202 10:01:45.120756 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2171979-7791-4850-a4cf-99ac7e62d054" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 02 10:01:45 crc kubenswrapper[4781]: I1202 10:01:45.120765 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2171979-7791-4850-a4cf-99ac7e62d054" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 02 10:01:45 crc kubenswrapper[4781]: I1202 10:01:45.120995 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="da3b7144-110c-45af-a358-804809a89670" containerName="keystone-cron" Dec 02 10:01:45 crc kubenswrapper[4781]: I1202 10:01:45.121022 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2171979-7791-4850-a4cf-99ac7e62d054" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 02 10:01:45 crc kubenswrapper[4781]: I1202 10:01:45.121739 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s8tx9" Dec 02 10:01:45 crc kubenswrapper[4781]: I1202 10:01:45.123841 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 10:01:45 crc kubenswrapper[4781]: I1202 10:01:45.124492 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 10:01:45 crc kubenswrapper[4781]: I1202 10:01:45.124660 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zl624" Dec 02 10:01:45 crc kubenswrapper[4781]: I1202 10:01:45.125147 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 02 10:01:45 crc kubenswrapper[4781]: I1202 10:01:45.127621 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 10:01:45 crc kubenswrapper[4781]: I1202 10:01:45.134460 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s8tx9"] Dec 02 10:01:45 crc kubenswrapper[4781]: I1202 10:01:45.224873 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8db18e94-bcea-4e3a-8759-65fb8084cd43-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s8tx9\" (UID: \"8db18e94-bcea-4e3a-8759-65fb8084cd43\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s8tx9" Dec 02 10:01:45 crc kubenswrapper[4781]: I1202 10:01:45.224945 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: 
\"kubernetes.io/secret/8db18e94-bcea-4e3a-8759-65fb8084cd43-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s8tx9\" (UID: \"8db18e94-bcea-4e3a-8759-65fb8084cd43\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s8tx9" Dec 02 10:01:45 crc kubenswrapper[4781]: I1202 10:01:45.224979 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8db18e94-bcea-4e3a-8759-65fb8084cd43-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s8tx9\" (UID: \"8db18e94-bcea-4e3a-8759-65fb8084cd43\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s8tx9" Dec 02 10:01:45 crc kubenswrapper[4781]: I1202 10:01:45.225053 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmb26\" (UniqueName: \"kubernetes.io/projected/8db18e94-bcea-4e3a-8759-65fb8084cd43-kube-api-access-dmb26\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s8tx9\" (UID: \"8db18e94-bcea-4e3a-8759-65fb8084cd43\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s8tx9" Dec 02 10:01:45 crc kubenswrapper[4781]: I1202 10:01:45.225088 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8db18e94-bcea-4e3a-8759-65fb8084cd43-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s8tx9\" (UID: \"8db18e94-bcea-4e3a-8759-65fb8084cd43\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s8tx9" Dec 02 10:01:45 crc kubenswrapper[4781]: I1202 10:01:45.327140 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8db18e94-bcea-4e3a-8759-65fb8084cd43-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s8tx9\" (UID: \"8db18e94-bcea-4e3a-8759-65fb8084cd43\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s8tx9" Dec 02 10:01:45 crc kubenswrapper[4781]: I1202 10:01:45.327655 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8db18e94-bcea-4e3a-8759-65fb8084cd43-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s8tx9\" (UID: \"8db18e94-bcea-4e3a-8759-65fb8084cd43\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s8tx9" Dec 02 10:01:45 crc kubenswrapper[4781]: I1202 10:01:45.327716 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/8db18e94-bcea-4e3a-8759-65fb8084cd43-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s8tx9\" (UID: \"8db18e94-bcea-4e3a-8759-65fb8084cd43\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s8tx9" Dec 02 10:01:45 crc kubenswrapper[4781]: I1202 10:01:45.327752 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8db18e94-bcea-4e3a-8759-65fb8084cd43-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s8tx9\" (UID: \"8db18e94-bcea-4e3a-8759-65fb8084cd43\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s8tx9" Dec 02 10:01:45 crc kubenswrapper[4781]: I1202 10:01:45.327799 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmb26\" (UniqueName: 
\"kubernetes.io/projected/8db18e94-bcea-4e3a-8759-65fb8084cd43-kube-api-access-dmb26\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s8tx9\" (UID: \"8db18e94-bcea-4e3a-8759-65fb8084cd43\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s8tx9" Dec 02 10:01:45 crc kubenswrapper[4781]: I1202 10:01:45.331376 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/8db18e94-bcea-4e3a-8759-65fb8084cd43-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s8tx9\" (UID: \"8db18e94-bcea-4e3a-8759-65fb8084cd43\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s8tx9" Dec 02 10:01:45 crc kubenswrapper[4781]: I1202 10:01:45.332584 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8db18e94-bcea-4e3a-8759-65fb8084cd43-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s8tx9\" (UID: \"8db18e94-bcea-4e3a-8759-65fb8084cd43\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s8tx9" Dec 02 10:01:45 crc kubenswrapper[4781]: I1202 10:01:45.332772 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8db18e94-bcea-4e3a-8759-65fb8084cd43-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s8tx9\" (UID: \"8db18e94-bcea-4e3a-8759-65fb8084cd43\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s8tx9" Dec 02 10:01:45 crc kubenswrapper[4781]: I1202 10:01:45.333210 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8db18e94-bcea-4e3a-8759-65fb8084cd43-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s8tx9\" (UID: \"8db18e94-bcea-4e3a-8759-65fb8084cd43\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s8tx9" Dec 02 10:01:45 crc kubenswrapper[4781]: I1202 10:01:45.346716 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmb26\" (UniqueName: \"kubernetes.io/projected/8db18e94-bcea-4e3a-8759-65fb8084cd43-kube-api-access-dmb26\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s8tx9\" (UID: \"8db18e94-bcea-4e3a-8759-65fb8084cd43\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s8tx9" Dec 02 10:01:45 crc kubenswrapper[4781]: I1202 10:01:45.442747 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s8tx9" Dec 02 10:01:45 crc kubenswrapper[4781]: I1202 10:01:45.946997 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s8tx9"] Dec 02 10:01:46 crc kubenswrapper[4781]: I1202 10:01:46.042119 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s8tx9" event={"ID":"8db18e94-bcea-4e3a-8759-65fb8084cd43","Type":"ContainerStarted","Data":"8db5fd8a0e887c07b3566a0cc22a25506ac7021480e313a16e404e9b1f62a02b"} Dec 02 10:01:48 crc kubenswrapper[4781]: I1202 10:01:48.058418 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s8tx9" event={"ID":"8db18e94-bcea-4e3a-8759-65fb8084cd43","Type":"ContainerStarted","Data":"fe78bcaefe87cfae7254020778cd2c0e2d6752c9e50f2443f42f35a3a86f1365"} Dec 02 10:01:48 crc kubenswrapper[4781]: I1202 10:01:48.087233 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s8tx9" podStartSLOduration=1.94340562 podStartE2EDuration="3.087213707s" podCreationTimestamp="2025-12-02 10:01:45 +0000 UTC" firstStartedPulling="2025-12-02 10:01:45.946259207 +0000 UTC m=+2468.770133086" lastFinishedPulling="2025-12-02 10:01:47.090067294 +0000 UTC m=+2469.913941173" observedRunningTime="2025-12-02 10:01:48.078160809 +0000 UTC m=+2470.902034688" watchObservedRunningTime="2025-12-02 10:01:48.087213707 +0000 UTC m=+2470.911087586" Dec 02 10:01:49 crc kubenswrapper[4781]: I1202 10:01:49.499567 4781 scope.go:117] "RemoveContainer" containerID="910ba3e07eab53497d7747baa03c41ace2b0ce4a0abe46eb2c4aa513ae74972d" Dec 02 10:01:49 crc kubenswrapper[4781]: E1202 10:01:49.500364 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:02:03 crc kubenswrapper[4781]: I1202 10:02:03.499694 4781 scope.go:117] "RemoveContainer" containerID="910ba3e07eab53497d7747baa03c41ace2b0ce4a0abe46eb2c4aa513ae74972d" Dec 02 10:02:03 crc kubenswrapper[4781]: E1202 10:02:03.500646 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:02:18 crc kubenswrapper[4781]: I1202 10:02:18.500131 4781 scope.go:117] "RemoveContainer" containerID="910ba3e07eab53497d7747baa03c41ace2b0ce4a0abe46eb2c4aa513ae74972d" Dec 02 10:02:18 crc kubenswrapper[4781]: E1202 10:02:18.501065 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:02:31 crc kubenswrapper[4781]: I1202 10:02:31.499801 4781 scope.go:117] "RemoveContainer" containerID="910ba3e07eab53497d7747baa03c41ace2b0ce4a0abe46eb2c4aa513ae74972d" Dec 02 10:02:31 crc kubenswrapper[4781]: E1202 10:02:31.500554 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:02:44 crc kubenswrapper[4781]: I1202 10:02:44.500380 4781 scope.go:117] "RemoveContainer" containerID="910ba3e07eab53497d7747baa03c41ace2b0ce4a0abe46eb2c4aa513ae74972d" Dec 02 10:02:44 crc kubenswrapper[4781]: E1202 10:02:44.503050 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:02:57 crc kubenswrapper[4781]: I1202 10:02:57.522204 4781 scope.go:117] "RemoveContainer" containerID="910ba3e07eab53497d7747baa03c41ace2b0ce4a0abe46eb2c4aa513ae74972d" Dec 02 10:02:57 crc kubenswrapper[4781]: E1202 10:02:57.523510 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:03:12 crc kubenswrapper[4781]: I1202 10:03:12.500984 4781 scope.go:117] "RemoveContainer" containerID="910ba3e07eab53497d7747baa03c41ace2b0ce4a0abe46eb2c4aa513ae74972d" Dec 02 10:03:12 crc kubenswrapper[4781]: E1202 10:03:12.502113 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:03:23 crc kubenswrapper[4781]: I1202 10:03:23.499614 4781 scope.go:117] "RemoveContainer" containerID="910ba3e07eab53497d7747baa03c41ace2b0ce4a0abe46eb2c4aa513ae74972d" Dec 02 10:03:23 crc kubenswrapper[4781]: E1202 10:03:23.501457 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:03:36 crc kubenswrapper[4781]: I1202 10:03:36.527161 4781 
scope.go:117] "RemoveContainer" containerID="910ba3e07eab53497d7747baa03c41ace2b0ce4a0abe46eb2c4aa513ae74972d" Dec 02 10:03:36 crc kubenswrapper[4781]: E1202 10:03:36.528270 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:03:50 crc kubenswrapper[4781]: I1202 10:03:50.500298 4781 scope.go:117] "RemoveContainer" containerID="910ba3e07eab53497d7747baa03c41ace2b0ce4a0abe46eb2c4aa513ae74972d" Dec 02 10:03:50 crc kubenswrapper[4781]: E1202 10:03:50.501072 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:04:02 crc kubenswrapper[4781]: I1202 10:04:02.499846 4781 scope.go:117] "RemoveContainer" containerID="910ba3e07eab53497d7747baa03c41ace2b0ce4a0abe46eb2c4aa513ae74972d" Dec 02 10:04:02 crc kubenswrapper[4781]: E1202 10:04:02.500604 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:04:15 crc kubenswrapper[4781]: I1202 10:04:15.500675 4781 scope.go:117] "RemoveContainer" containerID="910ba3e07eab53497d7747baa03c41ace2b0ce4a0abe46eb2c4aa513ae74972d" Dec 02 10:04:15 crc kubenswrapper[4781]: E1202 10:04:15.501524 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:04:26 crc kubenswrapper[4781]: I1202 10:04:26.499445 4781 scope.go:117] "RemoveContainer" containerID="910ba3e07eab53497d7747baa03c41ace2b0ce4a0abe46eb2c4aa513ae74972d" Dec 02 10:04:26 crc kubenswrapper[4781]: E1202 10:04:26.500330 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:04:40 crc kubenswrapper[4781]: I1202 10:04:40.499495 4781 scope.go:117] "RemoveContainer" containerID="910ba3e07eab53497d7747baa03c41ace2b0ce4a0abe46eb2c4aa513ae74972d" Dec 02 10:04:41 crc kubenswrapper[4781]: I1202 10:04:41.653563 4781 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" event={"ID":"e10258da-dad3-4df8-82c2-9d9438493a3d","Type":"ContainerStarted","Data":"75b74b7096d1dd0ce909b2158eaf421368960aa32dc50742e6efb662b8a4be29"} Dec 02 10:06:02 crc kubenswrapper[4781]: I1202 10:06:02.459625 4781 generic.go:334] "Generic (PLEG): container finished" podID="8db18e94-bcea-4e3a-8759-65fb8084cd43" containerID="fe78bcaefe87cfae7254020778cd2c0e2d6752c9e50f2443f42f35a3a86f1365" exitCode=0 Dec 02 10:06:02 crc kubenswrapper[4781]: I1202 10:06:02.459760 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s8tx9" event={"ID":"8db18e94-bcea-4e3a-8759-65fb8084cd43","Type":"ContainerDied","Data":"fe78bcaefe87cfae7254020778cd2c0e2d6752c9e50f2443f42f35a3a86f1365"} Dec 02 10:06:03 crc kubenswrapper[4781]: I1202 10:06:03.859822 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s8tx9" Dec 02 10:06:03 crc kubenswrapper[4781]: I1202 10:06:03.995608 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8db18e94-bcea-4e3a-8759-65fb8084cd43-inventory\") pod \"8db18e94-bcea-4e3a-8759-65fb8084cd43\" (UID: \"8db18e94-bcea-4e3a-8759-65fb8084cd43\") " Dec 02 10:06:03 crc kubenswrapper[4781]: I1202 10:06:03.995705 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/8db18e94-bcea-4e3a-8759-65fb8084cd43-libvirt-secret-0\") pod \"8db18e94-bcea-4e3a-8759-65fb8084cd43\" (UID: \"8db18e94-bcea-4e3a-8759-65fb8084cd43\") " Dec 02 10:06:03 crc kubenswrapper[4781]: I1202 10:06:03.995746 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8db18e94-bcea-4e3a-8759-65fb8084cd43-ssh-key\") pod \"8db18e94-bcea-4e3a-8759-65fb8084cd43\" (UID: \"8db18e94-bcea-4e3a-8759-65fb8084cd43\") " Dec 02 10:06:03 crc kubenswrapper[4781]: I1202 10:06:03.996549 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8db18e94-bcea-4e3a-8759-65fb8084cd43-libvirt-combined-ca-bundle\") pod \"8db18e94-bcea-4e3a-8759-65fb8084cd43\" (UID: \"8db18e94-bcea-4e3a-8759-65fb8084cd43\") " Dec 02 10:06:03 crc kubenswrapper[4781]: I1202 10:06:03.996597 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmb26\" (UniqueName: \"kubernetes.io/projected/8db18e94-bcea-4e3a-8759-65fb8084cd43-kube-api-access-dmb26\") pod \"8db18e94-bcea-4e3a-8759-65fb8084cd43\" (UID: \"8db18e94-bcea-4e3a-8759-65fb8084cd43\") " Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.005290 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8db18e94-bcea-4e3a-8759-65fb8084cd43-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "8db18e94-bcea-4e3a-8759-65fb8084cd43" (UID: "8db18e94-bcea-4e3a-8759-65fb8084cd43"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.008029 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8db18e94-bcea-4e3a-8759-65fb8084cd43-kube-api-access-dmb26" (OuterVolumeSpecName: "kube-api-access-dmb26") pod "8db18e94-bcea-4e3a-8759-65fb8084cd43" (UID: "8db18e94-bcea-4e3a-8759-65fb8084cd43"). InnerVolumeSpecName "kube-api-access-dmb26". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.026687 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8db18e94-bcea-4e3a-8759-65fb8084cd43-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "8db18e94-bcea-4e3a-8759-65fb8084cd43" (UID: "8db18e94-bcea-4e3a-8759-65fb8084cd43"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.027376 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8db18e94-bcea-4e3a-8759-65fb8084cd43-inventory" (OuterVolumeSpecName: "inventory") pod "8db18e94-bcea-4e3a-8759-65fb8084cd43" (UID: "8db18e94-bcea-4e3a-8759-65fb8084cd43"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.027789 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8db18e94-bcea-4e3a-8759-65fb8084cd43-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8db18e94-bcea-4e3a-8759-65fb8084cd43" (UID: "8db18e94-bcea-4e3a-8759-65fb8084cd43"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.098659 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8db18e94-bcea-4e3a-8759-65fb8084cd43-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.098907 4781 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/8db18e94-bcea-4e3a-8759-65fb8084cd43-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.099020 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8db18e94-bcea-4e3a-8759-65fb8084cd43-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.099149 4781 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8db18e94-bcea-4e3a-8759-65fb8084cd43-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.099240 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmb26\" (UniqueName: \"kubernetes.io/projected/8db18e94-bcea-4e3a-8759-65fb8084cd43-kube-api-access-dmb26\") on node \"crc\" DevicePath \"\"" Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.477534 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s8tx9" event={"ID":"8db18e94-bcea-4e3a-8759-65fb8084cd43","Type":"ContainerDied","Data":"8db5fd8a0e887c07b3566a0cc22a25506ac7021480e313a16e404e9b1f62a02b"} Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.477588 4781 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8db5fd8a0e887c07b3566a0cc22a25506ac7021480e313a16e404e9b1f62a02b" Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.477628 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s8tx9" Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.586064 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-g88rg"] Dec 02 10:06:04 crc kubenswrapper[4781]: E1202 10:06:04.587575 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8db18e94-bcea-4e3a-8759-65fb8084cd43" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.587603 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="8db18e94-bcea-4e3a-8759-65fb8084cd43" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.587833 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="8db18e94-bcea-4e3a-8759-65fb8084cd43" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.588663 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g88rg" Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.592269 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zl624" Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.592656 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.592807 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.598915 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.603385 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.603386 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.603669 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.605549 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-g88rg"] Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.715656 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dacd1d88-ff6e-4719-a24f-4feb0559f463-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g88rg\" (UID: \"dacd1d88-ff6e-4719-a24f-4feb0559f463\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g88rg" Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.715705 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dacd1d88-ff6e-4719-a24f-4feb0559f463-ssh-key\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-g88rg\" (UID: \"dacd1d88-ff6e-4719-a24f-4feb0559f463\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g88rg" Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.715760 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/dacd1d88-ff6e-4719-a24f-4feb0559f463-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g88rg\" (UID: \"dacd1d88-ff6e-4719-a24f-4feb0559f463\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g88rg" Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.715980 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/dacd1d88-ff6e-4719-a24f-4feb0559f463-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g88rg\" (UID: \"dacd1d88-ff6e-4719-a24f-4feb0559f463\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g88rg" Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.716054 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/dacd1d88-ff6e-4719-a24f-4feb0559f463-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g88rg\" (UID: \"dacd1d88-ff6e-4719-a24f-4feb0559f463\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g88rg" Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.716186 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9twh\" (UniqueName: \"kubernetes.io/projected/dacd1d88-ff6e-4719-a24f-4feb0559f463-kube-api-access-m9twh\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g88rg\" (UID: \"dacd1d88-ff6e-4719-a24f-4feb0559f463\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g88rg" Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.716221 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/dacd1d88-ff6e-4719-a24f-4feb0559f463-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g88rg\" (UID: \"dacd1d88-ff6e-4719-a24f-4feb0559f463\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g88rg" Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.716250 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/dacd1d88-ff6e-4719-a24f-4feb0559f463-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g88rg\" (UID: \"dacd1d88-ff6e-4719-a24f-4feb0559f463\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g88rg" Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.716316 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dacd1d88-ff6e-4719-a24f-4feb0559f463-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g88rg\" (UID: \"dacd1d88-ff6e-4719-a24f-4feb0559f463\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g88rg" Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.817580 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/dacd1d88-ff6e-4719-a24f-4feb0559f463-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g88rg\" (UID: \"dacd1d88-ff6e-4719-a24f-4feb0559f463\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g88rg" Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.817847 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9twh\" (UniqueName: \"kubernetes.io/projected/dacd1d88-ff6e-4719-a24f-4feb0559f463-kube-api-access-m9twh\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g88rg\" (UID: \"dacd1d88-ff6e-4719-a24f-4feb0559f463\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g88rg" Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.817871 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/dacd1d88-ff6e-4719-a24f-4feb0559f463-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g88rg\" (UID: \"dacd1d88-ff6e-4719-a24f-4feb0559f463\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g88rg" Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.817896 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/dacd1d88-ff6e-4719-a24f-4feb0559f463-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g88rg\" (UID: \"dacd1d88-ff6e-4719-a24f-4feb0559f463\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g88rg" Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.817955 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dacd1d88-ff6e-4719-a24f-4feb0559f463-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g88rg\" (UID: \"dacd1d88-ff6e-4719-a24f-4feb0559f463\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g88rg" Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.817992 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dacd1d88-ff6e-4719-a24f-4feb0559f463-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g88rg\" (UID: \"dacd1d88-ff6e-4719-a24f-4feb0559f463\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g88rg" Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.818008 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dacd1d88-ff6e-4719-a24f-4feb0559f463-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g88rg\" (UID: \"dacd1d88-ff6e-4719-a24f-4feb0559f463\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g88rg" Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.818042 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/dacd1d88-ff6e-4719-a24f-4feb0559f463-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g88rg\" (UID: \"dacd1d88-ff6e-4719-a24f-4feb0559f463\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g88rg" Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.818095 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: 
\"kubernetes.io/configmap/dacd1d88-ff6e-4719-a24f-4feb0559f463-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g88rg\" (UID: \"dacd1d88-ff6e-4719-a24f-4feb0559f463\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g88rg" Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.818751 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/dacd1d88-ff6e-4719-a24f-4feb0559f463-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g88rg\" (UID: \"dacd1d88-ff6e-4719-a24f-4feb0559f463\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g88rg" Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.822841 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/dacd1d88-ff6e-4719-a24f-4feb0559f463-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g88rg\" (UID: \"dacd1d88-ff6e-4719-a24f-4feb0559f463\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g88rg" Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.823236 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dacd1d88-ff6e-4719-a24f-4feb0559f463-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g88rg\" (UID: \"dacd1d88-ff6e-4719-a24f-4feb0559f463\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g88rg" Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.823474 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dacd1d88-ff6e-4719-a24f-4feb0559f463-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g88rg\" (UID: \"dacd1d88-ff6e-4719-a24f-4feb0559f463\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g88rg" Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.824071 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/dacd1d88-ff6e-4719-a24f-4feb0559f463-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g88rg\" (UID: \"dacd1d88-ff6e-4719-a24f-4feb0559f463\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g88rg" Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.825208 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/dacd1d88-ff6e-4719-a24f-4feb0559f463-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g88rg\" (UID: \"dacd1d88-ff6e-4719-a24f-4feb0559f463\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g88rg" Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.826558 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dacd1d88-ff6e-4719-a24f-4feb0559f463-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g88rg\" (UID: \"dacd1d88-ff6e-4719-a24f-4feb0559f463\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g88rg" Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.835467 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/dacd1d88-ff6e-4719-a24f-4feb0559f463-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g88rg\" 
(UID: \"dacd1d88-ff6e-4719-a24f-4feb0559f463\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g88rg" Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.836261 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9twh\" (UniqueName: \"kubernetes.io/projected/dacd1d88-ff6e-4719-a24f-4feb0559f463-kube-api-access-m9twh\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g88rg\" (UID: \"dacd1d88-ff6e-4719-a24f-4feb0559f463\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g88rg" Dec 02 10:06:04 crc kubenswrapper[4781]: I1202 10:06:04.907532 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g88rg" Dec 02 10:06:05 crc kubenswrapper[4781]: I1202 10:06:05.412907 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-g88rg"] Dec 02 10:06:05 crc kubenswrapper[4781]: I1202 10:06:05.424354 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 10:06:05 crc kubenswrapper[4781]: I1202 10:06:05.487613 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g88rg" event={"ID":"dacd1d88-ff6e-4719-a24f-4feb0559f463","Type":"ContainerStarted","Data":"2e423f8ee36bfa0d17fc1d3433645458c2d2454452c5e54e99de5759f9a961cb"} Dec 02 10:06:06 crc kubenswrapper[4781]: I1202 10:06:06.497631 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g88rg" event={"ID":"dacd1d88-ff6e-4719-a24f-4feb0559f463","Type":"ContainerStarted","Data":"dc1aaf482f48e6e1387016fc6196a1ce91bd903952a8056bfc694730dc3043ac"} Dec 02 10:06:06 crc kubenswrapper[4781]: I1202 10:06:06.525416 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g88rg" podStartSLOduration=1.986895624 podStartE2EDuration="2.525397437s" podCreationTimestamp="2025-12-02 10:06:04 +0000 UTC" firstStartedPulling="2025-12-02 10:06:05.424055809 +0000 UTC m=+2728.247929688" lastFinishedPulling="2025-12-02 10:06:05.962557622 +0000 UTC m=+2728.786431501" observedRunningTime="2025-12-02 10:06:06.516478141 +0000 UTC m=+2729.340352020" watchObservedRunningTime="2025-12-02 10:06:06.525397437 +0000 UTC m=+2729.349271316" Dec 02 10:07:00 crc kubenswrapper[4781]: I1202 10:07:00.412521 4781 patch_prober.go:28] interesting pod/machine-config-daemon-pzntm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 10:07:00 crc kubenswrapper[4781]: I1202 10:07:00.413191 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 10:07:30 crc kubenswrapper[4781]: I1202 10:07:30.412474 4781 patch_prober.go:28] interesting pod/machine-config-daemon-pzntm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 10:07:30 crc kubenswrapper[4781]: I1202 
10:07:30.413138 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 10:08:00 crc kubenswrapper[4781]: I1202 10:08:00.412593 4781 patch_prober.go:28] interesting pod/machine-config-daemon-pzntm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 10:08:00 crc kubenswrapper[4781]: I1202 10:08:00.413200 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 10:08:00 crc kubenswrapper[4781]: I1202 10:08:00.413251 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" Dec 02 10:08:00 crc kubenswrapper[4781]: I1202 10:08:00.414089 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"75b74b7096d1dd0ce909b2158eaf421368960aa32dc50742e6efb662b8a4be29"} pod="openshift-machine-config-operator/machine-config-daemon-pzntm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 10:08:00 crc kubenswrapper[4781]: I1202 10:08:00.414187 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" containerID="cri-o://75b74b7096d1dd0ce909b2158eaf421368960aa32dc50742e6efb662b8a4be29" gracePeriod=600 Dec 02 10:08:00 crc kubenswrapper[4781]: I1202 10:08:00.568251 4781 generic.go:334] "Generic (PLEG): container finished" podID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerID="75b74b7096d1dd0ce909b2158eaf421368960aa32dc50742e6efb662b8a4be29" exitCode=0 Dec 02 10:08:00 crc kubenswrapper[4781]: I1202 10:08:00.568309 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" event={"ID":"e10258da-dad3-4df8-82c2-9d9438493a3d","Type":"ContainerDied","Data":"75b74b7096d1dd0ce909b2158eaf421368960aa32dc50742e6efb662b8a4be29"} Dec 02 10:08:00 crc kubenswrapper[4781]: I1202 10:08:00.568589 4781 scope.go:117] "RemoveContainer" containerID="910ba3e07eab53497d7747baa03c41ace2b0ce4a0abe46eb2c4aa513ae74972d" Dec 02 10:08:01 crc kubenswrapper[4781]: I1202 10:08:01.579115 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" event={"ID":"e10258da-dad3-4df8-82c2-9d9438493a3d","Type":"ContainerStarted","Data":"a460aa145b859b67418ac19afdfd646e23ad888c821073566f356df84809b0ea"} Dec 02 10:08:21 crc kubenswrapper[4781]: I1202 10:08:21.839471 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-q4z7d"] Dec 02 10:08:21 crc kubenswrapper[4781]: I1202 10:08:21.841759 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q4z7d" Dec 02 10:08:21 crc kubenswrapper[4781]: I1202 10:08:21.876770 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q4z7d"] Dec 02 10:08:21 crc kubenswrapper[4781]: I1202 10:08:21.968737 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xjbm\" (UniqueName: \"kubernetes.io/projected/dd229c63-1675-45d9-8190-6b3adbad8af5-kube-api-access-5xjbm\") pod \"redhat-marketplace-q4z7d\" (UID: \"dd229c63-1675-45d9-8190-6b3adbad8af5\") " pod="openshift-marketplace/redhat-marketplace-q4z7d" Dec 02 10:08:21 crc kubenswrapper[4781]: I1202 10:08:21.969086 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd229c63-1675-45d9-8190-6b3adbad8af5-utilities\") pod \"redhat-marketplace-q4z7d\" (UID: \"dd229c63-1675-45d9-8190-6b3adbad8af5\") " pod="openshift-marketplace/redhat-marketplace-q4z7d" Dec 02 10:08:21 crc kubenswrapper[4781]: I1202 10:08:21.969232 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd229c63-1675-45d9-8190-6b3adbad8af5-catalog-content\") pod \"redhat-marketplace-q4z7d\" (UID: \"dd229c63-1675-45d9-8190-6b3adbad8af5\") " pod="openshift-marketplace/redhat-marketplace-q4z7d" Dec 02 10:08:22 crc kubenswrapper[4781]: I1202 10:08:22.071159 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xjbm\" (UniqueName: \"kubernetes.io/projected/dd229c63-1675-45d9-8190-6b3adbad8af5-kube-api-access-5xjbm\") pod \"redhat-marketplace-q4z7d\" (UID: \"dd229c63-1675-45d9-8190-6b3adbad8af5\") " pod="openshift-marketplace/redhat-marketplace-q4z7d" Dec 02 10:08:22 crc kubenswrapper[4781]: I1202 10:08:22.071267 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd229c63-1675-45d9-8190-6b3adbad8af5-utilities\") pod \"redhat-marketplace-q4z7d\" (UID: \"dd229c63-1675-45d9-8190-6b3adbad8af5\") " pod="openshift-marketplace/redhat-marketplace-q4z7d" Dec 02 10:08:22 crc kubenswrapper[4781]: I1202 10:08:22.071325 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd229c63-1675-45d9-8190-6b3adbad8af5-catalog-content\") pod \"redhat-marketplace-q4z7d\" (UID: \"dd229c63-1675-45d9-8190-6b3adbad8af5\") " pod="openshift-marketplace/redhat-marketplace-q4z7d" Dec 02 10:08:22 crc kubenswrapper[4781]: I1202 10:08:22.071852 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd229c63-1675-45d9-8190-6b3adbad8af5-catalog-content\") pod \"redhat-marketplace-q4z7d\" (UID: \"dd229c63-1675-45d9-8190-6b3adbad8af5\") " pod="openshift-marketplace/redhat-marketplace-q4z7d" Dec 02 10:08:22 crc kubenswrapper[4781]: I1202 10:08:22.071949 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd229c63-1675-45d9-8190-6b3adbad8af5-utilities\") pod \"redhat-marketplace-q4z7d\" (UID: \"dd229c63-1675-45d9-8190-6b3adbad8af5\") " pod="openshift-marketplace/redhat-marketplace-q4z7d" Dec 02 10:08:22 crc kubenswrapper[4781]: I1202 10:08:22.141007 4781 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-5xjbm\" (UniqueName: \"kubernetes.io/projected/dd229c63-1675-45d9-8190-6b3adbad8af5-kube-api-access-5xjbm\") pod \"redhat-marketplace-q4z7d\" (UID: \"dd229c63-1675-45d9-8190-6b3adbad8af5\") " pod="openshift-marketplace/redhat-marketplace-q4z7d" Dec 02 10:08:22 crc kubenswrapper[4781]: I1202 10:08:22.167773 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q4z7d" Dec 02 10:08:22 crc kubenswrapper[4781]: I1202 10:08:22.454094 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bjb2z"] Dec 02 10:08:22 crc kubenswrapper[4781]: I1202 10:08:22.456236 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bjb2z" Dec 02 10:08:22 crc kubenswrapper[4781]: I1202 10:08:22.464943 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bjb2z"] Dec 02 10:08:22 crc kubenswrapper[4781]: I1202 10:08:22.482495 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f815fa8f-85f8-4659-8ecf-70a90acd6d0e-utilities\") pod \"redhat-operators-bjb2z\" (UID: \"f815fa8f-85f8-4659-8ecf-70a90acd6d0e\") " pod="openshift-marketplace/redhat-operators-bjb2z" Dec 02 10:08:22 crc kubenswrapper[4781]: I1202 10:08:22.482816 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lvvk\" (UniqueName: \"kubernetes.io/projected/f815fa8f-85f8-4659-8ecf-70a90acd6d0e-kube-api-access-4lvvk\") pod \"redhat-operators-bjb2z\" (UID: \"f815fa8f-85f8-4659-8ecf-70a90acd6d0e\") " pod="openshift-marketplace/redhat-operators-bjb2z" Dec 02 10:08:22 crc kubenswrapper[4781]: I1202 10:08:22.482978 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f815fa8f-85f8-4659-8ecf-70a90acd6d0e-catalog-content\") pod \"redhat-operators-bjb2z\" (UID: \"f815fa8f-85f8-4659-8ecf-70a90acd6d0e\") " pod="openshift-marketplace/redhat-operators-bjb2z" Dec 02 10:08:22 crc kubenswrapper[4781]: I1202 10:08:22.584676 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lvvk\" (UniqueName: \"kubernetes.io/projected/f815fa8f-85f8-4659-8ecf-70a90acd6d0e-kube-api-access-4lvvk\") pod \"redhat-operators-bjb2z\" (UID: \"f815fa8f-85f8-4659-8ecf-70a90acd6d0e\") " pod="openshift-marketplace/redhat-operators-bjb2z" Dec 02 10:08:22 crc kubenswrapper[4781]: I1202 10:08:22.584790 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f815fa8f-85f8-4659-8ecf-70a90acd6d0e-catalog-content\") pod \"redhat-operators-bjb2z\" (UID: \"f815fa8f-85f8-4659-8ecf-70a90acd6d0e\") " pod="openshift-marketplace/redhat-operators-bjb2z" Dec 02 10:08:22 crc kubenswrapper[4781]: I1202 10:08:22.584942 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f815fa8f-85f8-4659-8ecf-70a90acd6d0e-utilities\") pod \"redhat-operators-bjb2z\" (UID: \"f815fa8f-85f8-4659-8ecf-70a90acd6d0e\") " pod="openshift-marketplace/redhat-operators-bjb2z" Dec 02 10:08:22 crc kubenswrapper[4781]: I1202 10:08:22.585820 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f815fa8f-85f8-4659-8ecf-70a90acd6d0e-utilities\") pod \"redhat-operators-bjb2z\" (UID: \"f815fa8f-85f8-4659-8ecf-70a90acd6d0e\") " pod="openshift-marketplace/redhat-operators-bjb2z" Dec 02 10:08:22 crc kubenswrapper[4781]: I1202 10:08:22.586420 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f815fa8f-85f8-4659-8ecf-70a90acd6d0e-catalog-content\") pod \"redhat-operators-bjb2z\" (UID: \"f815fa8f-85f8-4659-8ecf-70a90acd6d0e\") " pod="openshift-marketplace/redhat-operators-bjb2z" Dec 02 10:08:22 crc kubenswrapper[4781]: I1202 10:08:22.609557 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lvvk\" (UniqueName: \"kubernetes.io/projected/f815fa8f-85f8-4659-8ecf-70a90acd6d0e-kube-api-access-4lvvk\") pod \"redhat-operators-bjb2z\" (UID: \"f815fa8f-85f8-4659-8ecf-70a90acd6d0e\") " pod="openshift-marketplace/redhat-operators-bjb2z" Dec 02 10:08:22 crc kubenswrapper[4781]: I1202 10:08:22.685645 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q4z7d"] Dec 02 10:08:22 crc kubenswrapper[4781]: I1202 10:08:22.781137 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q4z7d" event={"ID":"dd229c63-1675-45d9-8190-6b3adbad8af5","Type":"ContainerStarted","Data":"c6b3810e62035920b623c3ce231abb476b6e4d324e9b0fbe318e510b9c755145"} Dec 02 10:08:22 crc kubenswrapper[4781]: I1202 10:08:22.785480 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bjb2z" Dec 02 10:08:23 crc kubenswrapper[4781]: I1202 10:08:23.233632 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bjb2z"] Dec 02 10:08:23 crc kubenswrapper[4781]: W1202 10:08:23.235436 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf815fa8f_85f8_4659_8ecf_70a90acd6d0e.slice/crio-014295eea2aef912da21ae0a8f238be2b89dbc44b4d7b0efd27dbb8f1785c5bf WatchSource:0}: Error finding container 014295eea2aef912da21ae0a8f238be2b89dbc44b4d7b0efd27dbb8f1785c5bf: Status 404 returned error can't find the container with id 014295eea2aef912da21ae0a8f238be2b89dbc44b4d7b0efd27dbb8f1785c5bf Dec 02 10:08:23 crc kubenswrapper[4781]: I1202 10:08:23.797302 4781 generic.go:334] "Generic (PLEG): container finished" podID="dd229c63-1675-45d9-8190-6b3adbad8af5" containerID="e7df5ca5169848e025f92d73cb0043f25ed1ec1b932ddae63b67be318d8ff0e4" exitCode=0 Dec 02 10:08:23 crc kubenswrapper[4781]: I1202 10:08:23.797519 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q4z7d" event={"ID":"dd229c63-1675-45d9-8190-6b3adbad8af5","Type":"ContainerDied","Data":"e7df5ca5169848e025f92d73cb0043f25ed1ec1b932ddae63b67be318d8ff0e4"} Dec 02 10:08:23 crc kubenswrapper[4781]: I1202 10:08:23.799986 4781 generic.go:334] "Generic (PLEG): container finished" podID="f815fa8f-85f8-4659-8ecf-70a90acd6d0e" containerID="a09a489914b002b5d556ed55478d0c7dfd3993be96f4eaaab0072e8b59250c15" exitCode=0 Dec 02 10:08:23 crc kubenswrapper[4781]: I1202 10:08:23.800028 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bjb2z" 
event={"ID":"f815fa8f-85f8-4659-8ecf-70a90acd6d0e","Type":"ContainerDied","Data":"a09a489914b002b5d556ed55478d0c7dfd3993be96f4eaaab0072e8b59250c15"} Dec 02 10:08:23 crc kubenswrapper[4781]: I1202 10:08:23.800053 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bjb2z" event={"ID":"f815fa8f-85f8-4659-8ecf-70a90acd6d0e","Type":"ContainerStarted","Data":"014295eea2aef912da21ae0a8f238be2b89dbc44b4d7b0efd27dbb8f1785c5bf"} Dec 02 10:08:24 crc kubenswrapper[4781]: I1202 10:08:24.228075 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-m6frl"] Dec 02 10:08:24 crc kubenswrapper[4781]: I1202 10:08:24.230112 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m6frl" Dec 02 10:08:24 crc kubenswrapper[4781]: I1202 10:08:24.245381 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m6frl"] Dec 02 10:08:24 crc kubenswrapper[4781]: I1202 10:08:24.321240 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvt79\" (UniqueName: \"kubernetes.io/projected/454ad7ba-1c5e-43c4-9a29-e02e29aa7544-kube-api-access-tvt79\") pod \"certified-operators-m6frl\" (UID: \"454ad7ba-1c5e-43c4-9a29-e02e29aa7544\") " pod="openshift-marketplace/certified-operators-m6frl" Dec 02 10:08:24 crc kubenswrapper[4781]: I1202 10:08:24.321302 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/454ad7ba-1c5e-43c4-9a29-e02e29aa7544-utilities\") pod \"certified-operators-m6frl\" (UID: \"454ad7ba-1c5e-43c4-9a29-e02e29aa7544\") " pod="openshift-marketplace/certified-operators-m6frl" Dec 02 10:08:24 crc kubenswrapper[4781]: I1202 10:08:24.321481 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/454ad7ba-1c5e-43c4-9a29-e02e29aa7544-catalog-content\") pod \"certified-operators-m6frl\" (UID: \"454ad7ba-1c5e-43c4-9a29-e02e29aa7544\") " pod="openshift-marketplace/certified-operators-m6frl" Dec 02 10:08:24 crc kubenswrapper[4781]: I1202 10:08:24.423559 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/454ad7ba-1c5e-43c4-9a29-e02e29aa7544-catalog-content\") pod \"certified-operators-m6frl\" (UID: \"454ad7ba-1c5e-43c4-9a29-e02e29aa7544\") " pod="openshift-marketplace/certified-operators-m6frl" Dec 02 10:08:24 crc kubenswrapper[4781]: I1202 10:08:24.423681 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvt79\" (UniqueName: \"kubernetes.io/projected/454ad7ba-1c5e-43c4-9a29-e02e29aa7544-kube-api-access-tvt79\") pod \"certified-operators-m6frl\" (UID: \"454ad7ba-1c5e-43c4-9a29-e02e29aa7544\") " pod="openshift-marketplace/certified-operators-m6frl" Dec 02 10:08:24 crc kubenswrapper[4781]: I1202 10:08:24.423712 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/454ad7ba-1c5e-43c4-9a29-e02e29aa7544-utilities\") pod \"certified-operators-m6frl\" (UID: \"454ad7ba-1c5e-43c4-9a29-e02e29aa7544\") " pod="openshift-marketplace/certified-operators-m6frl" Dec 02 10:08:24 crc kubenswrapper[4781]: I1202 10:08:24.424249 4781 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/454ad7ba-1c5e-43c4-9a29-e02e29aa7544-utilities\") pod \"certified-operators-m6frl\" (UID: \"454ad7ba-1c5e-43c4-9a29-e02e29aa7544\") " pod="openshift-marketplace/certified-operators-m6frl" Dec 02 10:08:24 crc kubenswrapper[4781]: I1202 10:08:24.425490 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/454ad7ba-1c5e-43c4-9a29-e02e29aa7544-catalog-content\") pod \"certified-operators-m6frl\" (UID: \"454ad7ba-1c5e-43c4-9a29-e02e29aa7544\") " pod="openshift-marketplace/certified-operators-m6frl" Dec 02 10:08:24 crc kubenswrapper[4781]: I1202 10:08:24.455326 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvt79\" (UniqueName: \"kubernetes.io/projected/454ad7ba-1c5e-43c4-9a29-e02e29aa7544-kube-api-access-tvt79\") pod \"certified-operators-m6frl\" (UID: \"454ad7ba-1c5e-43c4-9a29-e02e29aa7544\") " pod="openshift-marketplace/certified-operators-m6frl" Dec 02 10:08:24 crc kubenswrapper[4781]: I1202 10:08:24.551702 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m6frl" Dec 02 10:08:24 crc kubenswrapper[4781]: I1202 10:08:24.813046 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bjb2z" event={"ID":"f815fa8f-85f8-4659-8ecf-70a90acd6d0e","Type":"ContainerStarted","Data":"9034921533cb74b4038e562c22cbe8c3d5ae551b5c525291550914df0e0b2f23"} Dec 02 10:08:24 crc kubenswrapper[4781]: I1202 10:08:24.841808 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-z6gr2"] Dec 02 10:08:24 crc kubenswrapper[4781]: I1202 10:08:24.844325 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-z6gr2" Dec 02 10:08:24 crc kubenswrapper[4781]: I1202 10:08:24.918980 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-z6gr2"] Dec 02 10:08:24 crc kubenswrapper[4781]: I1202 10:08:24.946410 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8-utilities\") pod \"community-operators-z6gr2\" (UID: \"e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8\") " pod="openshift-marketplace/community-operators-z6gr2" Dec 02 10:08:24 crc kubenswrapper[4781]: I1202 10:08:24.946486 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8-catalog-content\") pod \"community-operators-z6gr2\" (UID: \"e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8\") " pod="openshift-marketplace/community-operators-z6gr2" Dec 02 10:08:24 crc kubenswrapper[4781]: I1202 10:08:24.946578 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86w6x\" (UniqueName: \"kubernetes.io/projected/e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8-kube-api-access-86w6x\") pod \"community-operators-z6gr2\" (UID: \"e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8\") " pod="openshift-marketplace/community-operators-z6gr2" Dec 02 10:08:25 crc kubenswrapper[4781]: I1202 10:08:25.048913 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8-catalog-content\") pod \"community-operators-z6gr2\" (UID: \"e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8\") " pod="openshift-marketplace/community-operators-z6gr2" Dec 02 10:08:25 crc kubenswrapper[4781]: I1202 10:08:25.049096 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86w6x\" (UniqueName: \"kubernetes.io/projected/e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8-kube-api-access-86w6x\") pod \"community-operators-z6gr2\" (UID: \"e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8\") " pod="openshift-marketplace/community-operators-z6gr2" Dec 02 10:08:25 crc kubenswrapper[4781]: I1202 10:08:25.049176 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8-utilities\") pod \"community-operators-z6gr2\" (UID: \"e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8\") " pod="openshift-marketplace/community-operators-z6gr2" Dec 02 10:08:25 crc kubenswrapper[4781]: I1202 10:08:25.052229 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8-utilities\") pod \"community-operators-z6gr2\" (UID: \"e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8\") " pod="openshift-marketplace/community-operators-z6gr2" Dec 02 10:08:25 crc kubenswrapper[4781]: I1202 10:08:25.052291 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8-catalog-content\") pod \"community-operators-z6gr2\" (UID: \"e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8\") " pod="openshift-marketplace/community-operators-z6gr2" Dec 02 10:08:25 crc kubenswrapper[4781]: I1202 10:08:25.082289 4781 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m6frl"] Dec 02 10:08:25 crc kubenswrapper[4781]: I1202 10:08:25.100706 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86w6x\" (UniqueName: \"kubernetes.io/projected/e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8-kube-api-access-86w6x\") pod \"community-operators-z6gr2\" (UID: \"e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8\") " pod="openshift-marketplace/community-operators-z6gr2" Dec 02 10:08:25 crc kubenswrapper[4781]: I1202 10:08:25.187080 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-z6gr2" Dec 02 10:08:25 crc kubenswrapper[4781]: I1202 10:08:25.693721 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-z6gr2"] Dec 02 10:08:25 crc kubenswrapper[4781]: W1202 10:08:25.698620 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5f10cf6_3aed_4581_a2a6_bf8d6690aeb8.slice/crio-688968b63b2b027d6ab0941e27341ecb309b0a76cc230cc789bc192ed97d499c WatchSource:0}: Error finding container 688968b63b2b027d6ab0941e27341ecb309b0a76cc230cc789bc192ed97d499c: Status 404 returned error can't find the container with id 688968b63b2b027d6ab0941e27341ecb309b0a76cc230cc789bc192ed97d499c Dec 02 10:08:25 crc kubenswrapper[4781]: I1202 10:08:25.822813 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z6gr2" event={"ID":"e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8","Type":"ContainerStarted","Data":"688968b63b2b027d6ab0941e27341ecb309b0a76cc230cc789bc192ed97d499c"} Dec 02 10:08:25 crc kubenswrapper[4781]: I1202 10:08:25.825151 4781 generic.go:334] "Generic (PLEG): container finished" podID="dd229c63-1675-45d9-8190-6b3adbad8af5" containerID="cf89a335ee7144636a3922c4e4251d75b4dbe0c38979b20ef43456efd0618b39" exitCode=0 Dec 02 10:08:25 crc kubenswrapper[4781]: I1202 10:08:25.825204 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q4z7d" event={"ID":"dd229c63-1675-45d9-8190-6b3adbad8af5","Type":"ContainerDied","Data":"cf89a335ee7144636a3922c4e4251d75b4dbe0c38979b20ef43456efd0618b39"} Dec 02 10:08:25 crc kubenswrapper[4781]: I1202 10:08:25.827668 4781 generic.go:334] "Generic (PLEG): container finished" podID="f815fa8f-85f8-4659-8ecf-70a90acd6d0e" containerID="9034921533cb74b4038e562c22cbe8c3d5ae551b5c525291550914df0e0b2f23" exitCode=0 Dec 02 10:08:25 crc kubenswrapper[4781]: I1202 10:08:25.827748 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bjb2z" event={"ID":"f815fa8f-85f8-4659-8ecf-70a90acd6d0e","Type":"ContainerDied","Data":"9034921533cb74b4038e562c22cbe8c3d5ae551b5c525291550914df0e0b2f23"} Dec 02 10:08:25 crc kubenswrapper[4781]: I1202 10:08:25.829358 4781 generic.go:334] "Generic (PLEG): container finished" podID="454ad7ba-1c5e-43c4-9a29-e02e29aa7544" containerID="f44ae5ad0bd0c521d4b436106c661ad21f0e6b9e70cf548fff54ca0b4b0d003c" exitCode=0 Dec 02 10:08:25 crc kubenswrapper[4781]: I1202 10:08:25.829383 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m6frl" event={"ID":"454ad7ba-1c5e-43c4-9a29-e02e29aa7544","Type":"ContainerDied","Data":"f44ae5ad0bd0c521d4b436106c661ad21f0e6b9e70cf548fff54ca0b4b0d003c"} Dec 02 10:08:25 crc kubenswrapper[4781]: I1202 10:08:25.829396 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-m6frl" event={"ID":"454ad7ba-1c5e-43c4-9a29-e02e29aa7544","Type":"ContainerStarted","Data":"0123c20da39221c791b4c745fc5ffe53a61d375dfa4f7509d81127006ab175d1"} Dec 02 10:08:26 crc kubenswrapper[4781]: I1202 10:08:26.844574 4781 generic.go:334] "Generic (PLEG): container finished" podID="e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8" containerID="c535f40b3a1193244d237c7b97d7c6e2ee00f81d134562bf74e6c5e24f1f130c" exitCode=0 Dec 02 10:08:26 crc kubenswrapper[4781]: I1202 10:08:26.846212 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z6gr2" event={"ID":"e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8","Type":"ContainerDied","Data":"c535f40b3a1193244d237c7b97d7c6e2ee00f81d134562bf74e6c5e24f1f130c"} Dec 02 10:08:26 crc kubenswrapper[4781]: I1202 10:08:26.856719 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q4z7d" event={"ID":"dd229c63-1675-45d9-8190-6b3adbad8af5","Type":"ContainerStarted","Data":"e6e3c643325e369259253e922f3aed36d055a7c165f09430f2768018a90469f8"} Dec 02 10:08:26 crc kubenswrapper[4781]: I1202 10:08:26.877334 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bjb2z" event={"ID":"f815fa8f-85f8-4659-8ecf-70a90acd6d0e","Type":"ContainerStarted","Data":"bd4d8172f5a1e393a4d53af90d944e618925ec495c54c0c9368ad802c15ba304"} Dec 02 10:08:26 crc kubenswrapper[4781]: I1202 10:08:26.892995 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-q4z7d" podStartSLOduration=3.359864313 podStartE2EDuration="5.892893516s" podCreationTimestamp="2025-12-02 10:08:21 +0000 UTC" firstStartedPulling="2025-12-02 10:08:23.799420382 +0000 UTC m=+2866.623294261" lastFinishedPulling="2025-12-02 10:08:26.332449585 +0000 UTC m=+2869.156323464" observedRunningTime="2025-12-02 10:08:26.889878976 +0000 UTC m=+2869.713752855" watchObservedRunningTime="2025-12-02 10:08:26.892893516 +0000 UTC m=+2869.716767395" Dec 02 10:08:26 crc kubenswrapper[4781]: I1202 10:08:26.909004 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bjb2z" podStartSLOduration=2.235228078 podStartE2EDuration="4.908977189s" podCreationTimestamp="2025-12-02 10:08:22 +0000 UTC" firstStartedPulling="2025-12-02 10:08:23.801335053 +0000 UTC m=+2866.625208932" lastFinishedPulling="2025-12-02 10:08:26.475084164 +0000 UTC m=+2869.298958043" observedRunningTime="2025-12-02 10:08:26.905615419 +0000 UTC m=+2869.729489298" watchObservedRunningTime="2025-12-02 10:08:26.908977189 +0000 UTC m=+2869.732851068" Dec 02 10:08:27 crc kubenswrapper[4781]: I1202 10:08:27.889521 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m6frl" event={"ID":"454ad7ba-1c5e-43c4-9a29-e02e29aa7544","Type":"ContainerStarted","Data":"3d1e06865ec94fcbc2a74b1d222e036d48104c4aa2673f65886b6caf6038c2a8"} Dec 02 10:08:28 crc kubenswrapper[4781]: I1202 10:08:28.901806 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z6gr2" event={"ID":"e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8","Type":"ContainerStarted","Data":"e3dfd3e5229d29d8eb5767450f520550806d8883135bb29d6022c3ade04f591c"} Dec 02 10:08:28 crc kubenswrapper[4781]: I1202 10:08:28.903856 4781 generic.go:334] "Generic (PLEG): container finished" podID="454ad7ba-1c5e-43c4-9a29-e02e29aa7544" 
containerID="3d1e06865ec94fcbc2a74b1d222e036d48104c4aa2673f65886b6caf6038c2a8" exitCode=0 Dec 02 10:08:28 crc kubenswrapper[4781]: I1202 10:08:28.903881 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m6frl" event={"ID":"454ad7ba-1c5e-43c4-9a29-e02e29aa7544","Type":"ContainerDied","Data":"3d1e06865ec94fcbc2a74b1d222e036d48104c4aa2673f65886b6caf6038c2a8"} Dec 02 10:08:30 crc kubenswrapper[4781]: I1202 10:08:30.921861 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m6frl" event={"ID":"454ad7ba-1c5e-43c4-9a29-e02e29aa7544","Type":"ContainerStarted","Data":"2cfd2759c5ac7b1177bb42535b9f8d6687cbc3daa425141c954800f25f71b2ec"} Dec 02 10:08:30 crc kubenswrapper[4781]: I1202 10:08:30.944469 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-m6frl" podStartSLOduration=3.029914173 podStartE2EDuration="6.944450173s" podCreationTimestamp="2025-12-02 10:08:24 +0000 UTC" firstStartedPulling="2025-12-02 10:08:25.830307422 +0000 UTC m=+2868.654181301" lastFinishedPulling="2025-12-02 10:08:29.744843422 +0000 UTC m=+2872.568717301" observedRunningTime="2025-12-02 10:08:30.940665611 +0000 UTC m=+2873.764539500" watchObservedRunningTime="2025-12-02 10:08:30.944450173 +0000 UTC m=+2873.768324052" Dec 02 10:08:31 crc kubenswrapper[4781]: I1202 10:08:31.934101 4781 generic.go:334] "Generic (PLEG): container finished" podID="e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8" containerID="e3dfd3e5229d29d8eb5767450f520550806d8883135bb29d6022c3ade04f591c" exitCode=0 Dec 02 10:08:31 crc kubenswrapper[4781]: I1202 10:08:31.934256 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z6gr2" event={"ID":"e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8","Type":"ContainerDied","Data":"e3dfd3e5229d29d8eb5767450f520550806d8883135bb29d6022c3ade04f591c"} Dec 02 10:08:32 crc kubenswrapper[4781]: I1202 10:08:32.169570 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-q4z7d" Dec 02 10:08:32 crc kubenswrapper[4781]: I1202 10:08:32.169880 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-q4z7d" Dec 02 10:08:32 crc kubenswrapper[4781]: I1202 10:08:32.219009 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-q4z7d" Dec 02 10:08:32 crc kubenswrapper[4781]: I1202 10:08:32.786820 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bjb2z" Dec 02 10:08:32 crc kubenswrapper[4781]: I1202 10:08:32.786879 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bjb2z" Dec 02 10:08:32 crc kubenswrapper[4781]: I1202 10:08:32.987302 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-q4z7d" Dec 02 10:08:34 crc kubenswrapper[4781]: I1202 10:08:34.151580 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bjb2z" podUID="f815fa8f-85f8-4659-8ecf-70a90acd6d0e" containerName="registry-server" probeResult="failure" output=< Dec 02 10:08:34 crc kubenswrapper[4781]: timeout: failed to connect service ":50051" within 1s Dec 02 10:08:34 crc kubenswrapper[4781]: > Dec 02 10:08:34 crc kubenswrapper[4781]: I1202 
10:08:34.552731 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-m6frl" Dec 02 10:08:34 crc kubenswrapper[4781]: I1202 10:08:34.554145 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-m6frl" Dec 02 10:08:34 crc kubenswrapper[4781]: I1202 10:08:34.594663 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-m6frl" Dec 02 10:08:35 crc kubenswrapper[4781]: I1202 10:08:35.216504 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-m6frl" Dec 02 10:08:35 crc kubenswrapper[4781]: I1202 10:08:35.227534 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q4z7d"] Dec 02 10:08:35 crc kubenswrapper[4781]: I1202 10:08:35.227819 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-q4z7d" podUID="dd229c63-1675-45d9-8190-6b3adbad8af5" containerName="registry-server" containerID="cri-o://e6e3c643325e369259253e922f3aed36d055a7c165f09430f2768018a90469f8" gracePeriod=2 Dec 02 10:08:35 crc kubenswrapper[4781]: I1202 10:08:35.920015 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q4z7d" Dec 02 10:08:35 crc kubenswrapper[4781]: I1202 10:08:35.987754 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd229c63-1675-45d9-8190-6b3adbad8af5-utilities\") pod \"dd229c63-1675-45d9-8190-6b3adbad8af5\" (UID: \"dd229c63-1675-45d9-8190-6b3adbad8af5\") " Dec 02 10:08:35 crc kubenswrapper[4781]: I1202 10:08:35.988328 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xjbm\" (UniqueName: \"kubernetes.io/projected/dd229c63-1675-45d9-8190-6b3adbad8af5-kube-api-access-5xjbm\") pod \"dd229c63-1675-45d9-8190-6b3adbad8af5\" (UID: \"dd229c63-1675-45d9-8190-6b3adbad8af5\") " Dec 02 10:08:35 crc kubenswrapper[4781]: I1202 10:08:35.988398 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd229c63-1675-45d9-8190-6b3adbad8af5-catalog-content\") pod \"dd229c63-1675-45d9-8190-6b3adbad8af5\" (UID: \"dd229c63-1675-45d9-8190-6b3adbad8af5\") " Dec 02 10:08:35 crc kubenswrapper[4781]: I1202 10:08:35.988460 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd229c63-1675-45d9-8190-6b3adbad8af5-utilities" (OuterVolumeSpecName: "utilities") pod "dd229c63-1675-45d9-8190-6b3adbad8af5" (UID: "dd229c63-1675-45d9-8190-6b3adbad8af5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 10:08:35 crc kubenswrapper[4781]: I1202 10:08:35.988844 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd229c63-1675-45d9-8190-6b3adbad8af5-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 10:08:35 crc kubenswrapper[4781]: I1202 10:08:35.994201 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd229c63-1675-45d9-8190-6b3adbad8af5-kube-api-access-5xjbm" (OuterVolumeSpecName: "kube-api-access-5xjbm") pod "dd229c63-1675-45d9-8190-6b3adbad8af5" (UID: "dd229c63-1675-45d9-8190-6b3adbad8af5"). InnerVolumeSpecName "kube-api-access-5xjbm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 10:08:36 crc kubenswrapper[4781]: I1202 10:08:36.008857 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd229c63-1675-45d9-8190-6b3adbad8af5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dd229c63-1675-45d9-8190-6b3adbad8af5" (UID: "dd229c63-1675-45d9-8190-6b3adbad8af5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 10:08:36 crc kubenswrapper[4781]: I1202 10:08:36.090315 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xjbm\" (UniqueName: \"kubernetes.io/projected/dd229c63-1675-45d9-8190-6b3adbad8af5-kube-api-access-5xjbm\") on node \"crc\" DevicePath \"\"" Dec 02 10:08:36 crc kubenswrapper[4781]: I1202 10:08:36.090361 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd229c63-1675-45d9-8190-6b3adbad8af5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 10:08:36 crc kubenswrapper[4781]: I1202 10:08:36.177511 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z6gr2" event={"ID":"e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8","Type":"ContainerStarted","Data":"1a5cb7345489ed224678931c9bdf79b12433500b3e6a1250c521dbf8cccbfad6"} Dec 02 10:08:36 crc kubenswrapper[4781]: I1202 10:08:36.179814 4781 generic.go:334] "Generic (PLEG): container finished" podID="dd229c63-1675-45d9-8190-6b3adbad8af5" containerID="e6e3c643325e369259253e922f3aed36d055a7c165f09430f2768018a90469f8" exitCode=0 Dec 02 10:08:36 crc kubenswrapper[4781]: I1202 10:08:36.179876 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q4z7d" event={"ID":"dd229c63-1675-45d9-8190-6b3adbad8af5","Type":"ContainerDied","Data":"e6e3c643325e369259253e922f3aed36d055a7c165f09430f2768018a90469f8"} Dec 02 10:08:36 crc kubenswrapper[4781]: I1202 10:08:36.179966 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q4z7d" event={"ID":"dd229c63-1675-45d9-8190-6b3adbad8af5","Type":"ContainerDied","Data":"c6b3810e62035920b623c3ce231abb476b6e4d324e9b0fbe318e510b9c755145"} Dec 02 10:08:36 crc kubenswrapper[4781]: I1202 10:08:36.179992 4781 scope.go:117] "RemoveContainer" containerID="e6e3c643325e369259253e922f3aed36d055a7c165f09430f2768018a90469f8" Dec 02 10:08:36 crc kubenswrapper[4781]: I1202 10:08:36.180114 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q4z7d" Dec 02 10:08:36 crc kubenswrapper[4781]: I1202 10:08:36.202554 4781 scope.go:117] "RemoveContainer" containerID="cf89a335ee7144636a3922c4e4251d75b4dbe0c38979b20ef43456efd0618b39" Dec 02 10:08:36 crc kubenswrapper[4781]: I1202 10:08:36.218384 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-z6gr2" podStartSLOduration=3.517017531 podStartE2EDuration="12.218363973s" podCreationTimestamp="2025-12-02 10:08:24 +0000 UTC" firstStartedPulling="2025-12-02 10:08:26.851245545 +0000 UTC m=+2869.675119425" lastFinishedPulling="2025-12-02 10:08:35.552591978 +0000 UTC m=+2878.376465867" observedRunningTime="2025-12-02 10:08:36.197759549 +0000 UTC m=+2879.021633438" watchObservedRunningTime="2025-12-02 10:08:36.218363973 +0000 UTC m=+2879.042237852" Dec 02 10:08:36 crc kubenswrapper[4781]: I1202 10:08:36.228469 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q4z7d"] Dec 02 10:08:36 crc kubenswrapper[4781]: I1202 10:08:36.238594 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-q4z7d"] Dec 02 10:08:36 crc kubenswrapper[4781]: I1202 10:08:36.239766 4781 scope.go:117] "RemoveContainer" containerID="e7df5ca5169848e025f92d73cb0043f25ed1ec1b932ddae63b67be318d8ff0e4" Dec 02 10:08:36 crc kubenswrapper[4781]: I1202 10:08:36.257647 4781 scope.go:117] "RemoveContainer" containerID="e6e3c643325e369259253e922f3aed36d055a7c165f09430f2768018a90469f8" Dec 02 10:08:36 crc kubenswrapper[4781]: E1202 10:08:36.258511 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6e3c643325e369259253e922f3aed36d055a7c165f09430f2768018a90469f8\": container with ID starting with e6e3c643325e369259253e922f3aed36d055a7c165f09430f2768018a90469f8 not found: ID does not exist" containerID="e6e3c643325e369259253e922f3aed36d055a7c165f09430f2768018a90469f8" Dec 02 10:08:36 crc kubenswrapper[4781]: I1202 10:08:36.258817 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6e3c643325e369259253e922f3aed36d055a7c165f09430f2768018a90469f8"} err="failed to get container status \"e6e3c643325e369259253e922f3aed36d055a7c165f09430f2768018a90469f8\": rpc error: code = NotFound desc = could not find container \"e6e3c643325e369259253e922f3aed36d055a7c165f09430f2768018a90469f8\": container with ID starting with e6e3c643325e369259253e922f3aed36d055a7c165f09430f2768018a90469f8 not found: ID does not exist" Dec 02 10:08:36 crc kubenswrapper[4781]: I1202 10:08:36.258846 4781 scope.go:117] "RemoveContainer" containerID="cf89a335ee7144636a3922c4e4251d75b4dbe0c38979b20ef43456efd0618b39" Dec 02 10:08:36 crc kubenswrapper[4781]: E1202 10:08:36.259237 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf89a335ee7144636a3922c4e4251d75b4dbe0c38979b20ef43456efd0618b39\": container with ID starting with cf89a335ee7144636a3922c4e4251d75b4dbe0c38979b20ef43456efd0618b39 not found: ID does not exist" containerID="cf89a335ee7144636a3922c4e4251d75b4dbe0c38979b20ef43456efd0618b39" Dec 02 10:08:36 crc kubenswrapper[4781]: I1202 10:08:36.259272 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf89a335ee7144636a3922c4e4251d75b4dbe0c38979b20ef43456efd0618b39"} err="failed to get 
container status \"cf89a335ee7144636a3922c4e4251d75b4dbe0c38979b20ef43456efd0618b39\": rpc error: code = NotFound desc = could not find container \"cf89a335ee7144636a3922c4e4251d75b4dbe0c38979b20ef43456efd0618b39\": container with ID starting with cf89a335ee7144636a3922c4e4251d75b4dbe0c38979b20ef43456efd0618b39 not found: ID does not exist" Dec 02 10:08:36 crc kubenswrapper[4781]: I1202 10:08:36.259299 4781 scope.go:117] "RemoveContainer" containerID="e7df5ca5169848e025f92d73cb0043f25ed1ec1b932ddae63b67be318d8ff0e4" Dec 02 10:08:36 crc kubenswrapper[4781]: E1202 10:08:36.259720 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7df5ca5169848e025f92d73cb0043f25ed1ec1b932ddae63b67be318d8ff0e4\": container with ID starting with e7df5ca5169848e025f92d73cb0043f25ed1ec1b932ddae63b67be318d8ff0e4 not found: ID does not exist" containerID="e7df5ca5169848e025f92d73cb0043f25ed1ec1b932ddae63b67be318d8ff0e4" Dec 02 10:08:36 crc kubenswrapper[4781]: I1202 10:08:36.259769 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7df5ca5169848e025f92d73cb0043f25ed1ec1b932ddae63b67be318d8ff0e4"} err="failed to get container status \"e7df5ca5169848e025f92d73cb0043f25ed1ec1b932ddae63b67be318d8ff0e4\": rpc error: code = NotFound desc = could not find container \"e7df5ca5169848e025f92d73cb0043f25ed1ec1b932ddae63b67be318d8ff0e4\": container with ID starting with e7df5ca5169848e025f92d73cb0043f25ed1ec1b932ddae63b67be318d8ff0e4 not found: ID does not exist" Dec 02 10:08:37 crc kubenswrapper[4781]: I1202 10:08:37.515943 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd229c63-1675-45d9-8190-6b3adbad8af5" path="/var/lib/kubelet/pods/dd229c63-1675-45d9-8190-6b3adbad8af5/volumes" Dec 02 10:08:37 crc kubenswrapper[4781]: I1202 10:08:37.622056 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m6frl"] Dec 02 10:08:38 crc kubenswrapper[4781]: I1202 10:08:38.204827 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-m6frl" podUID="454ad7ba-1c5e-43c4-9a29-e02e29aa7544" containerName="registry-server" containerID="cri-o://2cfd2759c5ac7b1177bb42535b9f8d6687cbc3daa425141c954800f25f71b2ec" gracePeriod=2 Dec 02 10:08:38 crc kubenswrapper[4781]: I1202 10:08:38.671464 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-m6frl" Dec 02 10:08:38 crc kubenswrapper[4781]: I1202 10:08:38.772086 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvt79\" (UniqueName: \"kubernetes.io/projected/454ad7ba-1c5e-43c4-9a29-e02e29aa7544-kube-api-access-tvt79\") pod \"454ad7ba-1c5e-43c4-9a29-e02e29aa7544\" (UID: \"454ad7ba-1c5e-43c4-9a29-e02e29aa7544\") " Dec 02 10:08:38 crc kubenswrapper[4781]: I1202 10:08:38.772186 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/454ad7ba-1c5e-43c4-9a29-e02e29aa7544-catalog-content\") pod \"454ad7ba-1c5e-43c4-9a29-e02e29aa7544\" (UID: \"454ad7ba-1c5e-43c4-9a29-e02e29aa7544\") " Dec 02 10:08:38 crc kubenswrapper[4781]: I1202 10:08:38.772403 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/454ad7ba-1c5e-43c4-9a29-e02e29aa7544-utilities\") pod \"454ad7ba-1c5e-43c4-9a29-e02e29aa7544\" (UID: \"454ad7ba-1c5e-43c4-9a29-e02e29aa7544\") " Dec 02 10:08:38 crc kubenswrapper[4781]: I1202 10:08:38.772795 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/454ad7ba-1c5e-43c4-9a29-e02e29aa7544-utilities" (OuterVolumeSpecName: "utilities") pod "454ad7ba-1c5e-43c4-9a29-e02e29aa7544" (UID: "454ad7ba-1c5e-43c4-9a29-e02e29aa7544"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 10:08:38 crc kubenswrapper[4781]: I1202 10:08:38.773068 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/454ad7ba-1c5e-43c4-9a29-e02e29aa7544-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 10:08:38 crc kubenswrapper[4781]: I1202 10:08:38.778550 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/454ad7ba-1c5e-43c4-9a29-e02e29aa7544-kube-api-access-tvt79" (OuterVolumeSpecName: "kube-api-access-tvt79") pod "454ad7ba-1c5e-43c4-9a29-e02e29aa7544" (UID: "454ad7ba-1c5e-43c4-9a29-e02e29aa7544"). InnerVolumeSpecName "kube-api-access-tvt79". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 10:08:38 crc kubenswrapper[4781]: I1202 10:08:38.819877 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/454ad7ba-1c5e-43c4-9a29-e02e29aa7544-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "454ad7ba-1c5e-43c4-9a29-e02e29aa7544" (UID: "454ad7ba-1c5e-43c4-9a29-e02e29aa7544"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 10:08:38 crc kubenswrapper[4781]: I1202 10:08:38.875327 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvt79\" (UniqueName: \"kubernetes.io/projected/454ad7ba-1c5e-43c4-9a29-e02e29aa7544-kube-api-access-tvt79\") on node \"crc\" DevicePath \"\"" Dec 02 10:08:38 crc kubenswrapper[4781]: I1202 10:08:38.875369 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/454ad7ba-1c5e-43c4-9a29-e02e29aa7544-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 10:08:39 crc kubenswrapper[4781]: I1202 10:08:39.216578 4781 generic.go:334] "Generic (PLEG): container finished" podID="454ad7ba-1c5e-43c4-9a29-e02e29aa7544" containerID="2cfd2759c5ac7b1177bb42535b9f8d6687cbc3daa425141c954800f25f71b2ec" exitCode=0 Dec 02 10:08:39 crc kubenswrapper[4781]: I1202 10:08:39.216619 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m6frl" event={"ID":"454ad7ba-1c5e-43c4-9a29-e02e29aa7544","Type":"ContainerDied","Data":"2cfd2759c5ac7b1177bb42535b9f8d6687cbc3daa425141c954800f25f71b2ec"} Dec 02 10:08:39 crc kubenswrapper[4781]: I1202 10:08:39.216841 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m6frl" event={"ID":"454ad7ba-1c5e-43c4-9a29-e02e29aa7544","Type":"ContainerDied","Data":"0123c20da39221c791b4c745fc5ffe53a61d375dfa4f7509d81127006ab175d1"} Dec 02 10:08:39 crc kubenswrapper[4781]: I1202 10:08:39.216860 4781 scope.go:117] "RemoveContainer" containerID="2cfd2759c5ac7b1177bb42535b9f8d6687cbc3daa425141c954800f25f71b2ec" Dec 02 10:08:39 crc kubenswrapper[4781]: I1202 10:08:39.217261 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-m6frl" Dec 02 10:08:39 crc kubenswrapper[4781]: I1202 10:08:39.237125 4781 scope.go:117] "RemoveContainer" containerID="3d1e06865ec94fcbc2a74b1d222e036d48104c4aa2673f65886b6caf6038c2a8" Dec 02 10:08:39 crc kubenswrapper[4781]: I1202 10:08:39.252148 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m6frl"] Dec 02 10:08:39 crc kubenswrapper[4781]: I1202 10:08:39.260442 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-m6frl"] Dec 02 10:08:39 crc kubenswrapper[4781]: I1202 10:08:39.277651 4781 scope.go:117] "RemoveContainer" containerID="f44ae5ad0bd0c521d4b436106c661ad21f0e6b9e70cf548fff54ca0b4b0d003c" Dec 02 10:08:39 crc kubenswrapper[4781]: I1202 10:08:39.317974 4781 scope.go:117] "RemoveContainer" containerID="2cfd2759c5ac7b1177bb42535b9f8d6687cbc3daa425141c954800f25f71b2ec" Dec 02 10:08:39 crc kubenswrapper[4781]: E1202 10:08:39.319163 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cfd2759c5ac7b1177bb42535b9f8d6687cbc3daa425141c954800f25f71b2ec\": container with ID starting with 2cfd2759c5ac7b1177bb42535b9f8d6687cbc3daa425141c954800f25f71b2ec not found: ID does not exist" containerID="2cfd2759c5ac7b1177bb42535b9f8d6687cbc3daa425141c954800f25f71b2ec" Dec 02 10:08:39 crc kubenswrapper[4781]: I1202 10:08:39.319214 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cfd2759c5ac7b1177bb42535b9f8d6687cbc3daa425141c954800f25f71b2ec"} err="failed to get container status \"2cfd2759c5ac7b1177bb42535b9f8d6687cbc3daa425141c954800f25f71b2ec\": rpc error: code = NotFound desc = could not find container \"2cfd2759c5ac7b1177bb42535b9f8d6687cbc3daa425141c954800f25f71b2ec\": container with ID starting with 2cfd2759c5ac7b1177bb42535b9f8d6687cbc3daa425141c954800f25f71b2ec not found: ID does not exist" Dec 02 10:08:39 crc kubenswrapper[4781]: I1202 10:08:39.319246 4781 scope.go:117] "RemoveContainer" containerID="3d1e06865ec94fcbc2a74b1d222e036d48104c4aa2673f65886b6caf6038c2a8" Dec 02 10:08:39 crc kubenswrapper[4781]: E1202 10:08:39.319596 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d1e06865ec94fcbc2a74b1d222e036d48104c4aa2673f65886b6caf6038c2a8\": container with ID starting with 3d1e06865ec94fcbc2a74b1d222e036d48104c4aa2673f65886b6caf6038c2a8 not found: ID does not exist" containerID="3d1e06865ec94fcbc2a74b1d222e036d48104c4aa2673f65886b6caf6038c2a8" Dec 02 10:08:39 crc kubenswrapper[4781]: I1202 10:08:39.319626 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d1e06865ec94fcbc2a74b1d222e036d48104c4aa2673f65886b6caf6038c2a8"} err="failed to get container status \"3d1e06865ec94fcbc2a74b1d222e036d48104c4aa2673f65886b6caf6038c2a8\": rpc error: code = NotFound desc = could not find container \"3d1e06865ec94fcbc2a74b1d222e036d48104c4aa2673f65886b6caf6038c2a8\": container with ID starting with 3d1e06865ec94fcbc2a74b1d222e036d48104c4aa2673f65886b6caf6038c2a8 not found: ID does not exist" Dec 02 10:08:39 crc kubenswrapper[4781]: I1202 10:08:39.319648 4781 scope.go:117] "RemoveContainer" containerID="f44ae5ad0bd0c521d4b436106c661ad21f0e6b9e70cf548fff54ca0b4b0d003c" Dec 02 10:08:39 crc kubenswrapper[4781]: E1202 10:08:39.319882 4781 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"f44ae5ad0bd0c521d4b436106c661ad21f0e6b9e70cf548fff54ca0b4b0d003c\": container with ID starting with f44ae5ad0bd0c521d4b436106c661ad21f0e6b9e70cf548fff54ca0b4b0d003c not found: ID does not exist" containerID="f44ae5ad0bd0c521d4b436106c661ad21f0e6b9e70cf548fff54ca0b4b0d003c" Dec 02 10:08:39 crc kubenswrapper[4781]: I1202 10:08:39.319938 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f44ae5ad0bd0c521d4b436106c661ad21f0e6b9e70cf548fff54ca0b4b0d003c"} err="failed to get container status \"f44ae5ad0bd0c521d4b436106c661ad21f0e6b9e70cf548fff54ca0b4b0d003c\": rpc error: code = NotFound desc = could not find container \"f44ae5ad0bd0c521d4b436106c661ad21f0e6b9e70cf548fff54ca0b4b0d003c\": container with ID starting with f44ae5ad0bd0c521d4b436106c661ad21f0e6b9e70cf548fff54ca0b4b0d003c not found: ID does not exist" Dec 02 10:08:39 crc kubenswrapper[4781]: I1202 10:08:39.510020 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="454ad7ba-1c5e-43c4-9a29-e02e29aa7544" path="/var/lib/kubelet/pods/454ad7ba-1c5e-43c4-9a29-e02e29aa7544/volumes" Dec 02 10:08:42 crc kubenswrapper[4781]: I1202 10:08:42.841141 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bjb2z" Dec 02 10:08:42 crc kubenswrapper[4781]: I1202 10:08:42.895797 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bjb2z" Dec 02 10:08:43 crc kubenswrapper[4781]: I1202 10:08:43.227160 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bjb2z"] Dec 02 10:08:44 crc kubenswrapper[4781]: I1202 10:08:44.263451 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bjb2z" podUID="f815fa8f-85f8-4659-8ecf-70a90acd6d0e" containerName="registry-server" containerID="cri-o://bd4d8172f5a1e393a4d53af90d944e618925ec495c54c0c9368ad802c15ba304" gracePeriod=2 Dec 02 10:08:44 crc kubenswrapper[4781]: I1202 10:08:44.765565 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bjb2z" Dec 02 10:08:44 crc kubenswrapper[4781]: I1202 10:08:44.940390 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lvvk\" (UniqueName: \"kubernetes.io/projected/f815fa8f-85f8-4659-8ecf-70a90acd6d0e-kube-api-access-4lvvk\") pod \"f815fa8f-85f8-4659-8ecf-70a90acd6d0e\" (UID: \"f815fa8f-85f8-4659-8ecf-70a90acd6d0e\") " Dec 02 10:08:44 crc kubenswrapper[4781]: I1202 10:08:44.940533 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f815fa8f-85f8-4659-8ecf-70a90acd6d0e-catalog-content\") pod \"f815fa8f-85f8-4659-8ecf-70a90acd6d0e\" (UID: \"f815fa8f-85f8-4659-8ecf-70a90acd6d0e\") " Dec 02 10:08:44 crc kubenswrapper[4781]: I1202 10:08:44.940600 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f815fa8f-85f8-4659-8ecf-70a90acd6d0e-utilities\") pod \"f815fa8f-85f8-4659-8ecf-70a90acd6d0e\" (UID: \"f815fa8f-85f8-4659-8ecf-70a90acd6d0e\") " Dec 02 10:08:44 crc kubenswrapper[4781]: I1202 10:08:44.941350 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f815fa8f-85f8-4659-8ecf-70a90acd6d0e-utilities" (OuterVolumeSpecName: "utilities") pod "f815fa8f-85f8-4659-8ecf-70a90acd6d0e" (UID: "f815fa8f-85f8-4659-8ecf-70a90acd6d0e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 10:08:44 crc kubenswrapper[4781]: I1202 10:08:44.946168 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f815fa8f-85f8-4659-8ecf-70a90acd6d0e-kube-api-access-4lvvk" (OuterVolumeSpecName: "kube-api-access-4lvvk") pod "f815fa8f-85f8-4659-8ecf-70a90acd6d0e" (UID: "f815fa8f-85f8-4659-8ecf-70a90acd6d0e"). InnerVolumeSpecName "kube-api-access-4lvvk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 10:08:45 crc kubenswrapper[4781]: I1202 10:08:45.038129 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f815fa8f-85f8-4659-8ecf-70a90acd6d0e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f815fa8f-85f8-4659-8ecf-70a90acd6d0e" (UID: "f815fa8f-85f8-4659-8ecf-70a90acd6d0e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 10:08:45 crc kubenswrapper[4781]: I1202 10:08:45.042426 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lvvk\" (UniqueName: \"kubernetes.io/projected/f815fa8f-85f8-4659-8ecf-70a90acd6d0e-kube-api-access-4lvvk\") on node \"crc\" DevicePath \"\"" Dec 02 10:08:45 crc kubenswrapper[4781]: I1202 10:08:45.042597 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f815fa8f-85f8-4659-8ecf-70a90acd6d0e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 10:08:45 crc kubenswrapper[4781]: I1202 10:08:45.042683 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f815fa8f-85f8-4659-8ecf-70a90acd6d0e-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 10:08:45 crc kubenswrapper[4781]: I1202 10:08:45.187943 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-z6gr2" Dec 02 10:08:45 crc kubenswrapper[4781]: I1202 10:08:45.188305 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-z6gr2" Dec 02 10:08:45 crc kubenswrapper[4781]: I1202 10:08:45.241545 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-z6gr2" Dec 02 10:08:45 crc kubenswrapper[4781]: I1202 10:08:45.278843 4781 generic.go:334] "Generic (PLEG): container finished" podID="f815fa8f-85f8-4659-8ecf-70a90acd6d0e" containerID="bd4d8172f5a1e393a4d53af90d944e618925ec495c54c0c9368ad802c15ba304" exitCode=0 Dec 02 10:08:45 crc kubenswrapper[4781]: I1202 10:08:45.279997 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bjb2z" Dec 02 10:08:45 crc kubenswrapper[4781]: I1202 10:08:45.285213 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bjb2z" event={"ID":"f815fa8f-85f8-4659-8ecf-70a90acd6d0e","Type":"ContainerDied","Data":"bd4d8172f5a1e393a4d53af90d944e618925ec495c54c0c9368ad802c15ba304"} Dec 02 10:08:45 crc kubenswrapper[4781]: I1202 10:08:45.285311 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bjb2z" event={"ID":"f815fa8f-85f8-4659-8ecf-70a90acd6d0e","Type":"ContainerDied","Data":"014295eea2aef912da21ae0a8f238be2b89dbc44b4d7b0efd27dbb8f1785c5bf"} Dec 02 10:08:45 crc kubenswrapper[4781]: I1202 10:08:45.285349 4781 scope.go:117] "RemoveContainer" containerID="bd4d8172f5a1e393a4d53af90d944e618925ec495c54c0c9368ad802c15ba304" Dec 02 10:08:45 crc kubenswrapper[4781]: I1202 10:08:45.315981 4781 scope.go:117] "RemoveContainer" containerID="9034921533cb74b4038e562c22cbe8c3d5ae551b5c525291550914df0e0b2f23" Dec 02 10:08:45 crc kubenswrapper[4781]: I1202 10:08:45.330048 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bjb2z"] Dec 02 10:08:45 crc kubenswrapper[4781]: I1202 10:08:45.337211 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bjb2z"] Dec 02 10:08:45 crc kubenswrapper[4781]: I1202 10:08:45.376527 4781 scope.go:117] "RemoveContainer" containerID="a09a489914b002b5d556ed55478d0c7dfd3993be96f4eaaab0072e8b59250c15" Dec 02 10:08:45 crc kubenswrapper[4781]: I1202 10:08:45.383322 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-z6gr2" Dec 02 10:08:45 crc kubenswrapper[4781]: I1202 10:08:45.404772 4781 scope.go:117] "RemoveContainer" containerID="bd4d8172f5a1e393a4d53af90d944e618925ec495c54c0c9368ad802c15ba304" Dec 02 10:08:45 crc kubenswrapper[4781]: E1202 10:08:45.405249 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd4d8172f5a1e393a4d53af90d944e618925ec495c54c0c9368ad802c15ba304\": container with ID starting with bd4d8172f5a1e393a4d53af90d944e618925ec495c54c0c9368ad802c15ba304 not found: ID does not exist" containerID="bd4d8172f5a1e393a4d53af90d944e618925ec495c54c0c9368ad802c15ba304" Dec 02 10:08:45 crc kubenswrapper[4781]: I1202 10:08:45.405282 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd4d8172f5a1e393a4d53af90d944e618925ec495c54c0c9368ad802c15ba304"} err="failed to get container status \"bd4d8172f5a1e393a4d53af90d944e618925ec495c54c0c9368ad802c15ba304\": rpc error: code = NotFound desc = could not find container \"bd4d8172f5a1e393a4d53af90d944e618925ec495c54c0c9368ad802c15ba304\": container with ID starting with bd4d8172f5a1e393a4d53af90d944e618925ec495c54c0c9368ad802c15ba304 not found: ID does not exist" Dec 02 10:08:45 crc kubenswrapper[4781]: I1202 10:08:45.405308 4781 scope.go:117] "RemoveContainer" containerID="9034921533cb74b4038e562c22cbe8c3d5ae551b5c525291550914df0e0b2f23" Dec 02 10:08:45 crc kubenswrapper[4781]: E1202 10:08:45.405642 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9034921533cb74b4038e562c22cbe8c3d5ae551b5c525291550914df0e0b2f23\": container with ID starting with 9034921533cb74b4038e562c22cbe8c3d5ae551b5c525291550914df0e0b2f23 
not found: ID does not exist" containerID="9034921533cb74b4038e562c22cbe8c3d5ae551b5c525291550914df0e0b2f23" Dec 02 10:08:45 crc kubenswrapper[4781]: I1202 10:08:45.405663 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9034921533cb74b4038e562c22cbe8c3d5ae551b5c525291550914df0e0b2f23"} err="failed to get container status \"9034921533cb74b4038e562c22cbe8c3d5ae551b5c525291550914df0e0b2f23\": rpc error: code = NotFound desc = could not find container \"9034921533cb74b4038e562c22cbe8c3d5ae551b5c525291550914df0e0b2f23\": container with ID starting with 9034921533cb74b4038e562c22cbe8c3d5ae551b5c525291550914df0e0b2f23 not found: ID does not exist" Dec 02 10:08:45 crc kubenswrapper[4781]: I1202 10:08:45.405680 4781 scope.go:117] "RemoveContainer" containerID="a09a489914b002b5d556ed55478d0c7dfd3993be96f4eaaab0072e8b59250c15" Dec 02 10:08:45 crc kubenswrapper[4781]: E1202 10:08:45.406228 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a09a489914b002b5d556ed55478d0c7dfd3993be96f4eaaab0072e8b59250c15\": container with ID starting with a09a489914b002b5d556ed55478d0c7dfd3993be96f4eaaab0072e8b59250c15 not found: ID does not exist" containerID="a09a489914b002b5d556ed55478d0c7dfd3993be96f4eaaab0072e8b59250c15" Dec 02 10:08:45 crc kubenswrapper[4781]: I1202 10:08:45.406254 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a09a489914b002b5d556ed55478d0c7dfd3993be96f4eaaab0072e8b59250c15"} err="failed to get container status \"a09a489914b002b5d556ed55478d0c7dfd3993be96f4eaaab0072e8b59250c15\": rpc error: code = NotFound desc = could not find container \"a09a489914b002b5d556ed55478d0c7dfd3993be96f4eaaab0072e8b59250c15\": container with ID starting with a09a489914b002b5d556ed55478d0c7dfd3993be96f4eaaab0072e8b59250c15 not found: ID does not exist" Dec 02 10:08:45 crc kubenswrapper[4781]: I1202 10:08:45.512255 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f815fa8f-85f8-4659-8ecf-70a90acd6d0e" path="/var/lib/kubelet/pods/f815fa8f-85f8-4659-8ecf-70a90acd6d0e/volumes" Dec 02 10:08:47 crc kubenswrapper[4781]: I1202 10:08:47.624532 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-z6gr2"] Dec 02 10:08:48 crc kubenswrapper[4781]: I1202 10:08:48.303211 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-z6gr2" podUID="e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8" containerName="registry-server" containerID="cri-o://1a5cb7345489ed224678931c9bdf79b12433500b3e6a1250c521dbf8cccbfad6" gracePeriod=2 Dec 02 10:08:48 crc kubenswrapper[4781]: I1202 10:08:48.755608 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-z6gr2" Dec 02 10:08:48 crc kubenswrapper[4781]: I1202 10:08:48.817693 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8-catalog-content\") pod \"e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8\" (UID: \"e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8\") " Dec 02 10:08:48 crc kubenswrapper[4781]: I1202 10:08:48.817761 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8-utilities\") pod \"e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8\" (UID: \"e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8\") " Dec 02 10:08:48 crc kubenswrapper[4781]: I1202 10:08:48.817862 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86w6x\" (UniqueName: \"kubernetes.io/projected/e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8-kube-api-access-86w6x\") pod \"e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8\" (UID: \"e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8\") " Dec 02 10:08:48 crc kubenswrapper[4781]: I1202 10:08:48.818543 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8-utilities" (OuterVolumeSpecName: "utilities") pod "e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8" (UID: "e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 10:08:48 crc kubenswrapper[4781]: I1202 10:08:48.818689 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 10:08:48 crc kubenswrapper[4781]: I1202 10:08:48.827143 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8-kube-api-access-86w6x" (OuterVolumeSpecName: "kube-api-access-86w6x") pod "e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8" (UID: "e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8"). InnerVolumeSpecName "kube-api-access-86w6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 10:08:48 crc kubenswrapper[4781]: I1202 10:08:48.874756 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8" (UID: "e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 10:08:48 crc kubenswrapper[4781]: I1202 10:08:48.921104 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 10:08:48 crc kubenswrapper[4781]: I1202 10:08:48.921155 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86w6x\" (UniqueName: \"kubernetes.io/projected/e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8-kube-api-access-86w6x\") on node \"crc\" DevicePath \"\"" Dec 02 10:08:49 crc kubenswrapper[4781]: I1202 10:08:49.314698 4781 generic.go:334] "Generic (PLEG): container finished" podID="e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8" containerID="1a5cb7345489ed224678931c9bdf79b12433500b3e6a1250c521dbf8cccbfad6" exitCode=0 Dec 02 10:08:49 crc kubenswrapper[4781]: I1202 10:08:49.314741 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z6gr2" event={"ID":"e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8","Type":"ContainerDied","Data":"1a5cb7345489ed224678931c9bdf79b12433500b3e6a1250c521dbf8cccbfad6"} Dec 02 10:08:49 crc kubenswrapper[4781]: I1202 10:08:49.314770 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z6gr2" event={"ID":"e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8","Type":"ContainerDied","Data":"688968b63b2b027d6ab0941e27341ecb309b0a76cc230cc789bc192ed97d499c"} Dec 02 10:08:49 crc kubenswrapper[4781]: I1202 10:08:49.314795 4781 scope.go:117] "RemoveContainer" containerID="1a5cb7345489ed224678931c9bdf79b12433500b3e6a1250c521dbf8cccbfad6" Dec 02 10:08:49 crc kubenswrapper[4781]: I1202 10:08:49.314881 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-z6gr2" Dec 02 10:08:49 crc kubenswrapper[4781]: I1202 10:08:49.334100 4781 scope.go:117] "RemoveContainer" containerID="e3dfd3e5229d29d8eb5767450f520550806d8883135bb29d6022c3ade04f591c" Dec 02 10:08:49 crc kubenswrapper[4781]: I1202 10:08:49.352129 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-z6gr2"] Dec 02 10:08:49 crc kubenswrapper[4781]: I1202 10:08:49.363251 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-z6gr2"] Dec 02 10:08:49 crc kubenswrapper[4781]: I1202 10:08:49.374982 4781 scope.go:117] "RemoveContainer" containerID="c535f40b3a1193244d237c7b97d7c6e2ee00f81d134562bf74e6c5e24f1f130c" Dec 02 10:08:49 crc kubenswrapper[4781]: I1202 10:08:49.401037 4781 scope.go:117] "RemoveContainer" containerID="1a5cb7345489ed224678931c9bdf79b12433500b3e6a1250c521dbf8cccbfad6" Dec 02 10:08:49 crc kubenswrapper[4781]: E1202 10:08:49.401405 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a5cb7345489ed224678931c9bdf79b12433500b3e6a1250c521dbf8cccbfad6\": container with ID starting with 1a5cb7345489ed224678931c9bdf79b12433500b3e6a1250c521dbf8cccbfad6 not found: ID does not exist" containerID="1a5cb7345489ed224678931c9bdf79b12433500b3e6a1250c521dbf8cccbfad6" Dec 02 10:08:49 crc kubenswrapper[4781]: I1202 10:08:49.401442 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a5cb7345489ed224678931c9bdf79b12433500b3e6a1250c521dbf8cccbfad6"} err="failed to get container status \"1a5cb7345489ed224678931c9bdf79b12433500b3e6a1250c521dbf8cccbfad6\": rpc error: code = NotFound desc = could not find container \"1a5cb7345489ed224678931c9bdf79b12433500b3e6a1250c521dbf8cccbfad6\": container with ID starting with 1a5cb7345489ed224678931c9bdf79b12433500b3e6a1250c521dbf8cccbfad6 not found: ID does not exist" Dec 02 10:08:49 crc kubenswrapper[4781]: I1202 10:08:49.401466 4781 scope.go:117] "RemoveContainer" containerID="e3dfd3e5229d29d8eb5767450f520550806d8883135bb29d6022c3ade04f591c" Dec 02 10:08:49 crc kubenswrapper[4781]: E1202 10:08:49.401820 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3dfd3e5229d29d8eb5767450f520550806d8883135bb29d6022c3ade04f591c\": container with ID starting with e3dfd3e5229d29d8eb5767450f520550806d8883135bb29d6022c3ade04f591c not found: ID does not exist" containerID="e3dfd3e5229d29d8eb5767450f520550806d8883135bb29d6022c3ade04f591c" Dec 02 10:08:49 crc kubenswrapper[4781]: I1202 10:08:49.401854 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3dfd3e5229d29d8eb5767450f520550806d8883135bb29d6022c3ade04f591c"} err="failed to get container status \"e3dfd3e5229d29d8eb5767450f520550806d8883135bb29d6022c3ade04f591c\": rpc error: code = NotFound desc = could not find container \"e3dfd3e5229d29d8eb5767450f520550806d8883135bb29d6022c3ade04f591c\": container with ID starting with e3dfd3e5229d29d8eb5767450f520550806d8883135bb29d6022c3ade04f591c not found: ID does not exist" Dec 02 10:08:49 crc kubenswrapper[4781]: I1202 10:08:49.401883 4781 scope.go:117] "RemoveContainer" containerID="c535f40b3a1193244d237c7b97d7c6e2ee00f81d134562bf74e6c5e24f1f130c" Dec 02 10:08:49 crc kubenswrapper[4781]: E1202 10:08:49.402199 4781 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"c535f40b3a1193244d237c7b97d7c6e2ee00f81d134562bf74e6c5e24f1f130c\": container with ID starting with c535f40b3a1193244d237c7b97d7c6e2ee00f81d134562bf74e6c5e24f1f130c not found: ID does not exist" containerID="c535f40b3a1193244d237c7b97d7c6e2ee00f81d134562bf74e6c5e24f1f130c" Dec 02 10:08:49 crc kubenswrapper[4781]: I1202 10:08:49.402235 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c535f40b3a1193244d237c7b97d7c6e2ee00f81d134562bf74e6c5e24f1f130c"} err="failed to get container status \"c535f40b3a1193244d237c7b97d7c6e2ee00f81d134562bf74e6c5e24f1f130c\": rpc error: code = NotFound desc = could not find container \"c535f40b3a1193244d237c7b97d7c6e2ee00f81d134562bf74e6c5e24f1f130c\": container with ID starting with c535f40b3a1193244d237c7b97d7c6e2ee00f81d134562bf74e6c5e24f1f130c not found: ID does not exist" Dec 02 10:08:49 crc kubenswrapper[4781]: I1202 10:08:49.509604 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8" path="/var/lib/kubelet/pods/e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8/volumes" Dec 02 10:08:53 crc kubenswrapper[4781]: I1202 10:08:53.364387 4781 generic.go:334] "Generic (PLEG): container finished" podID="dacd1d88-ff6e-4719-a24f-4feb0559f463" containerID="dc1aaf482f48e6e1387016fc6196a1ce91bd903952a8056bfc694730dc3043ac" exitCode=0 Dec 02 10:08:53 crc kubenswrapper[4781]: I1202 10:08:53.364934 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g88rg" event={"ID":"dacd1d88-ff6e-4719-a24f-4feb0559f463","Type":"ContainerDied","Data":"dc1aaf482f48e6e1387016fc6196a1ce91bd903952a8056bfc694730dc3043ac"} Dec 02 10:08:54 crc kubenswrapper[4781]: I1202 10:08:54.821496 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g88rg" Dec 02 10:08:54 crc kubenswrapper[4781]: I1202 10:08:54.828205 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/dacd1d88-ff6e-4719-a24f-4feb0559f463-nova-cell1-compute-config-0\") pod \"dacd1d88-ff6e-4719-a24f-4feb0559f463\" (UID: \"dacd1d88-ff6e-4719-a24f-4feb0559f463\") " Dec 02 10:08:54 crc kubenswrapper[4781]: I1202 10:08:54.828290 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dacd1d88-ff6e-4719-a24f-4feb0559f463-ssh-key\") pod \"dacd1d88-ff6e-4719-a24f-4feb0559f463\" (UID: \"dacd1d88-ff6e-4719-a24f-4feb0559f463\") " Dec 02 10:08:54 crc kubenswrapper[4781]: I1202 10:08:54.828320 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/dacd1d88-ff6e-4719-a24f-4feb0559f463-nova-migration-ssh-key-0\") pod \"dacd1d88-ff6e-4719-a24f-4feb0559f463\" (UID: \"dacd1d88-ff6e-4719-a24f-4feb0559f463\") " Dec 02 10:08:54 crc kubenswrapper[4781]: I1202 10:08:54.828395 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/dacd1d88-ff6e-4719-a24f-4feb0559f463-nova-cell1-compute-config-1\") pod \"dacd1d88-ff6e-4719-a24f-4feb0559f463\" (UID: \"dacd1d88-ff6e-4719-a24f-4feb0559f463\") " Dec 02 10:08:54 crc kubenswrapper[4781]: I1202 10:08:54.828428 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/dacd1d88-ff6e-4719-a24f-4feb0559f463-nova-migration-ssh-key-1\") pod \"dacd1d88-ff6e-4719-a24f-4feb0559f463\" (UID: \"dacd1d88-ff6e-4719-a24f-4feb0559f463\") " Dec 02 10:08:54 crc kubenswrapper[4781]: I1202 10:08:54.828458 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dacd1d88-ff6e-4719-a24f-4feb0559f463-nova-combined-ca-bundle\") pod \"dacd1d88-ff6e-4719-a24f-4feb0559f463\" (UID: \"dacd1d88-ff6e-4719-a24f-4feb0559f463\") " Dec 02 10:08:54 crc kubenswrapper[4781]: I1202 10:08:54.828541 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9twh\" (UniqueName: \"kubernetes.io/projected/dacd1d88-ff6e-4719-a24f-4feb0559f463-kube-api-access-m9twh\") pod \"dacd1d88-ff6e-4719-a24f-4feb0559f463\" (UID: \"dacd1d88-ff6e-4719-a24f-4feb0559f463\") " Dec 02 10:08:54 crc kubenswrapper[4781]: I1202 10:08:54.828574 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dacd1d88-ff6e-4719-a24f-4feb0559f463-inventory\") pod \"dacd1d88-ff6e-4719-a24f-4feb0559f463\" (UID: \"dacd1d88-ff6e-4719-a24f-4feb0559f463\") " Dec 02 10:08:54 crc kubenswrapper[4781]: I1202 10:08:54.828605 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/dacd1d88-ff6e-4719-a24f-4feb0559f463-nova-extra-config-0\") pod \"dacd1d88-ff6e-4719-a24f-4feb0559f463\" (UID: \"dacd1d88-ff6e-4719-a24f-4feb0559f463\") " Dec 02 10:08:54 crc kubenswrapper[4781]: I1202 10:08:54.835706 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/dacd1d88-ff6e-4719-a24f-4feb0559f463-kube-api-access-m9twh" (OuterVolumeSpecName: "kube-api-access-m9twh") pod "dacd1d88-ff6e-4719-a24f-4feb0559f463" (UID: "dacd1d88-ff6e-4719-a24f-4feb0559f463"). InnerVolumeSpecName "kube-api-access-m9twh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 10:08:54 crc kubenswrapper[4781]: I1202 10:08:54.836523 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dacd1d88-ff6e-4719-a24f-4feb0559f463-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "dacd1d88-ff6e-4719-a24f-4feb0559f463" (UID: "dacd1d88-ff6e-4719-a24f-4feb0559f463"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 10:08:54 crc kubenswrapper[4781]: I1202 10:08:54.864866 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dacd1d88-ff6e-4719-a24f-4feb0559f463-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "dacd1d88-ff6e-4719-a24f-4feb0559f463" (UID: "dacd1d88-ff6e-4719-a24f-4feb0559f463"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 10:08:54 crc kubenswrapper[4781]: I1202 10:08:54.867143 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dacd1d88-ff6e-4719-a24f-4feb0559f463-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "dacd1d88-ff6e-4719-a24f-4feb0559f463" (UID: "dacd1d88-ff6e-4719-a24f-4feb0559f463"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 10:08:54 crc kubenswrapper[4781]: I1202 10:08:54.877113 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dacd1d88-ff6e-4719-a24f-4feb0559f463-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "dacd1d88-ff6e-4719-a24f-4feb0559f463" (UID: "dacd1d88-ff6e-4719-a24f-4feb0559f463"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 10:08:54 crc kubenswrapper[4781]: I1202 10:08:54.884055 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dacd1d88-ff6e-4719-a24f-4feb0559f463-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "dacd1d88-ff6e-4719-a24f-4feb0559f463" (UID: "dacd1d88-ff6e-4719-a24f-4feb0559f463"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 10:08:54 crc kubenswrapper[4781]: I1202 10:08:54.884487 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dacd1d88-ff6e-4719-a24f-4feb0559f463-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "dacd1d88-ff6e-4719-a24f-4feb0559f463" (UID: "dacd1d88-ff6e-4719-a24f-4feb0559f463"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 10:08:54 crc kubenswrapper[4781]: I1202 10:08:54.886252 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dacd1d88-ff6e-4719-a24f-4feb0559f463-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "dacd1d88-ff6e-4719-a24f-4feb0559f463" (UID: "dacd1d88-ff6e-4719-a24f-4feb0559f463"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 10:08:54 crc kubenswrapper[4781]: I1202 10:08:54.888242 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dacd1d88-ff6e-4719-a24f-4feb0559f463-inventory" (OuterVolumeSpecName: "inventory") pod "dacd1d88-ff6e-4719-a24f-4feb0559f463" (UID: "dacd1d88-ff6e-4719-a24f-4feb0559f463"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 10:08:54 crc kubenswrapper[4781]: I1202 10:08:54.929868 4781 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/dacd1d88-ff6e-4719-a24f-4feb0559f463-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 02 10:08:54 crc kubenswrapper[4781]: I1202 10:08:54.929907 4781 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/dacd1d88-ff6e-4719-a24f-4feb0559f463-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 02 10:08:54 crc kubenswrapper[4781]: I1202 10:08:54.929921 4781 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/dacd1d88-ff6e-4719-a24f-4feb0559f463-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 02 10:08:54 crc kubenswrapper[4781]: I1202 10:08:54.929953 4781 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dacd1d88-ff6e-4719-a24f-4feb0559f463-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 10:08:54 crc kubenswrapper[4781]: I1202 10:08:54.929967 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9twh\" (UniqueName: \"kubernetes.io/projected/dacd1d88-ff6e-4719-a24f-4feb0559f463-kube-api-access-m9twh\") on node \"crc\" DevicePath \"\"" Dec 02 10:08:54 crc kubenswrapper[4781]: I1202 10:08:54.929978 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dacd1d88-ff6e-4719-a24f-4feb0559f463-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 10:08:54 crc kubenswrapper[4781]: I1202 10:08:54.929986 4781 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/dacd1d88-ff6e-4719-a24f-4feb0559f463-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Dec 02 10:08:54 crc kubenswrapper[4781]: I1202 10:08:54.929994 4781 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/dacd1d88-ff6e-4719-a24f-4feb0559f463-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 02 10:08:54 crc kubenswrapper[4781]: I1202 10:08:54.930004 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dacd1d88-ff6e-4719-a24f-4feb0559f463-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 10:08:55 crc kubenswrapper[4781]: I1202 10:08:55.384865 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g88rg" event={"ID":"dacd1d88-ff6e-4719-a24f-4feb0559f463","Type":"ContainerDied","Data":"2e423f8ee36bfa0d17fc1d3433645458c2d2454452c5e54e99de5759f9a961cb"} Dec 02 10:08:55 crc kubenswrapper[4781]: I1202 10:08:55.384908 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e423f8ee36bfa0d17fc1d3433645458c2d2454452c5e54e99de5759f9a961cb" Dec 02 10:08:55 crc 
kubenswrapper[4781]: I1202 10:08:55.384982 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g88rg" Dec 02 10:08:55 crc kubenswrapper[4781]: I1202 10:08:55.483185 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n"] Dec 02 10:08:55 crc kubenswrapper[4781]: E1202 10:08:55.483618 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd229c63-1675-45d9-8190-6b3adbad8af5" containerName="extract-utilities" Dec 02 10:08:55 crc kubenswrapper[4781]: I1202 10:08:55.483639 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd229c63-1675-45d9-8190-6b3adbad8af5" containerName="extract-utilities" Dec 02 10:08:55 crc kubenswrapper[4781]: E1202 10:08:55.483653 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="454ad7ba-1c5e-43c4-9a29-e02e29aa7544" containerName="registry-server" Dec 02 10:08:55 crc kubenswrapper[4781]: I1202 10:08:55.483660 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="454ad7ba-1c5e-43c4-9a29-e02e29aa7544" containerName="registry-server" Dec 02 10:08:55 crc kubenswrapper[4781]: E1202 10:08:55.483674 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f815fa8f-85f8-4659-8ecf-70a90acd6d0e" containerName="extract-content" Dec 02 10:08:55 crc kubenswrapper[4781]: I1202 10:08:55.483679 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f815fa8f-85f8-4659-8ecf-70a90acd6d0e" containerName="extract-content" Dec 02 10:08:55 crc kubenswrapper[4781]: E1202 10:08:55.483691 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8" containerName="registry-server" Dec 02 10:08:55 crc kubenswrapper[4781]: I1202 10:08:55.483698 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8" containerName="registry-server" Dec 02 10:08:55 crc kubenswrapper[4781]: E1202 10:08:55.483709 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="454ad7ba-1c5e-43c4-9a29-e02e29aa7544" containerName="extract-utilities" Dec 02 10:08:55 crc kubenswrapper[4781]: I1202 10:08:55.483717 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="454ad7ba-1c5e-43c4-9a29-e02e29aa7544" containerName="extract-utilities" Dec 02 10:08:55 crc kubenswrapper[4781]: E1202 10:08:55.483726 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="454ad7ba-1c5e-43c4-9a29-e02e29aa7544" containerName="extract-content" Dec 02 10:08:55 crc kubenswrapper[4781]: I1202 10:08:55.483733 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="454ad7ba-1c5e-43c4-9a29-e02e29aa7544" containerName="extract-content" Dec 02 10:08:55 crc kubenswrapper[4781]: E1202 10:08:55.483751 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f815fa8f-85f8-4659-8ecf-70a90acd6d0e" containerName="extract-utilities" Dec 02 10:08:55 crc kubenswrapper[4781]: I1202 10:08:55.483759 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f815fa8f-85f8-4659-8ecf-70a90acd6d0e" containerName="extract-utilities" Dec 02 10:08:55 crc kubenswrapper[4781]: E1202 10:08:55.483777 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8" containerName="extract-utilities" Dec 02 10:08:55 crc kubenswrapper[4781]: I1202 10:08:55.483784 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8" 
containerName="extract-utilities" Dec 02 10:08:55 crc kubenswrapper[4781]: E1202 10:08:55.483798 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f815fa8f-85f8-4659-8ecf-70a90acd6d0e" containerName="registry-server" Dec 02 10:08:55 crc kubenswrapper[4781]: I1202 10:08:55.483805 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f815fa8f-85f8-4659-8ecf-70a90acd6d0e" containerName="registry-server" Dec 02 10:08:55 crc kubenswrapper[4781]: E1202 10:08:55.483815 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8" containerName="extract-content" Dec 02 10:08:55 crc kubenswrapper[4781]: I1202 10:08:55.483821 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8" containerName="extract-content" Dec 02 10:08:55 crc kubenswrapper[4781]: E1202 10:08:55.483839 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd229c63-1675-45d9-8190-6b3adbad8af5" containerName="registry-server" Dec 02 10:08:55 crc kubenswrapper[4781]: I1202 10:08:55.483845 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd229c63-1675-45d9-8190-6b3adbad8af5" containerName="registry-server" Dec 02 10:08:55 crc kubenswrapper[4781]: E1202 10:08:55.483857 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dacd1d88-ff6e-4719-a24f-4feb0559f463" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 02 10:08:55 crc kubenswrapper[4781]: I1202 10:08:55.483863 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="dacd1d88-ff6e-4719-a24f-4feb0559f463" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 02 10:08:55 crc kubenswrapper[4781]: E1202 10:08:55.483874 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd229c63-1675-45d9-8190-6b3adbad8af5" containerName="extract-content" Dec 02 10:08:55 crc kubenswrapper[4781]: I1202 10:08:55.483880 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd229c63-1675-45d9-8190-6b3adbad8af5" containerName="extract-content" Dec 02 10:08:55 crc kubenswrapper[4781]: I1202 10:08:55.484095 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd229c63-1675-45d9-8190-6b3adbad8af5" containerName="registry-server" Dec 02 10:08:55 crc kubenswrapper[4781]: I1202 10:08:55.484114 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="454ad7ba-1c5e-43c4-9a29-e02e29aa7544" containerName="registry-server" Dec 02 10:08:55 crc kubenswrapper[4781]: I1202 10:08:55.484133 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="dacd1d88-ff6e-4719-a24f-4feb0559f463" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 02 10:08:55 crc kubenswrapper[4781]: I1202 10:08:55.484152 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f815fa8f-85f8-4659-8ecf-70a90acd6d0e" containerName="registry-server" Dec 02 10:08:55 crc kubenswrapper[4781]: I1202 10:08:55.484160 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5f10cf6-3aed-4581-a2a6-bf8d6690aeb8" containerName="registry-server" Dec 02 10:08:55 crc kubenswrapper[4781]: I1202 10:08:55.484804 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n" Dec 02 10:08:55 crc kubenswrapper[4781]: I1202 10:08:55.487237 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Dec 02 10:08:55 crc kubenswrapper[4781]: I1202 10:08:55.488741 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 10:08:55 crc kubenswrapper[4781]: I1202 10:08:55.488741 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 10:08:55 crc kubenswrapper[4781]: I1202 10:08:55.488775 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zl624" Dec 02 10:08:55 crc kubenswrapper[4781]: I1202 10:08:55.488778 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 10:08:55 crc kubenswrapper[4781]: I1202 10:08:55.512116 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n"] Dec 02 10:08:55 crc kubenswrapper[4781]: I1202 10:08:55.542642 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82d7f2c1-dd68-42f7-be56-30bc507b2bf5-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n\" (UID: \"82d7f2c1-dd68-42f7-be56-30bc507b2bf5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n" Dec 02 10:08:55 crc kubenswrapper[4781]: I1202 10:08:55.542752 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/82d7f2c1-dd68-42f7-be56-30bc507b2bf5-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n\" (UID: \"82d7f2c1-dd68-42f7-be56-30bc507b2bf5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n" Dec 02 10:08:55 crc kubenswrapper[4781]: I1202 10:08:55.542791 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/82d7f2c1-dd68-42f7-be56-30bc507b2bf5-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n\" (UID: \"82d7f2c1-dd68-42f7-be56-30bc507b2bf5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n" Dec 02 10:08:55 crc kubenswrapper[4781]: I1202 10:08:55.542815 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/82d7f2c1-dd68-42f7-be56-30bc507b2bf5-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n\" (UID: \"82d7f2c1-dd68-42f7-be56-30bc507b2bf5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n" Dec 02 10:08:55 crc kubenswrapper[4781]: I1202 10:08:55.542848 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/82d7f2c1-dd68-42f7-be56-30bc507b2bf5-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n\" (UID: \"82d7f2c1-dd68-42f7-be56-30bc507b2bf5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n" 
Dec 02 10:08:55 crc kubenswrapper[4781]: I1202 10:08:55.542909 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/82d7f2c1-dd68-42f7-be56-30bc507b2bf5-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n\" (UID: \"82d7f2c1-dd68-42f7-be56-30bc507b2bf5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n" Dec 02 10:08:55 crc kubenswrapper[4781]: I1202 10:08:55.542995 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpbfv\" (UniqueName: \"kubernetes.io/projected/82d7f2c1-dd68-42f7-be56-30bc507b2bf5-kube-api-access-qpbfv\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n\" (UID: \"82d7f2c1-dd68-42f7-be56-30bc507b2bf5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n" Dec 02 10:08:55 crc kubenswrapper[4781]: I1202 10:08:55.644478 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82d7f2c1-dd68-42f7-be56-30bc507b2bf5-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n\" (UID: \"82d7f2c1-dd68-42f7-be56-30bc507b2bf5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n" Dec 02 10:08:55 crc kubenswrapper[4781]: I1202 10:08:55.644564 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/82d7f2c1-dd68-42f7-be56-30bc507b2bf5-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n\" (UID: \"82d7f2c1-dd68-42f7-be56-30bc507b2bf5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n" Dec 02 10:08:55 crc kubenswrapper[4781]: I1202 10:08:55.644595 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/82d7f2c1-dd68-42f7-be56-30bc507b2bf5-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n\" (UID: \"82d7f2c1-dd68-42f7-be56-30bc507b2bf5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n" Dec 02 10:08:55 crc kubenswrapper[4781]: I1202 10:08:55.644614 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/82d7f2c1-dd68-42f7-be56-30bc507b2bf5-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n\" (UID: \"82d7f2c1-dd68-42f7-be56-30bc507b2bf5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n" Dec 02 10:08:55 crc kubenswrapper[4781]: I1202 10:08:55.644637 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/82d7f2c1-dd68-42f7-be56-30bc507b2bf5-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n\" (UID: \"82d7f2c1-dd68-42f7-be56-30bc507b2bf5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n" Dec 02 10:08:55 crc kubenswrapper[4781]: I1202 10:08:55.644656 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/82d7f2c1-dd68-42f7-be56-30bc507b2bf5-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n\" (UID: \"82d7f2c1-dd68-42f7-be56-30bc507b2bf5\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n" Dec 02 10:08:55 crc kubenswrapper[4781]: I1202 10:08:55.644695 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpbfv\" (UniqueName: \"kubernetes.io/projected/82d7f2c1-dd68-42f7-be56-30bc507b2bf5-kube-api-access-qpbfv\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n\" (UID: \"82d7f2c1-dd68-42f7-be56-30bc507b2bf5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n" Dec 02 10:08:55 crc kubenswrapper[4781]: I1202 10:08:55.648614 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/82d7f2c1-dd68-42f7-be56-30bc507b2bf5-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n\" (UID: \"82d7f2c1-dd68-42f7-be56-30bc507b2bf5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n" Dec 02 10:08:55 crc kubenswrapper[4781]: I1202 10:08:55.648622 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/82d7f2c1-dd68-42f7-be56-30bc507b2bf5-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n\" (UID: \"82d7f2c1-dd68-42f7-be56-30bc507b2bf5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n" Dec 02 10:08:55 crc kubenswrapper[4781]: I1202 10:08:55.648904 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82d7f2c1-dd68-42f7-be56-30bc507b2bf5-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n\" (UID: \"82d7f2c1-dd68-42f7-be56-30bc507b2bf5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n" Dec 02 10:08:55 crc kubenswrapper[4781]: I1202 10:08:55.649151 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/82d7f2c1-dd68-42f7-be56-30bc507b2bf5-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n\" (UID: \"82d7f2c1-dd68-42f7-be56-30bc507b2bf5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n" Dec 02 10:08:55 crc kubenswrapper[4781]: I1202 10:08:55.655457 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/82d7f2c1-dd68-42f7-be56-30bc507b2bf5-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n\" (UID: \"82d7f2c1-dd68-42f7-be56-30bc507b2bf5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n" Dec 02 10:08:55 crc kubenswrapper[4781]: I1202 10:08:55.655574 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/82d7f2c1-dd68-42f7-be56-30bc507b2bf5-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n\" (UID: \"82d7f2c1-dd68-42f7-be56-30bc507b2bf5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n" Dec 02 10:08:55 crc kubenswrapper[4781]: I1202 10:08:55.661715 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpbfv\" (UniqueName: \"kubernetes.io/projected/82d7f2c1-dd68-42f7-be56-30bc507b2bf5-kube-api-access-qpbfv\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n\" (UID: \"82d7f2c1-dd68-42f7-be56-30bc507b2bf5\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n" Dec 02 10:08:55 crc kubenswrapper[4781]: I1202 10:08:55.801665 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n" Dec 02 10:08:56 crc kubenswrapper[4781]: I1202 10:08:56.310777 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n"] Dec 02 10:08:56 crc kubenswrapper[4781]: I1202 10:08:56.393311 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n" event={"ID":"82d7f2c1-dd68-42f7-be56-30bc507b2bf5","Type":"ContainerStarted","Data":"749b287cdc8ab82eee67f025702e607c33195d9619f497af180190240b71807b"} Dec 02 10:08:58 crc kubenswrapper[4781]: I1202 10:08:58.417316 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n" event={"ID":"82d7f2c1-dd68-42f7-be56-30bc507b2bf5","Type":"ContainerStarted","Data":"ba036cdf86deca5c24ce7cdc6a65ff4d4bec1639c8d921a24d396fb8df57b2e8"} Dec 02 10:08:58 crc kubenswrapper[4781]: I1202 10:08:58.441810 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n" podStartSLOduration=2.45801711 podStartE2EDuration="3.441789774s" podCreationTimestamp="2025-12-02 10:08:55 +0000 UTC" firstStartedPulling="2025-12-02 10:08:56.31756302 +0000 UTC m=+2899.141436899" lastFinishedPulling="2025-12-02 10:08:57.301335684 +0000 UTC m=+2900.125209563" observedRunningTime="2025-12-02 10:08:58.435773341 +0000 UTC m=+2901.259647220" watchObservedRunningTime="2025-12-02 10:08:58.441789774 +0000 UTC m=+2901.265663653" Dec 02 10:10:00 crc kubenswrapper[4781]: I1202 10:10:00.412942 4781 patch_prober.go:28] interesting pod/machine-config-daemon-pzntm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 10:10:00 crc kubenswrapper[4781]: I1202 10:10:00.413533 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 10:10:30 crc kubenswrapper[4781]: I1202 10:10:30.412286 4781 patch_prober.go:28] interesting pod/machine-config-daemon-pzntm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 10:10:30 crc kubenswrapper[4781]: I1202 10:10:30.412899 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 10:11:00 crc kubenswrapper[4781]: I1202 10:11:00.411774 4781 patch_prober.go:28] interesting pod/machine-config-daemon-pzntm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 10:11:00 crc kubenswrapper[4781]: I1202 10:11:00.412315 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 10:11:00 crc kubenswrapper[4781]: I1202 10:11:00.412359 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" Dec 02 10:11:00 crc kubenswrapper[4781]: I1202 10:11:00.413104 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a460aa145b859b67418ac19afdfd646e23ad888c821073566f356df84809b0ea"} pod="openshift-machine-config-operator/machine-config-daemon-pzntm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 10:11:00 crc kubenswrapper[4781]: I1202 10:11:00.413161 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" containerID="cri-o://a460aa145b859b67418ac19afdfd646e23ad888c821073566f356df84809b0ea" gracePeriod=600 Dec 02 10:11:00 crc kubenswrapper[4781]: E1202 10:11:00.543596 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:11:01 crc kubenswrapper[4781]: I1202 10:11:01.458045 4781 generic.go:334] "Generic (PLEG): container finished" podID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerID="a460aa145b859b67418ac19afdfd646e23ad888c821073566f356df84809b0ea" exitCode=0 Dec 02 10:11:01 crc kubenswrapper[4781]: I1202 10:11:01.458087 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" event={"ID":"e10258da-dad3-4df8-82c2-9d9438493a3d","Type":"ContainerDied","Data":"a460aa145b859b67418ac19afdfd646e23ad888c821073566f356df84809b0ea"} Dec 02 10:11:01 crc kubenswrapper[4781]: I1202 10:11:01.458123 4781 scope.go:117] "RemoveContainer" containerID="75b74b7096d1dd0ce909b2158eaf421368960aa32dc50742e6efb662b8a4be29" Dec 02 10:11:01 crc kubenswrapper[4781]: I1202 10:11:01.458673 4781 scope.go:117] "RemoveContainer" containerID="a460aa145b859b67418ac19afdfd646e23ad888c821073566f356df84809b0ea" Dec 02 10:11:01 crc kubenswrapper[4781]: E1202 10:11:01.458994 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:11:15 crc kubenswrapper[4781]: I1202 10:11:15.500070 4781 scope.go:117] 
"RemoveContainer" containerID="a460aa145b859b67418ac19afdfd646e23ad888c821073566f356df84809b0ea" Dec 02 10:11:15 crc kubenswrapper[4781]: E1202 10:11:15.502770 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:11:15 crc kubenswrapper[4781]: I1202 10:11:15.580569 4781 generic.go:334] "Generic (PLEG): container finished" podID="82d7f2c1-dd68-42f7-be56-30bc507b2bf5" containerID="ba036cdf86deca5c24ce7cdc6a65ff4d4bec1639c8d921a24d396fb8df57b2e8" exitCode=0 Dec 02 10:11:15 crc kubenswrapper[4781]: I1202 10:11:15.580720 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n" event={"ID":"82d7f2c1-dd68-42f7-be56-30bc507b2bf5","Type":"ContainerDied","Data":"ba036cdf86deca5c24ce7cdc6a65ff4d4bec1639c8d921a24d396fb8df57b2e8"} Dec 02 10:11:17 crc kubenswrapper[4781]: I1202 10:11:17.009054 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n" Dec 02 10:11:17 crc kubenswrapper[4781]: I1202 10:11:17.137136 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/82d7f2c1-dd68-42f7-be56-30bc507b2bf5-ceilometer-compute-config-data-1\") pod \"82d7f2c1-dd68-42f7-be56-30bc507b2bf5\" (UID: \"82d7f2c1-dd68-42f7-be56-30bc507b2bf5\") " Dec 02 10:11:17 crc kubenswrapper[4781]: I1202 10:11:17.137206 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82d7f2c1-dd68-42f7-be56-30bc507b2bf5-telemetry-combined-ca-bundle\") pod \"82d7f2c1-dd68-42f7-be56-30bc507b2bf5\" (UID: \"82d7f2c1-dd68-42f7-be56-30bc507b2bf5\") " Dec 02 10:11:17 crc kubenswrapper[4781]: I1202 10:11:17.137318 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/82d7f2c1-dd68-42f7-be56-30bc507b2bf5-ceilometer-compute-config-data-0\") pod \"82d7f2c1-dd68-42f7-be56-30bc507b2bf5\" (UID: \"82d7f2c1-dd68-42f7-be56-30bc507b2bf5\") " Dec 02 10:11:17 crc kubenswrapper[4781]: I1202 10:11:17.137357 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/82d7f2c1-dd68-42f7-be56-30bc507b2bf5-inventory\") pod \"82d7f2c1-dd68-42f7-be56-30bc507b2bf5\" (UID: \"82d7f2c1-dd68-42f7-be56-30bc507b2bf5\") " Dec 02 10:11:17 crc kubenswrapper[4781]: I1202 10:11:17.137378 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpbfv\" (UniqueName: \"kubernetes.io/projected/82d7f2c1-dd68-42f7-be56-30bc507b2bf5-kube-api-access-qpbfv\") pod \"82d7f2c1-dd68-42f7-be56-30bc507b2bf5\" (UID: \"82d7f2c1-dd68-42f7-be56-30bc507b2bf5\") " Dec 02 10:11:17 crc kubenswrapper[4781]: I1202 10:11:17.137486 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/82d7f2c1-dd68-42f7-be56-30bc507b2bf5-ceilometer-compute-config-data-2\") pod 
\"82d7f2c1-dd68-42f7-be56-30bc507b2bf5\" (UID: \"82d7f2c1-dd68-42f7-be56-30bc507b2bf5\") " Dec 02 10:11:17 crc kubenswrapper[4781]: I1202 10:11:17.138703 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/82d7f2c1-dd68-42f7-be56-30bc507b2bf5-ssh-key\") pod \"82d7f2c1-dd68-42f7-be56-30bc507b2bf5\" (UID: \"82d7f2c1-dd68-42f7-be56-30bc507b2bf5\") " Dec 02 10:11:17 crc kubenswrapper[4781]: I1202 10:11:17.145612 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82d7f2c1-dd68-42f7-be56-30bc507b2bf5-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "82d7f2c1-dd68-42f7-be56-30bc507b2bf5" (UID: "82d7f2c1-dd68-42f7-be56-30bc507b2bf5"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 10:11:17 crc kubenswrapper[4781]: I1202 10:11:17.146487 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82d7f2c1-dd68-42f7-be56-30bc507b2bf5-kube-api-access-qpbfv" (OuterVolumeSpecName: "kube-api-access-qpbfv") pod "82d7f2c1-dd68-42f7-be56-30bc507b2bf5" (UID: "82d7f2c1-dd68-42f7-be56-30bc507b2bf5"). InnerVolumeSpecName "kube-api-access-qpbfv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 10:11:17 crc kubenswrapper[4781]: I1202 10:11:17.169916 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82d7f2c1-dd68-42f7-be56-30bc507b2bf5-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "82d7f2c1-dd68-42f7-be56-30bc507b2bf5" (UID: "82d7f2c1-dd68-42f7-be56-30bc507b2bf5"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 10:11:17 crc kubenswrapper[4781]: I1202 10:11:17.175873 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82d7f2c1-dd68-42f7-be56-30bc507b2bf5-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "82d7f2c1-dd68-42f7-be56-30bc507b2bf5" (UID: "82d7f2c1-dd68-42f7-be56-30bc507b2bf5"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 10:11:17 crc kubenswrapper[4781]: I1202 10:11:17.181264 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82d7f2c1-dd68-42f7-be56-30bc507b2bf5-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "82d7f2c1-dd68-42f7-be56-30bc507b2bf5" (UID: "82d7f2c1-dd68-42f7-be56-30bc507b2bf5"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 10:11:17 crc kubenswrapper[4781]: I1202 10:11:17.182742 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82d7f2c1-dd68-42f7-be56-30bc507b2bf5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "82d7f2c1-dd68-42f7-be56-30bc507b2bf5" (UID: "82d7f2c1-dd68-42f7-be56-30bc507b2bf5"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 10:11:17 crc kubenswrapper[4781]: I1202 10:11:17.183266 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82d7f2c1-dd68-42f7-be56-30bc507b2bf5-inventory" (OuterVolumeSpecName: "inventory") pod "82d7f2c1-dd68-42f7-be56-30bc507b2bf5" (UID: "82d7f2c1-dd68-42f7-be56-30bc507b2bf5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 10:11:17 crc kubenswrapper[4781]: I1202 10:11:17.240822 4781 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/82d7f2c1-dd68-42f7-be56-30bc507b2bf5-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 02 10:11:17 crc kubenswrapper[4781]: I1202 10:11:17.240855 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/82d7f2c1-dd68-42f7-be56-30bc507b2bf5-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 10:11:17 crc kubenswrapper[4781]: I1202 10:11:17.240869 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpbfv\" (UniqueName: \"kubernetes.io/projected/82d7f2c1-dd68-42f7-be56-30bc507b2bf5-kube-api-access-qpbfv\") on node \"crc\" DevicePath \"\"" Dec 02 10:11:17 crc kubenswrapper[4781]: I1202 10:11:17.240882 4781 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/82d7f2c1-dd68-42f7-be56-30bc507b2bf5-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 02 10:11:17 crc kubenswrapper[4781]: I1202 10:11:17.240892 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/82d7f2c1-dd68-42f7-be56-30bc507b2bf5-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 10:11:17 crc kubenswrapper[4781]: I1202 10:11:17.240906 4781 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/82d7f2c1-dd68-42f7-be56-30bc507b2bf5-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 02 10:11:17 crc kubenswrapper[4781]: I1202 10:11:17.240917 4781 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82d7f2c1-dd68-42f7-be56-30bc507b2bf5-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 10:11:17 crc kubenswrapper[4781]: I1202 10:11:17.602752 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n" event={"ID":"82d7f2c1-dd68-42f7-be56-30bc507b2bf5","Type":"ContainerDied","Data":"749b287cdc8ab82eee67f025702e607c33195d9619f497af180190240b71807b"} Dec 02 10:11:17 crc kubenswrapper[4781]: I1202 10:11:17.602798 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="749b287cdc8ab82eee67f025702e607c33195d9619f497af180190240b71807b" Dec 02 10:11:17 crc kubenswrapper[4781]: I1202 10:11:17.602948 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n" Dec 02 10:11:30 crc kubenswrapper[4781]: I1202 10:11:30.500155 4781 scope.go:117] "RemoveContainer" containerID="a460aa145b859b67418ac19afdfd646e23ad888c821073566f356df84809b0ea" Dec 02 10:11:30 crc kubenswrapper[4781]: E1202 10:11:30.500977 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:11:43 crc kubenswrapper[4781]: I1202 10:11:43.500131 4781 scope.go:117] "RemoveContainer" containerID="a460aa145b859b67418ac19afdfd646e23ad888c821073566f356df84809b0ea" Dec 02 10:11:43 crc kubenswrapper[4781]: E1202 10:11:43.501052 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:11:55 crc kubenswrapper[4781]: I1202 10:11:55.500163 4781 scope.go:117] "RemoveContainer" containerID="a460aa145b859b67418ac19afdfd646e23ad888c821073566f356df84809b0ea" Dec 02 10:11:55 crc kubenswrapper[4781]: E1202 10:11:55.502153 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:12:10 crc kubenswrapper[4781]: I1202 10:12:10.499647 4781 scope.go:117] "RemoveContainer" containerID="a460aa145b859b67418ac19afdfd646e23ad888c821073566f356df84809b0ea" Dec 02 10:12:10 crc kubenswrapper[4781]: E1202 10:12:10.500435 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:12:10 crc kubenswrapper[4781]: I1202 10:12:10.629787 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Dec 02 10:12:10 crc kubenswrapper[4781]: E1202 10:12:10.630279 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82d7f2c1-dd68-42f7-be56-30bc507b2bf5" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 02 10:12:10 crc kubenswrapper[4781]: I1202 10:12:10.630307 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="82d7f2c1-dd68-42f7-be56-30bc507b2bf5" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 02 10:12:10 crc kubenswrapper[4781]: I1202 10:12:10.630589 4781 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="82d7f2c1-dd68-42f7-be56-30bc507b2bf5" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 02 10:12:10 crc kubenswrapper[4781]: I1202 10:12:10.631643 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 02 10:12:10 crc kubenswrapper[4781]: I1202 10:12:10.634197 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 02 10:12:10 crc kubenswrapper[4781]: I1202 10:12:10.634247 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Dec 02 10:12:10 crc kubenswrapper[4781]: I1202 10:12:10.634317 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Dec 02 10:12:10 crc kubenswrapper[4781]: I1202 10:12:10.634770 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-zm7cr" Dec 02 10:12:10 crc kubenswrapper[4781]: I1202 10:12:10.642685 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ac589c1e-71f7-423f-b99b-3ebf175b40f3-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"ac589c1e-71f7-423f-b99b-3ebf175b40f3\") " pod="openstack/tempest-tests-tempest" Dec 02 10:12:10 crc kubenswrapper[4781]: I1202 10:12:10.642732 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ac589c1e-71f7-423f-b99b-3ebf175b40f3-config-data\") pod \"tempest-tests-tempest\" (UID: \"ac589c1e-71f7-423f-b99b-3ebf175b40f3\") " pod="openstack/tempest-tests-tempest" Dec 02 10:12:10 crc kubenswrapper[4781]: I1202 10:12:10.642777 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 02 10:12:10 crc kubenswrapper[4781]: I1202 10:12:10.642872 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ac589c1e-71f7-423f-b99b-3ebf175b40f3-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"ac589c1e-71f7-423f-b99b-3ebf175b40f3\") " pod="openstack/tempest-tests-tempest" Dec 02 10:12:10 crc kubenswrapper[4781]: I1202 10:12:10.744348 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ac589c1e-71f7-423f-b99b-3ebf175b40f3-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"ac589c1e-71f7-423f-b99b-3ebf175b40f3\") " pod="openstack/tempest-tests-tempest" Dec 02 10:12:10 crc kubenswrapper[4781]: I1202 10:12:10.744489 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ac589c1e-71f7-423f-b99b-3ebf175b40f3-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"ac589c1e-71f7-423f-b99b-3ebf175b40f3\") " pod="openstack/tempest-tests-tempest" Dec 02 10:12:10 crc kubenswrapper[4781]: I1202 10:12:10.744535 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ac589c1e-71f7-423f-b99b-3ebf175b40f3-config-data\") pod \"tempest-tests-tempest\" (UID: \"ac589c1e-71f7-423f-b99b-3ebf175b40f3\") " pod="openstack/tempest-tests-tempest" Dec 02 10:12:10 crc 
kubenswrapper[4781]: I1202 10:12:10.745561 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ac589c1e-71f7-423f-b99b-3ebf175b40f3-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"ac589c1e-71f7-423f-b99b-3ebf175b40f3\") " pod="openstack/tempest-tests-tempest" Dec 02 10:12:10 crc kubenswrapper[4781]: I1202 10:12:10.746277 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ac589c1e-71f7-423f-b99b-3ebf175b40f3-config-data\") pod \"tempest-tests-tempest\" (UID: \"ac589c1e-71f7-423f-b99b-3ebf175b40f3\") " pod="openstack/tempest-tests-tempest" Dec 02 10:12:10 crc kubenswrapper[4781]: I1202 10:12:10.746370 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ac589c1e-71f7-423f-b99b-3ebf175b40f3-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"ac589c1e-71f7-423f-b99b-3ebf175b40f3\") " pod="openstack/tempest-tests-tempest" Dec 02 10:12:10 crc kubenswrapper[4781]: I1202 10:12:10.746469 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ac589c1e-71f7-423f-b99b-3ebf175b40f3-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"ac589c1e-71f7-423f-b99b-3ebf175b40f3\") " pod="openstack/tempest-tests-tempest" Dec 02 10:12:10 crc kubenswrapper[4781]: I1202 10:12:10.746514 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"ac589c1e-71f7-423f-b99b-3ebf175b40f3\") " pod="openstack/tempest-tests-tempest" Dec 02 10:12:10 crc kubenswrapper[4781]: I1202 10:12:10.746548 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ac589c1e-71f7-423f-b99b-3ebf175b40f3-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"ac589c1e-71f7-423f-b99b-3ebf175b40f3\") " pod="openstack/tempest-tests-tempest" Dec 02 10:12:10 crc kubenswrapper[4781]: I1202 10:12:10.746583 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ac589c1e-71f7-423f-b99b-3ebf175b40f3-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"ac589c1e-71f7-423f-b99b-3ebf175b40f3\") " pod="openstack/tempest-tests-tempest" Dec 02 10:12:10 crc kubenswrapper[4781]: I1202 10:12:10.746599 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kqsw\" (UniqueName: \"kubernetes.io/projected/ac589c1e-71f7-423f-b99b-3ebf175b40f3-kube-api-access-5kqsw\") pod \"tempest-tests-tempest\" (UID: \"ac589c1e-71f7-423f-b99b-3ebf175b40f3\") " pod="openstack/tempest-tests-tempest" Dec 02 10:12:10 crc kubenswrapper[4781]: I1202 10:12:10.753780 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ac589c1e-71f7-423f-b99b-3ebf175b40f3-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"ac589c1e-71f7-423f-b99b-3ebf175b40f3\") " pod="openstack/tempest-tests-tempest" Dec 02 10:12:10 crc kubenswrapper[4781]: I1202 10:12:10.848572 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ac589c1e-71f7-423f-b99b-3ebf175b40f3-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"ac589c1e-71f7-423f-b99b-3ebf175b40f3\") " pod="openstack/tempest-tests-tempest" Dec 02 10:12:10 crc kubenswrapper[4781]: I1202 10:12:10.848701 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"ac589c1e-71f7-423f-b99b-3ebf175b40f3\") " pod="openstack/tempest-tests-tempest" Dec 02 10:12:10 crc kubenswrapper[4781]: I1202 10:12:10.848736 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ac589c1e-71f7-423f-b99b-3ebf175b40f3-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"ac589c1e-71f7-423f-b99b-3ebf175b40f3\") " pod="openstack/tempest-tests-tempest" Dec 02 10:12:10 crc kubenswrapper[4781]: I1202 10:12:10.848772 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ac589c1e-71f7-423f-b99b-3ebf175b40f3-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"ac589c1e-71f7-423f-b99b-3ebf175b40f3\") " pod="openstack/tempest-tests-tempest" Dec 02 10:12:10 crc kubenswrapper[4781]: I1202 10:12:10.848796 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kqsw\" (UniqueName: \"kubernetes.io/projected/ac589c1e-71f7-423f-b99b-3ebf175b40f3-kube-api-access-5kqsw\") pod \"tempest-tests-tempest\" (UID: \"ac589c1e-71f7-423f-b99b-3ebf175b40f3\") " pod="openstack/tempest-tests-tempest" Dec 02 10:12:10 crc kubenswrapper[4781]: I1202 10:12:10.848845 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ac589c1e-71f7-423f-b99b-3ebf175b40f3-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"ac589c1e-71f7-423f-b99b-3ebf175b40f3\") " pod="openstack/tempest-tests-tempest" Dec 02 10:12:10 crc kubenswrapper[4781]: I1202 10:12:10.849137 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"ac589c1e-71f7-423f-b99b-3ebf175b40f3\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/tempest-tests-tempest" Dec 02 10:12:10 crc kubenswrapper[4781]: I1202 10:12:10.849271 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ac589c1e-71f7-423f-b99b-3ebf175b40f3-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"ac589c1e-71f7-423f-b99b-3ebf175b40f3\") " pod="openstack/tempest-tests-tempest" Dec 02 10:12:10 crc kubenswrapper[4781]: I1202 10:12:10.849441 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ac589c1e-71f7-423f-b99b-3ebf175b40f3-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"ac589c1e-71f7-423f-b99b-3ebf175b40f3\") " pod="openstack/tempest-tests-tempest" Dec 02 10:12:10 crc kubenswrapper[4781]: I1202 10:12:10.852514 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/ac589c1e-71f7-423f-b99b-3ebf175b40f3-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"ac589c1e-71f7-423f-b99b-3ebf175b40f3\") " pod="openstack/tempest-tests-tempest" Dec 02 10:12:10 crc kubenswrapper[4781]: I1202 10:12:10.852645 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ac589c1e-71f7-423f-b99b-3ebf175b40f3-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"ac589c1e-71f7-423f-b99b-3ebf175b40f3\") " pod="openstack/tempest-tests-tempest" Dec 02 10:12:10 crc kubenswrapper[4781]: I1202 10:12:10.865133 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kqsw\" (UniqueName: \"kubernetes.io/projected/ac589c1e-71f7-423f-b99b-3ebf175b40f3-kube-api-access-5kqsw\") pod \"tempest-tests-tempest\" (UID: \"ac589c1e-71f7-423f-b99b-3ebf175b40f3\") " pod="openstack/tempest-tests-tempest" Dec 02 10:12:10 crc kubenswrapper[4781]: I1202 10:12:10.875348 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"ac589c1e-71f7-423f-b99b-3ebf175b40f3\") " pod="openstack/tempest-tests-tempest" Dec 02 10:12:10 crc kubenswrapper[4781]: I1202 10:12:10.955974 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 02 10:12:11 crc kubenswrapper[4781]: I1202 10:12:11.523088 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 02 10:12:11 crc kubenswrapper[4781]: I1202 10:12:11.535623 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 10:12:12 crc kubenswrapper[4781]: I1202 10:12:12.106587 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"ac589c1e-71f7-423f-b99b-3ebf175b40f3","Type":"ContainerStarted","Data":"a363c6a8372cbb779cdbc71f0b292476a80cd24b1cd808e08f5e6adaf764366f"} Dec 02 10:12:23 crc kubenswrapper[4781]: I1202 10:12:23.501107 4781 scope.go:117] "RemoveContainer" containerID="a460aa145b859b67418ac19afdfd646e23ad888c821073566f356df84809b0ea" Dec 02 10:12:23 crc kubenswrapper[4781]: E1202 10:12:23.501831 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:12:36 crc kubenswrapper[4781]: I1202 10:12:36.499759 4781 scope.go:117] "RemoveContainer" containerID="a460aa145b859b67418ac19afdfd646e23ad888c821073566f356df84809b0ea" Dec 02 10:12:36 crc kubenswrapper[4781]: E1202 10:12:36.500584 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:12:45 crc kubenswrapper[4781]: E1202 10:12:45.998533 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = 
Dec 02 10:12:23 crc kubenswrapper[4781]: I1202 10:12:23.501107 4781 scope.go:117] "RemoveContainer" containerID="a460aa145b859b67418ac19afdfd646e23ad888c821073566f356df84809b0ea"
Dec 02 10:12:23 crc kubenswrapper[4781]: E1202 10:12:23.501831 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d"
Dec 02 10:12:36 crc kubenswrapper[4781]: I1202 10:12:36.499759 4781 scope.go:117] "RemoveContainer" containerID="a460aa145b859b67418ac19afdfd646e23ad888c821073566f356df84809b0ea"
Dec 02 10:12:36 crc kubenswrapper[4781]: E1202 10:12:36.500584 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d"
Dec 02 10:12:45 crc kubenswrapper[4781]: E1202 10:12:45.998533 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified"
Dec 02 10:12:46 crc kubenswrapper[4781]: E1202 10:12:45.999226 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5kqsw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(ac589c1e-71f7-423f-b99b-3ebf175b40f3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 02 10:12:46 crc kubenswrapper[4781]: E1202 10:12:46.000770 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="ac589c1e-71f7-423f-b99b-3ebf175b40f3"
Dec 02 10:12:46 crc kubenswrapper[4781]: E1202 10:12:46.438548 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="ac589c1e-71f7-423f-b99b-3ebf175b40f3"
Dec 02 10:12:49 crc kubenswrapper[4781]: I1202 10:12:49.500042 4781 scope.go:117] "RemoveContainer" containerID="a460aa145b859b67418ac19afdfd646e23ad888c821073566f356df84809b0ea"
Dec 02 10:12:49 crc kubenswrapper[4781]: E1202 10:12:49.500596 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d"
Dec 02 10:12:59 crc kubenswrapper[4781]: I1202 10:12:59.956071 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0"
Dec 02 10:13:01 crc kubenswrapper[4781]: I1202 10:13:01.500001 4781 scope.go:117] "RemoveContainer" containerID="a460aa145b859b67418ac19afdfd646e23ad888c821073566f356df84809b0ea"
Dec 02 10:13:01 crc kubenswrapper[4781]: E1202 10:13:01.500784 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d"
Dec 02 10:13:01 crc kubenswrapper[4781]: I1202 10:13:01.574251 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"ac589c1e-71f7-423f-b99b-3ebf175b40f3","Type":"ContainerStarted","Data":"8d172fe6136e87003109bf87428976ab924eb793e112978cde99e3d6975e7f59"}
Dec 02 10:13:01 crc kubenswrapper[4781]: I1202 10:13:01.595004 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.176563821 podStartE2EDuration="52.59498586s" podCreationTimestamp="2025-12-02 10:12:09 +0000 UTC" firstStartedPulling="2025-12-02 10:12:11.535274763 +0000 UTC m=+3094.359148642" lastFinishedPulling="2025-12-02 10:12:59.953696802 +0000 UTC m=+3142.777570681" observedRunningTime="2025-12-02 10:13:01.594710432 +0000 UTC m=+3144.418584331" watchObservedRunningTime="2025-12-02 10:13:01.59498586 +0000 UTC m=+3144.418859729"
Dec 02 10:13:12 crc kubenswrapper[4781]: I1202 10:13:12.500077 4781 scope.go:117] "RemoveContainer" containerID="a460aa145b859b67418ac19afdfd646e23ad888c821073566f356df84809b0ea"
Dec 02 10:13:12 crc kubenswrapper[4781]: E1202 10:13:12.500974 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d"
Dec 02 10:13:25 crc kubenswrapper[4781]: I1202 10:13:25.499526 4781 scope.go:117] "RemoveContainer" containerID="a460aa145b859b67418ac19afdfd646e23ad888c821073566f356df84809b0ea"
Dec 02 10:13:25 crc kubenswrapper[4781]: E1202 10:13:25.500311 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d"
Dec 02 10:13:40 crc kubenswrapper[4781]: I1202 10:13:40.500054 4781 scope.go:117] "RemoveContainer" containerID="a460aa145b859b67418ac19afdfd646e23ad888c821073566f356df84809b0ea"
Dec 02 10:13:40 crc kubenswrapper[4781]: E1202 10:13:40.500842 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d"
Dec 02 10:13:51 crc kubenswrapper[4781]: I1202 10:13:51.499591 4781 scope.go:117] "RemoveContainer" containerID="a460aa145b859b67418ac19afdfd646e23ad888c821073566f356df84809b0ea"
Dec 02 10:13:51 crc kubenswrapper[4781]: E1202 10:13:51.500400 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d"
Dec 02 10:14:04 crc kubenswrapper[4781]: I1202 10:14:04.499562 4781 scope.go:117] "RemoveContainer" containerID="a460aa145b859b67418ac19afdfd646e23ad888c821073566f356df84809b0ea"
Dec 02 10:14:04 crc kubenswrapper[4781]: E1202 10:14:04.500350 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d"
Dec 02 10:14:18 crc kubenswrapper[4781]: I1202 10:14:18.504069 4781 scope.go:117] "RemoveContainer" containerID="a460aa145b859b67418ac19afdfd646e23ad888c821073566f356df84809b0ea"
Dec 02 10:14:18 crc kubenswrapper[4781]: E1202 10:14:18.509789 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d"
Dec 02 10:14:30 crc kubenswrapper[4781]: I1202 10:14:30.500024 4781 scope.go:117] "RemoveContainer" containerID="a460aa145b859b67418ac19afdfd646e23ad888c821073566f356df84809b0ea"
Dec 02 10:14:30 crc kubenswrapper[4781]: E1202 10:14:30.500809 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d"
Dec 02 10:14:42 crc kubenswrapper[4781]: I1202 10:14:42.500427 4781 scope.go:117] "RemoveContainer" containerID="a460aa145b859b67418ac19afdfd646e23ad888c821073566f356df84809b0ea"
Dec 02 10:14:42 crc kubenswrapper[4781]: E1202 10:14:42.501329 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d"
Dec 02 10:14:57 crc kubenswrapper[4781]: I1202 10:14:57.508750 4781 scope.go:117] "RemoveContainer" containerID="a460aa145b859b67418ac19afdfd646e23ad888c821073566f356df84809b0ea"
Dec 02 10:14:57 crc kubenswrapper[4781]: E1202 10:14:57.509967 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d"
Dec 02 10:15:00 crc kubenswrapper[4781]: I1202 10:15:00.152472 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411175-9gmd4"]
Dec 02 10:15:00 crc kubenswrapper[4781]: I1202 10:15:00.153719 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411175-9gmd4"
Dec 02 10:15:00 crc kubenswrapper[4781]: I1202 10:15:00.155808 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 02 10:15:00 crc kubenswrapper[4781]: I1202 10:15:00.156101 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 02 10:15:00 crc kubenswrapper[4781]: I1202 10:15:00.183582 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411175-9gmd4"]
Dec 02 10:15:00 crc kubenswrapper[4781]: I1202 10:15:00.325433 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rc58k\" (UniqueName: \"kubernetes.io/projected/3ddd7ae9-72b4-4cd4-8aed-7bba46a744b1-kube-api-access-rc58k\") pod \"collect-profiles-29411175-9gmd4\" (UID: \"3ddd7ae9-72b4-4cd4-8aed-7bba46a744b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411175-9gmd4"
Dec 02 10:15:00 crc kubenswrapper[4781]: I1202 10:15:00.325529 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3ddd7ae9-72b4-4cd4-8aed-7bba46a744b1-config-volume\") pod \"collect-profiles-29411175-9gmd4\" (UID: \"3ddd7ae9-72b4-4cd4-8aed-7bba46a744b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411175-9gmd4"
Dec 02 10:15:00 crc kubenswrapper[4781]: I1202 10:15:00.325745 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3ddd7ae9-72b4-4cd4-8aed-7bba46a744b1-secret-volume\") pod \"collect-profiles-29411175-9gmd4\" (UID: \"3ddd7ae9-72b4-4cd4-8aed-7bba46a744b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411175-9gmd4"
Dec 02 10:15:00 crc kubenswrapper[4781]: I1202 10:15:00.428214 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rc58k\" (UniqueName: \"kubernetes.io/projected/3ddd7ae9-72b4-4cd4-8aed-7bba46a744b1-kube-api-access-rc58k\") pod \"collect-profiles-29411175-9gmd4\" (UID: \"3ddd7ae9-72b4-4cd4-8aed-7bba46a744b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411175-9gmd4"
Dec 02 10:15:00 crc kubenswrapper[4781]: I1202 10:15:00.428321 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3ddd7ae9-72b4-4cd4-8aed-7bba46a744b1-config-volume\") pod \"collect-profiles-29411175-9gmd4\" (UID: \"3ddd7ae9-72b4-4cd4-8aed-7bba46a744b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411175-9gmd4"
Dec 02 10:15:00 crc kubenswrapper[4781]: I1202 10:15:00.428387 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3ddd7ae9-72b4-4cd4-8aed-7bba46a744b1-secret-volume\") pod \"collect-profiles-29411175-9gmd4\" (UID: \"3ddd7ae9-72b4-4cd4-8aed-7bba46a744b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411175-9gmd4"
Dec 02 10:15:00 crc kubenswrapper[4781]: I1202 10:15:00.429602 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3ddd7ae9-72b4-4cd4-8aed-7bba46a744b1-config-volume\") pod \"collect-profiles-29411175-9gmd4\" (UID: \"3ddd7ae9-72b4-4cd4-8aed-7bba46a744b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411175-9gmd4"
Dec 02 10:15:00 crc kubenswrapper[4781]: I1202 10:15:00.445605 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3ddd7ae9-72b4-4cd4-8aed-7bba46a744b1-secret-volume\") pod \"collect-profiles-29411175-9gmd4\" (UID: \"3ddd7ae9-72b4-4cd4-8aed-7bba46a744b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411175-9gmd4"
Dec 02 10:15:00 crc kubenswrapper[4781]: I1202 10:15:00.447412 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rc58k\" (UniqueName: \"kubernetes.io/projected/3ddd7ae9-72b4-4cd4-8aed-7bba46a744b1-kube-api-access-rc58k\") pod \"collect-profiles-29411175-9gmd4\" (UID: \"3ddd7ae9-72b4-4cd4-8aed-7bba46a744b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411175-9gmd4"
Dec 02 10:15:00 crc kubenswrapper[4781]: I1202 10:15:00.483544 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411175-9gmd4"
Dec 02 10:15:00 crc kubenswrapper[4781]: I1202 10:15:00.936163 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411175-9gmd4"]
Dec 02 10:15:01 crc kubenswrapper[4781]: I1202 10:15:01.640644 4781 generic.go:334] "Generic (PLEG): container finished" podID="3ddd7ae9-72b4-4cd4-8aed-7bba46a744b1" containerID="1ea596ee67bf29c98bec390a6a934b08068031afde8ee4fcda6d9f1779c29f58" exitCode=0
Dec 02 10:15:01 crc kubenswrapper[4781]: I1202 10:15:01.640702 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411175-9gmd4" event={"ID":"3ddd7ae9-72b4-4cd4-8aed-7bba46a744b1","Type":"ContainerDied","Data":"1ea596ee67bf29c98bec390a6a934b08068031afde8ee4fcda6d9f1779c29f58"}
Dec 02 10:15:01 crc kubenswrapper[4781]: I1202 10:15:01.640874 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411175-9gmd4" event={"ID":"3ddd7ae9-72b4-4cd4-8aed-7bba46a744b1","Type":"ContainerStarted","Data":"7e00e0f8f21ba8d97e7c7274d2e7f096f60c856727bc95d9c33ebeb6090b6aa7"}
Dec 02 10:15:03 crc kubenswrapper[4781]: I1202 10:15:03.072386 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411175-9gmd4"
Dec 02 10:15:03 crc kubenswrapper[4781]: I1202 10:15:03.095796 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3ddd7ae9-72b4-4cd4-8aed-7bba46a744b1-secret-volume\") pod \"3ddd7ae9-72b4-4cd4-8aed-7bba46a744b1\" (UID: \"3ddd7ae9-72b4-4cd4-8aed-7bba46a744b1\") "
Dec 02 10:15:03 crc kubenswrapper[4781]: I1202 10:15:03.096813 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3ddd7ae9-72b4-4cd4-8aed-7bba46a744b1-config-volume\") pod \"3ddd7ae9-72b4-4cd4-8aed-7bba46a744b1\" (UID: \"3ddd7ae9-72b4-4cd4-8aed-7bba46a744b1\") "
Dec 02 10:15:03 crc kubenswrapper[4781]: I1202 10:15:03.096913 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rc58k\" (UniqueName: \"kubernetes.io/projected/3ddd7ae9-72b4-4cd4-8aed-7bba46a744b1-kube-api-access-rc58k\") pod \"3ddd7ae9-72b4-4cd4-8aed-7bba46a744b1\" (UID: \"3ddd7ae9-72b4-4cd4-8aed-7bba46a744b1\") "
Dec 02 10:15:03 crc kubenswrapper[4781]: I1202 10:15:03.097626 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ddd7ae9-72b4-4cd4-8aed-7bba46a744b1-config-volume" (OuterVolumeSpecName: "config-volume") pod "3ddd7ae9-72b4-4cd4-8aed-7bba46a744b1" (UID: "3ddd7ae9-72b4-4cd4-8aed-7bba46a744b1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 10:15:03 crc kubenswrapper[4781]: I1202 10:15:03.142599 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ddd7ae9-72b4-4cd4-8aed-7bba46a744b1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3ddd7ae9-72b4-4cd4-8aed-7bba46a744b1" (UID: "3ddd7ae9-72b4-4cd4-8aed-7bba46a744b1"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 10:15:03 crc kubenswrapper[4781]: I1202 10:15:03.142826 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ddd7ae9-72b4-4cd4-8aed-7bba46a744b1-kube-api-access-rc58k" (OuterVolumeSpecName: "kube-api-access-rc58k") pod "3ddd7ae9-72b4-4cd4-8aed-7bba46a744b1" (UID: "3ddd7ae9-72b4-4cd4-8aed-7bba46a744b1"). InnerVolumeSpecName "kube-api-access-rc58k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 10:15:03 crc kubenswrapper[4781]: I1202 10:15:03.199020 4781 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3ddd7ae9-72b4-4cd4-8aed-7bba46a744b1-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 02 10:15:03 crc kubenswrapper[4781]: I1202 10:15:03.199053 4781 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3ddd7ae9-72b4-4cd4-8aed-7bba46a744b1-config-volume\") on node \"crc\" DevicePath \"\""
Dec 02 10:15:03 crc kubenswrapper[4781]: I1202 10:15:03.199064 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rc58k\" (UniqueName: \"kubernetes.io/projected/3ddd7ae9-72b4-4cd4-8aed-7bba46a744b1-kube-api-access-rc58k\") on node \"crc\" DevicePath \"\""
Dec 02 10:15:03 crc kubenswrapper[4781]: I1202 10:15:03.660556 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411175-9gmd4" event={"ID":"3ddd7ae9-72b4-4cd4-8aed-7bba46a744b1","Type":"ContainerDied","Data":"7e00e0f8f21ba8d97e7c7274d2e7f096f60c856727bc95d9c33ebeb6090b6aa7"}
Dec 02 10:15:03 crc kubenswrapper[4781]: I1202 10:15:03.660869 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e00e0f8f21ba8d97e7c7274d2e7f096f60c856727bc95d9c33ebeb6090b6aa7"
Dec 02 10:15:03 crc kubenswrapper[4781]: I1202 10:15:03.660638 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411175-9gmd4"
Dec 02 10:15:04 crc kubenswrapper[4781]: I1202 10:15:04.140821 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411130-s9jw5"]
Dec 02 10:15:04 crc kubenswrapper[4781]: I1202 10:15:04.148413 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411130-s9jw5"]
Dec 02 10:15:05 crc kubenswrapper[4781]: I1202 10:15:05.513301 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bef575f2-653f-42b4-a40a-33610a164402" path="/var/lib/kubelet/pods/bef575f2-653f-42b4-a40a-33610a164402/volumes"
pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:15:31 crc kubenswrapper[4781]: I1202 10:15:31.499670 4781 scope.go:117] "RemoveContainer" containerID="a460aa145b859b67418ac19afdfd646e23ad888c821073566f356df84809b0ea" Dec 02 10:15:31 crc kubenswrapper[4781]: E1202 10:15:31.500503 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:15:44 crc kubenswrapper[4781]: I1202 10:15:44.500227 4781 scope.go:117] "RemoveContainer" containerID="a460aa145b859b67418ac19afdfd646e23ad888c821073566f356df84809b0ea" Dec 02 10:15:44 crc kubenswrapper[4781]: E1202 10:15:44.501177 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:15:45 crc kubenswrapper[4781]: I1202 10:15:45.036794 4781 scope.go:117] "RemoveContainer" containerID="ee52b9632095920fd4cb794cb36f983e8884d348294428288218946fddf7fe31" Dec 02 10:15:58 crc kubenswrapper[4781]: I1202 10:15:58.500224 4781 scope.go:117] "RemoveContainer" containerID="a460aa145b859b67418ac19afdfd646e23ad888c821073566f356df84809b0ea" Dec 02 10:15:58 crc kubenswrapper[4781]: E1202 10:15:58.501133 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:16:12 crc kubenswrapper[4781]: I1202 10:16:12.499573 4781 scope.go:117] "RemoveContainer" containerID="a460aa145b859b67418ac19afdfd646e23ad888c821073566f356df84809b0ea" Dec 02 10:16:13 crc kubenswrapper[4781]: I1202 10:16:13.260471 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" event={"ID":"e10258da-dad3-4df8-82c2-9d9438493a3d","Type":"ContainerStarted","Data":"4e497409fcbb803030a10c1c523917978b588e2d6baad2c22809c97c56aec2e5"} Dec 02 10:18:30 crc kubenswrapper[4781]: I1202 10:18:30.412096 4781 patch_prober.go:28] interesting pod/machine-config-daemon-pzntm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 10:18:30 crc kubenswrapper[4781]: I1202 10:18:30.413129 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Dec 02 10:19:00 crc kubenswrapper[4781]: I1202 10:19:00.412208 4781 patch_prober.go:28] interesting pod/machine-config-daemon-pzntm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 10:19:00 crc kubenswrapper[4781]: I1202 10:19:00.413616 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 10:19:28 crc kubenswrapper[4781]: I1202 10:19:28.165841 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-khwt8"] Dec 02 10:19:28 crc kubenswrapper[4781]: E1202 10:19:28.166904 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ddd7ae9-72b4-4cd4-8aed-7bba46a744b1" containerName="collect-profiles" Dec 02 10:19:28 crc kubenswrapper[4781]: I1202 10:19:28.166945 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ddd7ae9-72b4-4cd4-8aed-7bba46a744b1" containerName="collect-profiles" Dec 02 10:19:28 crc kubenswrapper[4781]: I1202 10:19:28.167193 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ddd7ae9-72b4-4cd4-8aed-7bba46a744b1" containerName="collect-profiles" Dec 02 10:19:28 crc kubenswrapper[4781]: I1202 10:19:28.168743 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-khwt8" Dec 02 10:19:28 crc kubenswrapper[4781]: I1202 10:19:28.176873 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-khwt8"] Dec 02 10:19:28 crc kubenswrapper[4781]: I1202 10:19:28.332346 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dp7zh\" (UniqueName: \"kubernetes.io/projected/c84f12f5-b188-4f66-904e-fe5afd92fcec-kube-api-access-dp7zh\") pod \"redhat-marketplace-khwt8\" (UID: \"c84f12f5-b188-4f66-904e-fe5afd92fcec\") " pod="openshift-marketplace/redhat-marketplace-khwt8" Dec 02 10:19:28 crc kubenswrapper[4781]: I1202 10:19:28.332429 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c84f12f5-b188-4f66-904e-fe5afd92fcec-catalog-content\") pod \"redhat-marketplace-khwt8\" (UID: \"c84f12f5-b188-4f66-904e-fe5afd92fcec\") " pod="openshift-marketplace/redhat-marketplace-khwt8" Dec 02 10:19:28 crc kubenswrapper[4781]: I1202 10:19:28.332564 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c84f12f5-b188-4f66-904e-fe5afd92fcec-utilities\") pod \"redhat-marketplace-khwt8\" (UID: \"c84f12f5-b188-4f66-904e-fe5afd92fcec\") " pod="openshift-marketplace/redhat-marketplace-khwt8" Dec 02 10:19:28 crc kubenswrapper[4781]: I1202 10:19:28.434576 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dp7zh\" (UniqueName: \"kubernetes.io/projected/c84f12f5-b188-4f66-904e-fe5afd92fcec-kube-api-access-dp7zh\") pod \"redhat-marketplace-khwt8\" (UID: \"c84f12f5-b188-4f66-904e-fe5afd92fcec\") " pod="openshift-marketplace/redhat-marketplace-khwt8" 
Dec 02 10:19:28 crc kubenswrapper[4781]: I1202 10:19:28.165841 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-khwt8"]
Dec 02 10:19:28 crc kubenswrapper[4781]: E1202 10:19:28.166904 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ddd7ae9-72b4-4cd4-8aed-7bba46a744b1" containerName="collect-profiles"
Dec 02 10:19:28 crc kubenswrapper[4781]: I1202 10:19:28.166945 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ddd7ae9-72b4-4cd4-8aed-7bba46a744b1" containerName="collect-profiles"
Dec 02 10:19:28 crc kubenswrapper[4781]: I1202 10:19:28.167193 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ddd7ae9-72b4-4cd4-8aed-7bba46a744b1" containerName="collect-profiles"
Dec 02 10:19:28 crc kubenswrapper[4781]: I1202 10:19:28.168743 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-khwt8"
Dec 02 10:19:28 crc kubenswrapper[4781]: I1202 10:19:28.176873 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-khwt8"]
Dec 02 10:19:28 crc kubenswrapper[4781]: I1202 10:19:28.332346 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dp7zh\" (UniqueName: \"kubernetes.io/projected/c84f12f5-b188-4f66-904e-fe5afd92fcec-kube-api-access-dp7zh\") pod \"redhat-marketplace-khwt8\" (UID: \"c84f12f5-b188-4f66-904e-fe5afd92fcec\") " pod="openshift-marketplace/redhat-marketplace-khwt8"
Dec 02 10:19:28 crc kubenswrapper[4781]: I1202 10:19:28.332429 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c84f12f5-b188-4f66-904e-fe5afd92fcec-catalog-content\") pod \"redhat-marketplace-khwt8\" (UID: \"c84f12f5-b188-4f66-904e-fe5afd92fcec\") " pod="openshift-marketplace/redhat-marketplace-khwt8"
Dec 02 10:19:28 crc kubenswrapper[4781]: I1202 10:19:28.332564 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c84f12f5-b188-4f66-904e-fe5afd92fcec-utilities\") pod \"redhat-marketplace-khwt8\" (UID: \"c84f12f5-b188-4f66-904e-fe5afd92fcec\") " pod="openshift-marketplace/redhat-marketplace-khwt8"
Dec 02 10:19:28 crc kubenswrapper[4781]: I1202 10:19:28.434576 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dp7zh\" (UniqueName: \"kubernetes.io/projected/c84f12f5-b188-4f66-904e-fe5afd92fcec-kube-api-access-dp7zh\") pod \"redhat-marketplace-khwt8\" (UID: \"c84f12f5-b188-4f66-904e-fe5afd92fcec\") " pod="openshift-marketplace/redhat-marketplace-khwt8"
Dec 02 10:19:28 crc kubenswrapper[4781]: I1202 10:19:28.434652 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c84f12f5-b188-4f66-904e-fe5afd92fcec-catalog-content\") pod \"redhat-marketplace-khwt8\" (UID: \"c84f12f5-b188-4f66-904e-fe5afd92fcec\") " pod="openshift-marketplace/redhat-marketplace-khwt8"
Dec 02 10:19:28 crc kubenswrapper[4781]: I1202 10:19:28.434711 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c84f12f5-b188-4f66-904e-fe5afd92fcec-utilities\") pod \"redhat-marketplace-khwt8\" (UID: \"c84f12f5-b188-4f66-904e-fe5afd92fcec\") " pod="openshift-marketplace/redhat-marketplace-khwt8"
Dec 02 10:19:28 crc kubenswrapper[4781]: I1202 10:19:28.435220 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c84f12f5-b188-4f66-904e-fe5afd92fcec-utilities\") pod \"redhat-marketplace-khwt8\" (UID: \"c84f12f5-b188-4f66-904e-fe5afd92fcec\") " pod="openshift-marketplace/redhat-marketplace-khwt8"
Dec 02 10:19:28 crc kubenswrapper[4781]: I1202 10:19:28.435403 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c84f12f5-b188-4f66-904e-fe5afd92fcec-catalog-content\") pod \"redhat-marketplace-khwt8\" (UID: \"c84f12f5-b188-4f66-904e-fe5afd92fcec\") " pod="openshift-marketplace/redhat-marketplace-khwt8"
Dec 02 10:19:28 crc kubenswrapper[4781]: I1202 10:19:28.470898 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dp7zh\" (UniqueName: \"kubernetes.io/projected/c84f12f5-b188-4f66-904e-fe5afd92fcec-kube-api-access-dp7zh\") pod \"redhat-marketplace-khwt8\" (UID: \"c84f12f5-b188-4f66-904e-fe5afd92fcec\") " pod="openshift-marketplace/redhat-marketplace-khwt8"
Dec 02 10:19:28 crc kubenswrapper[4781]: I1202 10:19:28.493720 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-khwt8"
Dec 02 10:19:29 crc kubenswrapper[4781]: I1202 10:19:29.083279 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-khwt8"]
Dec 02 10:19:30 crc kubenswrapper[4781]: I1202 10:19:30.055884 4781 generic.go:334] "Generic (PLEG): container finished" podID="c84f12f5-b188-4f66-904e-fe5afd92fcec" containerID="a95ecf9598e108681a1d33007f5a088810b869d31b4d0ff913bcbaa855706e63" exitCode=0
Dec 02 10:19:30 crc kubenswrapper[4781]: I1202 10:19:30.055986 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-khwt8" event={"ID":"c84f12f5-b188-4f66-904e-fe5afd92fcec","Type":"ContainerDied","Data":"a95ecf9598e108681a1d33007f5a088810b869d31b4d0ff913bcbaa855706e63"}
Dec 02 10:19:30 crc kubenswrapper[4781]: I1202 10:19:30.057056 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-khwt8" event={"ID":"c84f12f5-b188-4f66-904e-fe5afd92fcec","Type":"ContainerStarted","Data":"242ffa89a28187967d679d0f3b63b57abeaac0347113274a853e1fc7b442d6c8"}
Dec 02 10:19:30 crc kubenswrapper[4781]: I1202 10:19:30.058539 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 02 10:19:30 crc kubenswrapper[4781]: I1202 10:19:30.411792 4781 patch_prober.go:28] interesting pod/machine-config-daemon-pzntm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 10:19:30 crc kubenswrapper[4781]: I1202 10:19:30.411846 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 10:19:30 crc kubenswrapper[4781]: I1202 10:19:30.411883 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pzntm"
Dec 02 10:19:30 crc kubenswrapper[4781]: I1202 10:19:30.412615 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4e497409fcbb803030a10c1c523917978b588e2d6baad2c22809c97c56aec2e5"} pod="openshift-machine-config-operator/machine-config-daemon-pzntm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 02 10:19:30 crc kubenswrapper[4781]: I1202 10:19:30.412674 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" containerID="cri-o://4e497409fcbb803030a10c1c523917978b588e2d6baad2c22809c97c56aec2e5" gracePeriod=600
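[Note: the 10:19:30 failure is the third in a row (10:18:30, 10:19:00, 10:19:30), which crosses the liveness probe's failure threshold (three consecutive failures by default), so the kubelet marks the probe "unhealthy" and kills the container with the pod's termination grace period, gracePeriod=600 here. The kill pattern is SIGTERM, wait out the grace period, then SIGKILL; in miniature, with sleep standing in for the container and the grace period shortened for the example:]

    import signal
    import subprocess

    proc = subprocess.Popen(["sleep", "600"])   # stand-in for the container
    proc.send_signal(signal.SIGTERM)            # polite stop, as the kubelet does
    try:
        proc.wait(timeout=10)                   # gracePeriod=600 in the log
    except subprocess.TimeoutExpired:
        proc.kill()                             # SIGKILL once the grace period lapses
    print("exit code:", proc.returncode)        # -15 if SIGTERM was honoured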
event={"ID":"e10258da-dad3-4df8-82c2-9d9438493a3d","Type":"ContainerDied","Data":"4e497409fcbb803030a10c1c523917978b588e2d6baad2c22809c97c56aec2e5"} Dec 02 10:19:31 crc kubenswrapper[4781]: I1202 10:19:31.067436 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" event={"ID":"e10258da-dad3-4df8-82c2-9d9438493a3d","Type":"ContainerStarted","Data":"3108739d92f37bf346834eaffd456a3eb9cfc18107a661e848468628c375b388"} Dec 02 10:19:31 crc kubenswrapper[4781]: I1202 10:19:31.067454 4781 scope.go:117] "RemoveContainer" containerID="a460aa145b859b67418ac19afdfd646e23ad888c821073566f356df84809b0ea" Dec 02 10:19:32 crc kubenswrapper[4781]: I1202 10:19:32.077615 4781 generic.go:334] "Generic (PLEG): container finished" podID="c84f12f5-b188-4f66-904e-fe5afd92fcec" containerID="aa1f91f65c2f081968ea3839cc434fd825fdcb506d57bcbd7ceac5256bca3fa5" exitCode=0 Dec 02 10:19:32 crc kubenswrapper[4781]: I1202 10:19:32.077667 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-khwt8" event={"ID":"c84f12f5-b188-4f66-904e-fe5afd92fcec","Type":"ContainerDied","Data":"aa1f91f65c2f081968ea3839cc434fd825fdcb506d57bcbd7ceac5256bca3fa5"} Dec 02 10:19:33 crc kubenswrapper[4781]: I1202 10:19:33.095461 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-khwt8" event={"ID":"c84f12f5-b188-4f66-904e-fe5afd92fcec","Type":"ContainerStarted","Data":"78d04e089cea5bc78110518bbac8988f65fff4603efa737c4e33f32b8e00de15"} Dec 02 10:19:33 crc kubenswrapper[4781]: I1202 10:19:33.117770 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-khwt8" podStartSLOduration=2.62202547 podStartE2EDuration="5.117750707s" podCreationTimestamp="2025-12-02 10:19:28 +0000 UTC" firstStartedPulling="2025-12-02 10:19:30.058158847 +0000 UTC m=+3532.882032736" lastFinishedPulling="2025-12-02 10:19:32.553884094 +0000 UTC m=+3535.377757973" observedRunningTime="2025-12-02 10:19:33.112168807 +0000 UTC m=+3535.936042686" watchObservedRunningTime="2025-12-02 10:19:33.117750707 +0000 UTC m=+3535.941624586" Dec 02 10:19:37 crc kubenswrapper[4781]: I1202 10:19:37.276599 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dhncm"] Dec 02 10:19:37 crc kubenswrapper[4781]: I1202 10:19:37.282954 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dhncm" Dec 02 10:19:37 crc kubenswrapper[4781]: I1202 10:19:37.296109 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndzx7\" (UniqueName: \"kubernetes.io/projected/8267358b-15b0-44fd-bd0a-73438bcd7ded-kube-api-access-ndzx7\") pod \"certified-operators-dhncm\" (UID: \"8267358b-15b0-44fd-bd0a-73438bcd7ded\") " pod="openshift-marketplace/certified-operators-dhncm" Dec 02 10:19:37 crc kubenswrapper[4781]: I1202 10:19:37.296197 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8267358b-15b0-44fd-bd0a-73438bcd7ded-utilities\") pod \"certified-operators-dhncm\" (UID: \"8267358b-15b0-44fd-bd0a-73438bcd7ded\") " pod="openshift-marketplace/certified-operators-dhncm" Dec 02 10:19:37 crc kubenswrapper[4781]: I1202 10:19:37.296272 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8267358b-15b0-44fd-bd0a-73438bcd7ded-catalog-content\") pod \"certified-operators-dhncm\" (UID: \"8267358b-15b0-44fd-bd0a-73438bcd7ded\") " pod="openshift-marketplace/certified-operators-dhncm" Dec 02 10:19:37 crc kubenswrapper[4781]: I1202 10:19:37.339495 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dhncm"] Dec 02 10:19:37 crc kubenswrapper[4781]: I1202 10:19:37.397727 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndzx7\" (UniqueName: \"kubernetes.io/projected/8267358b-15b0-44fd-bd0a-73438bcd7ded-kube-api-access-ndzx7\") pod \"certified-operators-dhncm\" (UID: \"8267358b-15b0-44fd-bd0a-73438bcd7ded\") " pod="openshift-marketplace/certified-operators-dhncm" Dec 02 10:19:37 crc kubenswrapper[4781]: I1202 10:19:37.397792 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8267358b-15b0-44fd-bd0a-73438bcd7ded-utilities\") pod \"certified-operators-dhncm\" (UID: \"8267358b-15b0-44fd-bd0a-73438bcd7ded\") " pod="openshift-marketplace/certified-operators-dhncm" Dec 02 10:19:37 crc kubenswrapper[4781]: I1202 10:19:37.397862 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8267358b-15b0-44fd-bd0a-73438bcd7ded-catalog-content\") pod \"certified-operators-dhncm\" (UID: \"8267358b-15b0-44fd-bd0a-73438bcd7ded\") " pod="openshift-marketplace/certified-operators-dhncm" Dec 02 10:19:37 crc kubenswrapper[4781]: I1202 10:19:37.398283 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8267358b-15b0-44fd-bd0a-73438bcd7ded-utilities\") pod \"certified-operators-dhncm\" (UID: \"8267358b-15b0-44fd-bd0a-73438bcd7ded\") " pod="openshift-marketplace/certified-operators-dhncm" Dec 02 10:19:37 crc kubenswrapper[4781]: I1202 10:19:37.398312 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8267358b-15b0-44fd-bd0a-73438bcd7ded-catalog-content\") pod \"certified-operators-dhncm\" (UID: \"8267358b-15b0-44fd-bd0a-73438bcd7ded\") " pod="openshift-marketplace/certified-operators-dhncm" Dec 02 10:19:37 crc kubenswrapper[4781]: I1202 10:19:37.424014 4781 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ndzx7\" (UniqueName: \"kubernetes.io/projected/8267358b-15b0-44fd-bd0a-73438bcd7ded-kube-api-access-ndzx7\") pod \"certified-operators-dhncm\" (UID: \"8267358b-15b0-44fd-bd0a-73438bcd7ded\") " pod="openshift-marketplace/certified-operators-dhncm" Dec 02 10:19:37 crc kubenswrapper[4781]: I1202 10:19:37.645285 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dhncm" Dec 02 10:19:38 crc kubenswrapper[4781]: I1202 10:19:38.180459 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dhncm"] Dec 02 10:19:38 crc kubenswrapper[4781]: I1202 10:19:38.493993 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-khwt8" Dec 02 10:19:38 crc kubenswrapper[4781]: I1202 10:19:38.494518 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-khwt8" Dec 02 10:19:38 crc kubenswrapper[4781]: I1202 10:19:38.587965 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-khwt8" Dec 02 10:19:39 crc kubenswrapper[4781]: I1202 10:19:39.142824 4781 generic.go:334] "Generic (PLEG): container finished" podID="8267358b-15b0-44fd-bd0a-73438bcd7ded" containerID="8fe393ecbb6dafbfcbe2eb431f539c16e347917ed70eccd7d7eaef931c2d6d5b" exitCode=0 Dec 02 10:19:39 crc kubenswrapper[4781]: I1202 10:19:39.142881 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dhncm" event={"ID":"8267358b-15b0-44fd-bd0a-73438bcd7ded","Type":"ContainerDied","Data":"8fe393ecbb6dafbfcbe2eb431f539c16e347917ed70eccd7d7eaef931c2d6d5b"} Dec 02 10:19:39 crc kubenswrapper[4781]: I1202 10:19:39.143211 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dhncm" event={"ID":"8267358b-15b0-44fd-bd0a-73438bcd7ded","Type":"ContainerStarted","Data":"e402c6f1c4f8c7980bb21f3beea83acf817f06c181a79d1cd461e36902a81817"} Dec 02 10:19:39 crc kubenswrapper[4781]: I1202 10:19:39.227059 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-khwt8" Dec 02 10:19:40 crc kubenswrapper[4781]: I1202 10:19:40.854500 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-khwt8"] Dec 02 10:19:41 crc kubenswrapper[4781]: I1202 10:19:41.161600 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-khwt8" podUID="c84f12f5-b188-4f66-904e-fe5afd92fcec" containerName="registry-server" containerID="cri-o://78d04e089cea5bc78110518bbac8988f65fff4603efa737c4e33f32b8e00de15" gracePeriod=2 Dec 02 10:19:41 crc kubenswrapper[4781]: I1202 10:19:41.677845 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-khwt8"
Dec 02 10:19:41 crc kubenswrapper[4781]: I1202 10:19:41.790679 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c84f12f5-b188-4f66-904e-fe5afd92fcec-catalog-content\") pod \"c84f12f5-b188-4f66-904e-fe5afd92fcec\" (UID: \"c84f12f5-b188-4f66-904e-fe5afd92fcec\") "
Dec 02 10:19:41 crc kubenswrapper[4781]: I1202 10:19:41.790747 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c84f12f5-b188-4f66-904e-fe5afd92fcec-utilities\") pod \"c84f12f5-b188-4f66-904e-fe5afd92fcec\" (UID: \"c84f12f5-b188-4f66-904e-fe5afd92fcec\") "
Dec 02 10:19:41 crc kubenswrapper[4781]: I1202 10:19:41.790839 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dp7zh\" (UniqueName: \"kubernetes.io/projected/c84f12f5-b188-4f66-904e-fe5afd92fcec-kube-api-access-dp7zh\") pod \"c84f12f5-b188-4f66-904e-fe5afd92fcec\" (UID: \"c84f12f5-b188-4f66-904e-fe5afd92fcec\") "
Dec 02 10:19:41 crc kubenswrapper[4781]: I1202 10:19:41.792672 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c84f12f5-b188-4f66-904e-fe5afd92fcec-utilities" (OuterVolumeSpecName: "utilities") pod "c84f12f5-b188-4f66-904e-fe5afd92fcec" (UID: "c84f12f5-b188-4f66-904e-fe5afd92fcec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 10:19:41 crc kubenswrapper[4781]: I1202 10:19:41.801147 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c84f12f5-b188-4f66-904e-fe5afd92fcec-kube-api-access-dp7zh" (OuterVolumeSpecName: "kube-api-access-dp7zh") pod "c84f12f5-b188-4f66-904e-fe5afd92fcec" (UID: "c84f12f5-b188-4f66-904e-fe5afd92fcec"). InnerVolumeSpecName "kube-api-access-dp7zh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 10:19:41 crc kubenswrapper[4781]: I1202 10:19:41.809908 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c84f12f5-b188-4f66-904e-fe5afd92fcec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c84f12f5-b188-4f66-904e-fe5afd92fcec" (UID: "c84f12f5-b188-4f66-904e-fe5afd92fcec"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 10:19:41 crc kubenswrapper[4781]: I1202 10:19:41.893043 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dp7zh\" (UniqueName: \"kubernetes.io/projected/c84f12f5-b188-4f66-904e-fe5afd92fcec-kube-api-access-dp7zh\") on node \"crc\" DevicePath \"\""
Dec 02 10:19:41 crc kubenswrapper[4781]: I1202 10:19:41.893081 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c84f12f5-b188-4f66-904e-fe5afd92fcec-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 02 10:19:41 crc kubenswrapper[4781]: I1202 10:19:41.893095 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c84f12f5-b188-4f66-904e-fe5afd92fcec-utilities\") on node \"crc\" DevicePath \"\""
Dec 02 10:19:42 crc kubenswrapper[4781]: I1202 10:19:42.171120 4781 generic.go:334] "Generic (PLEG): container finished" podID="c84f12f5-b188-4f66-904e-fe5afd92fcec" containerID="78d04e089cea5bc78110518bbac8988f65fff4603efa737c4e33f32b8e00de15" exitCode=0
Dec 02 10:19:42 crc kubenswrapper[4781]: I1202 10:19:42.171164 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-khwt8" event={"ID":"c84f12f5-b188-4f66-904e-fe5afd92fcec","Type":"ContainerDied","Data":"78d04e089cea5bc78110518bbac8988f65fff4603efa737c4e33f32b8e00de15"}
Dec 02 10:19:42 crc kubenswrapper[4781]: I1202 10:19:42.171217 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-khwt8" event={"ID":"c84f12f5-b188-4f66-904e-fe5afd92fcec","Type":"ContainerDied","Data":"242ffa89a28187967d679d0f3b63b57abeaac0347113274a853e1fc7b442d6c8"}
Dec 02 10:19:42 crc kubenswrapper[4781]: I1202 10:19:42.171212 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-khwt8"
Dec 02 10:19:42 crc kubenswrapper[4781]: I1202 10:19:42.171238 4781 scope.go:117] "RemoveContainer" containerID="78d04e089cea5bc78110518bbac8988f65fff4603efa737c4e33f32b8e00de15"
Dec 02 10:19:42 crc kubenswrapper[4781]: I1202 10:19:42.205799 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-khwt8"]
Dec 02 10:19:42 crc kubenswrapper[4781]: I1202 10:19:42.215719 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-khwt8"]
Dec 02 10:19:43 crc kubenswrapper[4781]: I1202 10:19:43.526098 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c84f12f5-b188-4f66-904e-fe5afd92fcec" path="/var/lib/kubelet/pods/c84f12f5-b188-4f66-904e-fe5afd92fcec/volumes"
Dec 02 10:19:45 crc kubenswrapper[4781]: I1202 10:19:45.306585 4781 scope.go:117] "RemoveContainer" containerID="aa1f91f65c2f081968ea3839cc434fd825fdcb506d57bcbd7ceac5256bca3fa5"
Dec 02 10:19:45 crc kubenswrapper[4781]: I1202 10:19:45.369872 4781 scope.go:117] "RemoveContainer" containerID="a95ecf9598e108681a1d33007f5a088810b869d31b4d0ff913bcbaa855706e63"
Dec 02 10:19:45 crc kubenswrapper[4781]: I1202 10:19:45.421026 4781 scope.go:117] "RemoveContainer" containerID="78d04e089cea5bc78110518bbac8988f65fff4603efa737c4e33f32b8e00de15"
Dec 02 10:19:45 crc kubenswrapper[4781]: E1202 10:19:45.421624 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78d04e089cea5bc78110518bbac8988f65fff4603efa737c4e33f32b8e00de15\": container with ID starting with 78d04e089cea5bc78110518bbac8988f65fff4603efa737c4e33f32b8e00de15 not found: ID does not exist" containerID="78d04e089cea5bc78110518bbac8988f65fff4603efa737c4e33f32b8e00de15"
Dec 02 10:19:45 crc kubenswrapper[4781]: I1202 10:19:45.421700 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78d04e089cea5bc78110518bbac8988f65fff4603efa737c4e33f32b8e00de15"} err="failed to get container status \"78d04e089cea5bc78110518bbac8988f65fff4603efa737c4e33f32b8e00de15\": rpc error: code = NotFound desc = could not find container \"78d04e089cea5bc78110518bbac8988f65fff4603efa737c4e33f32b8e00de15\": container with ID starting with 78d04e089cea5bc78110518bbac8988f65fff4603efa737c4e33f32b8e00de15 not found: ID does not exist"
Dec 02 10:19:45 crc kubenswrapper[4781]: I1202 10:19:45.421753 4781 scope.go:117] "RemoveContainer" containerID="aa1f91f65c2f081968ea3839cc434fd825fdcb506d57bcbd7ceac5256bca3fa5"
Dec 02 10:19:45 crc kubenswrapper[4781]: E1202 10:19:45.422370 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa1f91f65c2f081968ea3839cc434fd825fdcb506d57bcbd7ceac5256bca3fa5\": container with ID starting with aa1f91f65c2f081968ea3839cc434fd825fdcb506d57bcbd7ceac5256bca3fa5 not found: ID does not exist" containerID="aa1f91f65c2f081968ea3839cc434fd825fdcb506d57bcbd7ceac5256bca3fa5"
Dec 02 10:19:45 crc kubenswrapper[4781]: I1202 10:19:45.422430 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa1f91f65c2f081968ea3839cc434fd825fdcb506d57bcbd7ceac5256bca3fa5"} err="failed to get container status \"aa1f91f65c2f081968ea3839cc434fd825fdcb506d57bcbd7ceac5256bca3fa5\": rpc error: code = NotFound desc = could not find container \"aa1f91f65c2f081968ea3839cc434fd825fdcb506d57bcbd7ceac5256bca3fa5\": container with ID starting with aa1f91f65c2f081968ea3839cc434fd825fdcb506d57bcbd7ceac5256bca3fa5 not found: ID does not exist"
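The `DeleteContainer returned error ... NotFound` entries above (and the repeats just below for the remaining container IDs) are the benign tail of container cleanup: by the time the kubelet re-checks a container it has already asked CRI-O to remove, the runtime no longer knows the ID, so the status lookup fails with gRPC `NotFound` and the deletor simply reports it. A client that treats `NotFound` as "already deleted" converges instead of failing. A minimal sketch of that idempotency check, assuming a hypothetical `runtimeClient` interface (not the kubelet's actual code):

```go
// Sketch: treating CRI "NotFound" as success when removing a container.
// runtimeClient is a hypothetical stand-in for a CRI connection.
package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

type runtimeClient interface {
	RemoveContainer(id string) error
}

// removeIfPresent converges on "container gone": a NotFound from the runtime
// means a previous attempt (or the runtime itself) already removed it.
func removeIfPresent(rt runtimeClient, id string) error {
	err := rt.RemoveContainer(id)
	if err == nil || status.Code(err) == codes.NotFound {
		return nil // already gone; nothing left to do
	}
	return fmt.Errorf("removing container %s: %w", id, err)
}

// fakeRuntime answers like the cri-o responses in the log above.
type fakeRuntime struct{}

func (fakeRuntime) RemoveContainer(id string) error {
	return status.Errorf(codes.NotFound, "could not find container %q", id)
}

func main() {
	fmt.Println(removeIfPresent(fakeRuntime{}, "78d04e089cea")) // prints <nil>
}
```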
\"aa1f91f65c2f081968ea3839cc434fd825fdcb506d57bcbd7ceac5256bca3fa5\": container with ID starting with aa1f91f65c2f081968ea3839cc434fd825fdcb506d57bcbd7ceac5256bca3fa5 not found: ID does not exist" Dec 02 10:19:45 crc kubenswrapper[4781]: I1202 10:19:45.422469 4781 scope.go:117] "RemoveContainer" containerID="a95ecf9598e108681a1d33007f5a088810b869d31b4d0ff913bcbaa855706e63" Dec 02 10:19:45 crc kubenswrapper[4781]: E1202 10:19:45.422959 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a95ecf9598e108681a1d33007f5a088810b869d31b4d0ff913bcbaa855706e63\": container with ID starting with a95ecf9598e108681a1d33007f5a088810b869d31b4d0ff913bcbaa855706e63 not found: ID does not exist" containerID="a95ecf9598e108681a1d33007f5a088810b869d31b4d0ff913bcbaa855706e63" Dec 02 10:19:45 crc kubenswrapper[4781]: I1202 10:19:45.423007 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a95ecf9598e108681a1d33007f5a088810b869d31b4d0ff913bcbaa855706e63"} err="failed to get container status \"a95ecf9598e108681a1d33007f5a088810b869d31b4d0ff913bcbaa855706e63\": rpc error: code = NotFound desc = could not find container \"a95ecf9598e108681a1d33007f5a088810b869d31b4d0ff913bcbaa855706e63\": container with ID starting with a95ecf9598e108681a1d33007f5a088810b869d31b4d0ff913bcbaa855706e63 not found: ID does not exist" Dec 02 10:19:46 crc kubenswrapper[4781]: I1202 10:19:46.220406 4781 generic.go:334] "Generic (PLEG): container finished" podID="8267358b-15b0-44fd-bd0a-73438bcd7ded" containerID="49b642d7cca7075e2375058a91be692eb59afdd59fff9a9cb15fa2f36b555ccc" exitCode=0 Dec 02 10:19:46 crc kubenswrapper[4781]: I1202 10:19:46.220466 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dhncm" event={"ID":"8267358b-15b0-44fd-bd0a-73438bcd7ded","Type":"ContainerDied","Data":"49b642d7cca7075e2375058a91be692eb59afdd59fff9a9cb15fa2f36b555ccc"} Dec 02 10:19:47 crc kubenswrapper[4781]: I1202 10:19:47.230378 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dhncm" event={"ID":"8267358b-15b0-44fd-bd0a-73438bcd7ded","Type":"ContainerStarted","Data":"a29bfbfd62f371cacdba7b454fd95feb9ae54ae58d7b40a28be118fda6b6fa48"} Dec 02 10:19:47 crc kubenswrapper[4781]: I1202 10:19:47.249623 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dhncm" podStartSLOduration=2.600987834 podStartE2EDuration="10.24960309s" podCreationTimestamp="2025-12-02 10:19:37 +0000 UTC" firstStartedPulling="2025-12-02 10:19:39.145856567 +0000 UTC m=+3541.969730446" lastFinishedPulling="2025-12-02 10:19:46.794471823 +0000 UTC m=+3549.618345702" observedRunningTime="2025-12-02 10:19:47.246814195 +0000 UTC m=+3550.070688074" watchObservedRunningTime="2025-12-02 10:19:47.24960309 +0000 UTC m=+3550.073476969" Dec 02 10:19:47 crc kubenswrapper[4781]: I1202 10:19:47.645702 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dhncm" Dec 02 10:19:47 crc kubenswrapper[4781]: I1202 10:19:47.646077 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dhncm" Dec 02 10:19:48 crc kubenswrapper[4781]: I1202 10:19:48.805860 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-dhncm" 
podUID="8267358b-15b0-44fd-bd0a-73438bcd7ded" containerName="registry-server" probeResult="failure" output=< Dec 02 10:19:48 crc kubenswrapper[4781]: timeout: failed to connect service ":50051" within 1s Dec 02 10:19:48 crc kubenswrapper[4781]: > Dec 02 10:19:57 crc kubenswrapper[4781]: I1202 10:19:57.692851 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dhncm" Dec 02 10:19:57 crc kubenswrapper[4781]: I1202 10:19:57.771404 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dhncm" Dec 02 10:19:57 crc kubenswrapper[4781]: I1202 10:19:57.849675 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dhncm"] Dec 02 10:19:57 crc kubenswrapper[4781]: I1202 10:19:57.926845 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tzs7n"] Dec 02 10:19:57 crc kubenswrapper[4781]: I1202 10:19:57.927156 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tzs7n" podUID="14c3d9a9-9948-4be9-86a3-9def6cce4450" containerName="registry-server" containerID="cri-o://08d24f041d37fcd264fe63d3cb0ac6900cd974e11d0073e7bf738cb6b715ce59" gracePeriod=2 Dec 02 10:19:58 crc kubenswrapper[4781]: I1202 10:19:58.353560 4781 generic.go:334] "Generic (PLEG): container finished" podID="14c3d9a9-9948-4be9-86a3-9def6cce4450" containerID="08d24f041d37fcd264fe63d3cb0ac6900cd974e11d0073e7bf738cb6b715ce59" exitCode=0 Dec 02 10:19:58 crc kubenswrapper[4781]: I1202 10:19:58.354357 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tzs7n" event={"ID":"14c3d9a9-9948-4be9-86a3-9def6cce4450","Type":"ContainerDied","Data":"08d24f041d37fcd264fe63d3cb0ac6900cd974e11d0073e7bf738cb6b715ce59"} Dec 02 10:19:58 crc kubenswrapper[4781]: I1202 10:19:58.411130 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tzs7n" Dec 02 10:19:58 crc kubenswrapper[4781]: I1202 10:19:58.518779 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14c3d9a9-9948-4be9-86a3-9def6cce4450-utilities\") pod \"14c3d9a9-9948-4be9-86a3-9def6cce4450\" (UID: \"14c3d9a9-9948-4be9-86a3-9def6cce4450\") " Dec 02 10:19:58 crc kubenswrapper[4781]: I1202 10:19:58.518857 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14c3d9a9-9948-4be9-86a3-9def6cce4450-catalog-content\") pod \"14c3d9a9-9948-4be9-86a3-9def6cce4450\" (UID: \"14c3d9a9-9948-4be9-86a3-9def6cce4450\") " Dec 02 10:19:58 crc kubenswrapper[4781]: I1202 10:19:58.519037 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wktj4\" (UniqueName: \"kubernetes.io/projected/14c3d9a9-9948-4be9-86a3-9def6cce4450-kube-api-access-wktj4\") pod \"14c3d9a9-9948-4be9-86a3-9def6cce4450\" (UID: \"14c3d9a9-9948-4be9-86a3-9def6cce4450\") " Dec 02 10:19:58 crc kubenswrapper[4781]: I1202 10:19:58.519710 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14c3d9a9-9948-4be9-86a3-9def6cce4450-utilities" (OuterVolumeSpecName: "utilities") pod "14c3d9a9-9948-4be9-86a3-9def6cce4450" (UID: "14c3d9a9-9948-4be9-86a3-9def6cce4450"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 10:19:58 crc kubenswrapper[4781]: I1202 10:19:58.539001 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14c3d9a9-9948-4be9-86a3-9def6cce4450-kube-api-access-wktj4" (OuterVolumeSpecName: "kube-api-access-wktj4") pod "14c3d9a9-9948-4be9-86a3-9def6cce4450" (UID: "14c3d9a9-9948-4be9-86a3-9def6cce4450"). InnerVolumeSpecName "kube-api-access-wktj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 10:19:58 crc kubenswrapper[4781]: I1202 10:19:58.594078 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14c3d9a9-9948-4be9-86a3-9def6cce4450-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "14c3d9a9-9948-4be9-86a3-9def6cce4450" (UID: "14c3d9a9-9948-4be9-86a3-9def6cce4450"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 10:19:58 crc kubenswrapper[4781]: I1202 10:19:58.621465 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14c3d9a9-9948-4be9-86a3-9def6cce4450-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 10:19:58 crc kubenswrapper[4781]: I1202 10:19:58.621503 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14c3d9a9-9948-4be9-86a3-9def6cce4450-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 10:19:58 crc kubenswrapper[4781]: I1202 10:19:58.621516 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wktj4\" (UniqueName: \"kubernetes.io/projected/14c3d9a9-9948-4be9-86a3-9def6cce4450-kube-api-access-wktj4\") on node \"crc\" DevicePath \"\"" Dec 02 10:19:59 crc kubenswrapper[4781]: I1202 10:19:59.364836 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tzs7n" event={"ID":"14c3d9a9-9948-4be9-86a3-9def6cce4450","Type":"ContainerDied","Data":"79e40c8c56afe289cd55ddd6eb3e6c60fc704ee207d9eeda445332b48a2e137d"} Dec 02 10:19:59 crc kubenswrapper[4781]: I1202 10:19:59.365845 4781 scope.go:117] "RemoveContainer" containerID="08d24f041d37fcd264fe63d3cb0ac6900cd974e11d0073e7bf738cb6b715ce59" Dec 02 10:19:59 crc kubenswrapper[4781]: I1202 10:19:59.364871 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tzs7n" Dec 02 10:19:59 crc kubenswrapper[4781]: I1202 10:19:59.396648 4781 scope.go:117] "RemoveContainer" containerID="1313486f231018f4faf9b7694fc510af10d5ec91241410920d752ef8a2080ca8" Dec 02 10:19:59 crc kubenswrapper[4781]: I1202 10:19:59.399812 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tzs7n"] Dec 02 10:19:59 crc kubenswrapper[4781]: I1202 10:19:59.409848 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tzs7n"] Dec 02 10:19:59 crc kubenswrapper[4781]: I1202 10:19:59.426640 4781 scope.go:117] "RemoveContainer" containerID="1c1fd6c79e7935b6d5186463318a686aa33d68009044bceae44d10290d59e4e7" Dec 02 10:19:59 crc kubenswrapper[4781]: I1202 10:19:59.513288 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14c3d9a9-9948-4be9-86a3-9def6cce4450" path="/var/lib/kubelet/pods/14c3d9a9-9948-4be9-86a3-9def6cce4450/volumes" Dec 02 10:21:30 crc kubenswrapper[4781]: I1202 10:21:30.412512 4781 patch_prober.go:28] interesting pod/machine-config-daemon-pzntm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 10:21:30 crc kubenswrapper[4781]: I1202 10:21:30.413020 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 10:22:00 crc kubenswrapper[4781]: I1202 10:22:00.335318 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bspmr"] Dec 02 10:22:00 crc kubenswrapper[4781]: E1202 10:22:00.336198 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c84f12f5-b188-4f66-904e-fe5afd92fcec" containerName="extract-utilities" Dec 02 10:22:00 crc kubenswrapper[4781]: I1202 10:22:00.336216 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c84f12f5-b188-4f66-904e-fe5afd92fcec" containerName="extract-utilities" Dec 02 10:22:00 crc kubenswrapper[4781]: E1202 10:22:00.336256 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c84f12f5-b188-4f66-904e-fe5afd92fcec" containerName="extract-content" Dec 02 10:22:00 crc kubenswrapper[4781]: I1202 10:22:00.336265 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c84f12f5-b188-4f66-904e-fe5afd92fcec" containerName="extract-content" Dec 02 10:22:00 crc kubenswrapper[4781]: E1202 10:22:00.336286 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14c3d9a9-9948-4be9-86a3-9def6cce4450" containerName="extract-content" Dec 02 10:22:00 crc kubenswrapper[4781]: I1202 10:22:00.336294 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="14c3d9a9-9948-4be9-86a3-9def6cce4450" containerName="extract-content" Dec 02 10:22:00 crc kubenswrapper[4781]: E1202 10:22:00.336309 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c84f12f5-b188-4f66-904e-fe5afd92fcec" containerName="registry-server" Dec 02 10:22:00 crc kubenswrapper[4781]: I1202 10:22:00.336316 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c84f12f5-b188-4f66-904e-fe5afd92fcec" containerName="registry-server" Dec 02 
10:22:00 crc kubenswrapper[4781]: E1202 10:22:00.336330 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14c3d9a9-9948-4be9-86a3-9def6cce4450" containerName="registry-server" Dec 02 10:22:00 crc kubenswrapper[4781]: I1202 10:22:00.336340 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="14c3d9a9-9948-4be9-86a3-9def6cce4450" containerName="registry-server" Dec 02 10:22:00 crc kubenswrapper[4781]: E1202 10:22:00.336356 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14c3d9a9-9948-4be9-86a3-9def6cce4450" containerName="extract-utilities" Dec 02 10:22:00 crc kubenswrapper[4781]: I1202 10:22:00.336364 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="14c3d9a9-9948-4be9-86a3-9def6cce4450" containerName="extract-utilities" Dec 02 10:22:00 crc kubenswrapper[4781]: I1202 10:22:00.336589 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="14c3d9a9-9948-4be9-86a3-9def6cce4450" containerName="registry-server" Dec 02 10:22:00 crc kubenswrapper[4781]: I1202 10:22:00.336615 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="c84f12f5-b188-4f66-904e-fe5afd92fcec" containerName="registry-server" Dec 02 10:22:00 crc kubenswrapper[4781]: I1202 10:22:00.338274 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bspmr" Dec 02 10:22:00 crc kubenswrapper[4781]: I1202 10:22:00.355331 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bspmr"] Dec 02 10:22:00 crc kubenswrapper[4781]: I1202 10:22:00.412620 4781 patch_prober.go:28] interesting pod/machine-config-daemon-pzntm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 10:22:00 crc kubenswrapper[4781]: I1202 10:22:00.412695 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 10:22:00 crc kubenswrapper[4781]: I1202 10:22:00.419792 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79988a9b-34bb-4395-b2a4-748fa3a546cc-utilities\") pod \"community-operators-bspmr\" (UID: \"79988a9b-34bb-4395-b2a4-748fa3a546cc\") " pod="openshift-marketplace/community-operators-bspmr" Dec 02 10:22:00 crc kubenswrapper[4781]: I1202 10:22:00.419988 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79988a9b-34bb-4395-b2a4-748fa3a546cc-catalog-content\") pod \"community-operators-bspmr\" (UID: \"79988a9b-34bb-4395-b2a4-748fa3a546cc\") " pod="openshift-marketplace/community-operators-bspmr" Dec 02 10:22:00 crc kubenswrapper[4781]: I1202 10:22:00.420143 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8nz9\" (UniqueName: \"kubernetes.io/projected/79988a9b-34bb-4395-b2a4-748fa3a546cc-kube-api-access-z8nz9\") pod \"community-operators-bspmr\" (UID: \"79988a9b-34bb-4395-b2a4-748fa3a546cc\") " 
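The burst of `RemoveStaleState`/`Deleted CPUSet assignment` lines fires when the first new pod is admitted after earlier pods were deleted: the CPU and memory managers sweep their checkpointed state and drop entries keyed by pod UIDs that are no longer active (the E-prefixed lines are logged at error level but describe routine cleanup). A schematic of that sweep, with invented data structures rather than the kubelet's actual state types:

```go
// Schematic of the stale-state sweep (invented types; not the kubelet's
// actual cpu_manager/state_mem code): before admitting a new pod, drop any
// checkpointed assignment whose pod UID is no longer active.
package main

import "fmt"

type assignments map[string]map[string]string // podUID -> container -> cpuset

func removeStaleState(st assignments, active map[string]bool) {
	for podUID, containers := range st {
		if active[podUID] {
			continue
		}
		for name := range containers {
			fmt.Printf("RemoveStaleState: removing container podUID=%q containerName=%q\n",
				podUID, name)
		}
		delete(st, podUID) // deleting the current key while ranging is safe in Go
	}
}

func main() {
	st := assignments{
		"c84f12f5-b188-4f66-904e-fe5afd92fcec": {"registry-server": "0-3"},
		"14c3d9a9-9948-4be9-86a3-9def6cce4450": {"registry-server": "0-3"},
	}
	removeStaleState(st, map[string]bool{}) // no active pods: both entries purged
	fmt.Println("remaining pods:", len(st))
}
```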
pod="openshift-marketplace/community-operators-bspmr" Dec 02 10:22:00 crc kubenswrapper[4781]: I1202 10:22:00.522323 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8nz9\" (UniqueName: \"kubernetes.io/projected/79988a9b-34bb-4395-b2a4-748fa3a546cc-kube-api-access-z8nz9\") pod \"community-operators-bspmr\" (UID: \"79988a9b-34bb-4395-b2a4-748fa3a546cc\") " pod="openshift-marketplace/community-operators-bspmr" Dec 02 10:22:00 crc kubenswrapper[4781]: I1202 10:22:00.522485 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79988a9b-34bb-4395-b2a4-748fa3a546cc-utilities\") pod \"community-operators-bspmr\" (UID: \"79988a9b-34bb-4395-b2a4-748fa3a546cc\") " pod="openshift-marketplace/community-operators-bspmr" Dec 02 10:22:00 crc kubenswrapper[4781]: I1202 10:22:00.522517 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79988a9b-34bb-4395-b2a4-748fa3a546cc-catalog-content\") pod \"community-operators-bspmr\" (UID: \"79988a9b-34bb-4395-b2a4-748fa3a546cc\") " pod="openshift-marketplace/community-operators-bspmr" Dec 02 10:22:00 crc kubenswrapper[4781]: I1202 10:22:00.523008 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79988a9b-34bb-4395-b2a4-748fa3a546cc-utilities\") pod \"community-operators-bspmr\" (UID: \"79988a9b-34bb-4395-b2a4-748fa3a546cc\") " pod="openshift-marketplace/community-operators-bspmr" Dec 02 10:22:00 crc kubenswrapper[4781]: I1202 10:22:00.523043 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79988a9b-34bb-4395-b2a4-748fa3a546cc-catalog-content\") pod \"community-operators-bspmr\" (UID: \"79988a9b-34bb-4395-b2a4-748fa3a546cc\") " pod="openshift-marketplace/community-operators-bspmr" Dec 02 10:22:00 crc kubenswrapper[4781]: I1202 10:22:00.547771 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8nz9\" (UniqueName: \"kubernetes.io/projected/79988a9b-34bb-4395-b2a4-748fa3a546cc-kube-api-access-z8nz9\") pod \"community-operators-bspmr\" (UID: \"79988a9b-34bb-4395-b2a4-748fa3a546cc\") " pod="openshift-marketplace/community-operators-bspmr" Dec 02 10:22:00 crc kubenswrapper[4781]: I1202 10:22:00.658131 4781 util.go:30] "No sandbox for pod can be found. 
Dec 02 10:22:01 crc kubenswrapper[4781]: I1202 10:22:01.160439 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bspmr"]
Dec 02 10:22:01 crc kubenswrapper[4781]: W1202 10:22:01.167248 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79988a9b_34bb_4395_b2a4_748fa3a546cc.slice/crio-9d983fd99246522ceadc128853e10c1c8a91ec0262b861b2f39d099711fd30fd WatchSource:0}: Error finding container 9d983fd99246522ceadc128853e10c1c8a91ec0262b861b2f39d099711fd30fd: Status 404 returned error can't find the container with id 9d983fd99246522ceadc128853e10c1c8a91ec0262b861b2f39d099711fd30fd
Dec 02 10:22:01 crc kubenswrapper[4781]: I1202 10:22:01.666363 4781 generic.go:334] "Generic (PLEG): container finished" podID="79988a9b-34bb-4395-b2a4-748fa3a546cc" containerID="73a44426c99220fc3654b7be3e01c8f59dda8097cb4adfe4c31c22cb0bb31dd8" exitCode=0
Dec 02 10:22:01 crc kubenswrapper[4781]: I1202 10:22:01.666466 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bspmr" event={"ID":"79988a9b-34bb-4395-b2a4-748fa3a546cc","Type":"ContainerDied","Data":"73a44426c99220fc3654b7be3e01c8f59dda8097cb4adfe4c31c22cb0bb31dd8"}
Dec 02 10:22:01 crc kubenswrapper[4781]: I1202 10:22:01.666834 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bspmr" event={"ID":"79988a9b-34bb-4395-b2a4-748fa3a546cc","Type":"ContainerStarted","Data":"9d983fd99246522ceadc128853e10c1c8a91ec0262b861b2f39d099711fd30fd"}
Dec 02 10:22:02 crc kubenswrapper[4781]: I1202 10:22:02.677837 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bspmr" event={"ID":"79988a9b-34bb-4395-b2a4-748fa3a546cc","Type":"ContainerStarted","Data":"742e01d468a50eda9ffdb36eb413234d491fac8f8757606046bc0c1c7193257c"}
Dec 02 10:22:03 crc kubenswrapper[4781]: I1202 10:22:03.935898 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zqz78"]
Dec 02 10:22:03 crc kubenswrapper[4781]: I1202 10:22:03.938542 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zqz78"
Dec 02 10:22:03 crc kubenswrapper[4781]: I1202 10:22:03.952962 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zqz78"]
Dec 02 10:22:04 crc kubenswrapper[4781]: I1202 10:22:04.004890 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e32d887a-3e2b-4699-bfa0-d79379140ef5-catalog-content\") pod \"redhat-operators-zqz78\" (UID: \"e32d887a-3e2b-4699-bfa0-d79379140ef5\") " pod="openshift-marketplace/redhat-operators-zqz78"
Dec 02 10:22:04 crc kubenswrapper[4781]: I1202 10:22:04.005023 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e32d887a-3e2b-4699-bfa0-d79379140ef5-utilities\") pod \"redhat-operators-zqz78\" (UID: \"e32d887a-3e2b-4699-bfa0-d79379140ef5\") " pod="openshift-marketplace/redhat-operators-zqz78"
Dec 02 10:22:04 crc kubenswrapper[4781]: I1202 10:22:04.005074 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8d5m\" (UniqueName: \"kubernetes.io/projected/e32d887a-3e2b-4699-bfa0-d79379140ef5-kube-api-access-k8d5m\") pod \"redhat-operators-zqz78\" (UID: \"e32d887a-3e2b-4699-bfa0-d79379140ef5\") " pod="openshift-marketplace/redhat-operators-zqz78"
Dec 02 10:22:04 crc kubenswrapper[4781]: I1202 10:22:04.106748 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e32d887a-3e2b-4699-bfa0-d79379140ef5-catalog-content\") pod \"redhat-operators-zqz78\" (UID: \"e32d887a-3e2b-4699-bfa0-d79379140ef5\") " pod="openshift-marketplace/redhat-operators-zqz78"
Dec 02 10:22:04 crc kubenswrapper[4781]: I1202 10:22:04.106824 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e32d887a-3e2b-4699-bfa0-d79379140ef5-utilities\") pod \"redhat-operators-zqz78\" (UID: \"e32d887a-3e2b-4699-bfa0-d79379140ef5\") " pod="openshift-marketplace/redhat-operators-zqz78"
Dec 02 10:22:04 crc kubenswrapper[4781]: I1202 10:22:04.106881 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8d5m\" (UniqueName: \"kubernetes.io/projected/e32d887a-3e2b-4699-bfa0-d79379140ef5-kube-api-access-k8d5m\") pod \"redhat-operators-zqz78\" (UID: \"e32d887a-3e2b-4699-bfa0-d79379140ef5\") " pod="openshift-marketplace/redhat-operators-zqz78"
Dec 02 10:22:04 crc kubenswrapper[4781]: I1202 10:22:04.107324 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e32d887a-3e2b-4699-bfa0-d79379140ef5-catalog-content\") pod \"redhat-operators-zqz78\" (UID: \"e32d887a-3e2b-4699-bfa0-d79379140ef5\") " pod="openshift-marketplace/redhat-operators-zqz78"
Dec 02 10:22:04 crc kubenswrapper[4781]: I1202 10:22:04.107417 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e32d887a-3e2b-4699-bfa0-d79379140ef5-utilities\") pod \"redhat-operators-zqz78\" (UID: \"e32d887a-3e2b-4699-bfa0-d79379140ef5\") " pod="openshift-marketplace/redhat-operators-zqz78"
Dec 02 10:22:04 crc kubenswrapper[4781]: I1202 10:22:04.130425 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8d5m\" (UniqueName: \"kubernetes.io/projected/e32d887a-3e2b-4699-bfa0-d79379140ef5-kube-api-access-k8d5m\") pod \"redhat-operators-zqz78\" (UID: \"e32d887a-3e2b-4699-bfa0-d79379140ef5\") " pod="openshift-marketplace/redhat-operators-zqz78"
Dec 02 10:22:04 crc kubenswrapper[4781]: I1202 10:22:04.255756 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zqz78"
Dec 02 10:22:04 crc kubenswrapper[4781]: I1202 10:22:04.710604 4781 generic.go:334] "Generic (PLEG): container finished" podID="79988a9b-34bb-4395-b2a4-748fa3a546cc" containerID="742e01d468a50eda9ffdb36eb413234d491fac8f8757606046bc0c1c7193257c" exitCode=0
Dec 02 10:22:04 crc kubenswrapper[4781]: I1202 10:22:04.710850 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bspmr" event={"ID":"79988a9b-34bb-4395-b2a4-748fa3a546cc","Type":"ContainerDied","Data":"742e01d468a50eda9ffdb36eb413234d491fac8f8757606046bc0c1c7193257c"}
Dec 02 10:22:04 crc kubenswrapper[4781]: I1202 10:22:04.780113 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zqz78"]
Dec 02 10:22:05 crc kubenswrapper[4781]: I1202 10:22:05.721012 4781 generic.go:334] "Generic (PLEG): container finished" podID="e32d887a-3e2b-4699-bfa0-d79379140ef5" containerID="bb8558e0e49347600ebe05971e66fb835e7f3c4e3cbd26cfbd5709e9323be6a2" exitCode=0
Dec 02 10:22:05 crc kubenswrapper[4781]: I1202 10:22:05.721078 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zqz78" event={"ID":"e32d887a-3e2b-4699-bfa0-d79379140ef5","Type":"ContainerDied","Data":"bb8558e0e49347600ebe05971e66fb835e7f3c4e3cbd26cfbd5709e9323be6a2"}
Dec 02 10:22:05 crc kubenswrapper[4781]: I1202 10:22:05.721639 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zqz78" event={"ID":"e32d887a-3e2b-4699-bfa0-d79379140ef5","Type":"ContainerStarted","Data":"852cb23c25885deb181822c6e7a5619d9dc01bc951a1f661bf74fc6e55a57b46"}
Dec 02 10:22:05 crc kubenswrapper[4781]: I1202 10:22:05.726062 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bspmr" event={"ID":"79988a9b-34bb-4395-b2a4-748fa3a546cc","Type":"ContainerStarted","Data":"ecd073e990762f4c43ac70eab6b45235de41fd956a7d7a1c094eb67df844abe5"}
Dec 02 10:22:05 crc kubenswrapper[4781]: I1202 10:22:05.777793 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bspmr" podStartSLOduration=2.294260427 podStartE2EDuration="5.777776305s" podCreationTimestamp="2025-12-02 10:22:00 +0000 UTC" firstStartedPulling="2025-12-02 10:22:01.670332029 +0000 UTC m=+3684.494205908" lastFinishedPulling="2025-12-02 10:22:05.153847907 +0000 UTC m=+3687.977721786" observedRunningTime="2025-12-02 10:22:05.764428817 +0000 UTC m=+3688.588302696" watchObservedRunningTime="2025-12-02 10:22:05.777776305 +0000 UTC m=+3688.601650184"
Dec 02 10:22:06 crc kubenswrapper[4781]: I1202 10:22:06.758090 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zqz78" event={"ID":"e32d887a-3e2b-4699-bfa0-d79379140ef5","Type":"ContainerStarted","Data":"7e3ebfddf0449d8c72d6519097f79d3b7de4df0b3cb1658b2e3450ee59e0df2c"}
Dec 02 10:22:08 crc kubenswrapper[4781]: I1202 10:22:08.777903 4781 generic.go:334] "Generic (PLEG): container finished" podID="e32d887a-3e2b-4699-bfa0-d79379140ef5" containerID="7e3ebfddf0449d8c72d6519097f79d3b7de4df0b3cb1658b2e3450ee59e0df2c" exitCode=0
containerID="7e3ebfddf0449d8c72d6519097f79d3b7de4df0b3cb1658b2e3450ee59e0df2c" exitCode=0 Dec 02 10:22:08 crc kubenswrapper[4781]: I1202 10:22:08.777987 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zqz78" event={"ID":"e32d887a-3e2b-4699-bfa0-d79379140ef5","Type":"ContainerDied","Data":"7e3ebfddf0449d8c72d6519097f79d3b7de4df0b3cb1658b2e3450ee59e0df2c"} Dec 02 10:22:09 crc kubenswrapper[4781]: I1202 10:22:09.791244 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zqz78" event={"ID":"e32d887a-3e2b-4699-bfa0-d79379140ef5","Type":"ContainerStarted","Data":"cb835ab47244b8ecd9d3227a55c9fd24ca256914fff6b214aacd61cb90e68433"} Dec 02 10:22:09 crc kubenswrapper[4781]: I1202 10:22:09.823684 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zqz78" podStartSLOduration=3.097226452 podStartE2EDuration="6.823662706s" podCreationTimestamp="2025-12-02 10:22:03 +0000 UTC" firstStartedPulling="2025-12-02 10:22:05.722531519 +0000 UTC m=+3688.546405388" lastFinishedPulling="2025-12-02 10:22:09.448967773 +0000 UTC m=+3692.272841642" observedRunningTime="2025-12-02 10:22:09.812621589 +0000 UTC m=+3692.636495468" watchObservedRunningTime="2025-12-02 10:22:09.823662706 +0000 UTC m=+3692.647536585" Dec 02 10:22:10 crc kubenswrapper[4781]: I1202 10:22:10.658734 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bspmr" Dec 02 10:22:10 crc kubenswrapper[4781]: I1202 10:22:10.659119 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bspmr" Dec 02 10:22:11 crc kubenswrapper[4781]: I1202 10:22:11.706496 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-bspmr" podUID="79988a9b-34bb-4395-b2a4-748fa3a546cc" containerName="registry-server" probeResult="failure" output=< Dec 02 10:22:11 crc kubenswrapper[4781]: timeout: failed to connect service ":50051" within 1s Dec 02 10:22:11 crc kubenswrapper[4781]: > Dec 02 10:22:14 crc kubenswrapper[4781]: I1202 10:22:14.255852 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zqz78" Dec 02 10:22:14 crc kubenswrapper[4781]: I1202 10:22:14.256195 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zqz78" Dec 02 10:22:15 crc kubenswrapper[4781]: I1202 10:22:15.349295 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zqz78" podUID="e32d887a-3e2b-4699-bfa0-d79379140ef5" containerName="registry-server" probeResult="failure" output=< Dec 02 10:22:15 crc kubenswrapper[4781]: timeout: failed to connect service ":50051" within 1s Dec 02 10:22:15 crc kubenswrapper[4781]: > Dec 02 10:22:20 crc kubenswrapper[4781]: I1202 10:22:20.715334 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bspmr" Dec 02 10:22:20 crc kubenswrapper[4781]: I1202 10:22:20.792438 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bspmr" Dec 02 10:22:20 crc kubenswrapper[4781]: I1202 10:22:20.952271 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bspmr"] Dec 02 10:22:21 crc kubenswrapper[4781]: I1202 
Dec 02 10:22:22 crc kubenswrapper[4781]: I1202 10:22:22.404505 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bspmr"
Dec 02 10:22:22 crc kubenswrapper[4781]: I1202 10:22:22.470870 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79988a9b-34bb-4395-b2a4-748fa3a546cc-utilities\") pod \"79988a9b-34bb-4395-b2a4-748fa3a546cc\" (UID: \"79988a9b-34bb-4395-b2a4-748fa3a546cc\") "
Dec 02 10:22:22 crc kubenswrapper[4781]: I1202 10:22:22.470962 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8nz9\" (UniqueName: \"kubernetes.io/projected/79988a9b-34bb-4395-b2a4-748fa3a546cc-kube-api-access-z8nz9\") pod \"79988a9b-34bb-4395-b2a4-748fa3a546cc\" (UID: \"79988a9b-34bb-4395-b2a4-748fa3a546cc\") "
Dec 02 10:22:22 crc kubenswrapper[4781]: I1202 10:22:22.471017 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79988a9b-34bb-4395-b2a4-748fa3a546cc-catalog-content\") pod \"79988a9b-34bb-4395-b2a4-748fa3a546cc\" (UID: \"79988a9b-34bb-4395-b2a4-748fa3a546cc\") "
Dec 02 10:22:22 crc kubenswrapper[4781]: I1202 10:22:22.472518 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79988a9b-34bb-4395-b2a4-748fa3a546cc-utilities" (OuterVolumeSpecName: "utilities") pod "79988a9b-34bb-4395-b2a4-748fa3a546cc" (UID: "79988a9b-34bb-4395-b2a4-748fa3a546cc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 10:22:22 crc kubenswrapper[4781]: I1202 10:22:22.478124 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79988a9b-34bb-4395-b2a4-748fa3a546cc-kube-api-access-z8nz9" (OuterVolumeSpecName: "kube-api-access-z8nz9") pod "79988a9b-34bb-4395-b2a4-748fa3a546cc" (UID: "79988a9b-34bb-4395-b2a4-748fa3a546cc"). InnerVolumeSpecName "kube-api-access-z8nz9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 10:22:22 crc kubenswrapper[4781]: I1202 10:22:22.535918 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79988a9b-34bb-4395-b2a4-748fa3a546cc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "79988a9b-34bb-4395-b2a4-748fa3a546cc" (UID: "79988a9b-34bb-4395-b2a4-748fa3a546cc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 10:22:22 crc kubenswrapper[4781]: I1202 10:22:22.572945 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79988a9b-34bb-4395-b2a4-748fa3a546cc-utilities\") on node \"crc\" DevicePath \"\""
Dec 02 10:22:22 crc kubenswrapper[4781]: I1202 10:22:22.572983 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8nz9\" (UniqueName: \"kubernetes.io/projected/79988a9b-34bb-4395-b2a4-748fa3a546cc-kube-api-access-z8nz9\") on node \"crc\" DevicePath \"\""
Dec 02 10:22:22 crc kubenswrapper[4781]: I1202 10:22:22.572991 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79988a9b-34bb-4395-b2a4-748fa3a546cc-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 02 10:22:22 crc kubenswrapper[4781]: I1202 10:22:22.937157 4781 generic.go:334] "Generic (PLEG): container finished" podID="79988a9b-34bb-4395-b2a4-748fa3a546cc" containerID="ecd073e990762f4c43ac70eab6b45235de41fd956a7d7a1c094eb67df844abe5" exitCode=0
Dec 02 10:22:22 crc kubenswrapper[4781]: I1202 10:22:22.937213 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bspmr" event={"ID":"79988a9b-34bb-4395-b2a4-748fa3a546cc","Type":"ContainerDied","Data":"ecd073e990762f4c43ac70eab6b45235de41fd956a7d7a1c094eb67df844abe5"}
Dec 02 10:22:22 crc kubenswrapper[4781]: I1202 10:22:22.937283 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bspmr"
Dec 02 10:22:22 crc kubenswrapper[4781]: I1202 10:22:22.937495 4781 scope.go:117] "RemoveContainer" containerID="ecd073e990762f4c43ac70eab6b45235de41fd956a7d7a1c094eb67df844abe5"
Dec 02 10:22:22 crc kubenswrapper[4781]: I1202 10:22:22.937479 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bspmr" event={"ID":"79988a9b-34bb-4395-b2a4-748fa3a546cc","Type":"ContainerDied","Data":"9d983fd99246522ceadc128853e10c1c8a91ec0262b861b2f39d099711fd30fd"}
Dec 02 10:22:22 crc kubenswrapper[4781]: I1202 10:22:22.982319 4781 scope.go:117] "RemoveContainer" containerID="742e01d468a50eda9ffdb36eb413234d491fac8f8757606046bc0c1c7193257c"
Dec 02 10:22:23 crc kubenswrapper[4781]: I1202 10:22:23.009749 4781 scope.go:117] "RemoveContainer" containerID="73a44426c99220fc3654b7be3e01c8f59dda8097cb4adfe4c31c22cb0bb31dd8"
Dec 02 10:22:23 crc kubenswrapper[4781]: I1202 10:22:23.015547 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bspmr"]
Dec 02 10:22:23 crc kubenswrapper[4781]: I1202 10:22:23.026357 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bspmr"]
Dec 02 10:22:23 crc kubenswrapper[4781]: I1202 10:22:23.078064 4781 scope.go:117] "RemoveContainer" containerID="ecd073e990762f4c43ac70eab6b45235de41fd956a7d7a1c094eb67df844abe5"
Dec 02 10:22:23 crc kubenswrapper[4781]: E1202 10:22:23.078681 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecd073e990762f4c43ac70eab6b45235de41fd956a7d7a1c094eb67df844abe5\": container with ID starting with ecd073e990762f4c43ac70eab6b45235de41fd956a7d7a1c094eb67df844abe5 not found: ID does not exist" containerID="ecd073e990762f4c43ac70eab6b45235de41fd956a7d7a1c094eb67df844abe5"
Dec 02 10:22:23 crc kubenswrapper[4781]: I1202 10:22:23.078738 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecd073e990762f4c43ac70eab6b45235de41fd956a7d7a1c094eb67df844abe5"} err="failed to get container status \"ecd073e990762f4c43ac70eab6b45235de41fd956a7d7a1c094eb67df844abe5\": rpc error: code = NotFound desc = could not find container \"ecd073e990762f4c43ac70eab6b45235de41fd956a7d7a1c094eb67df844abe5\": container with ID starting with ecd073e990762f4c43ac70eab6b45235de41fd956a7d7a1c094eb67df844abe5 not found: ID does not exist"
Dec 02 10:22:23 crc kubenswrapper[4781]: I1202 10:22:23.078770 4781 scope.go:117] "RemoveContainer" containerID="742e01d468a50eda9ffdb36eb413234d491fac8f8757606046bc0c1c7193257c"
Dec 02 10:22:23 crc kubenswrapper[4781]: E1202 10:22:23.079240 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"742e01d468a50eda9ffdb36eb413234d491fac8f8757606046bc0c1c7193257c\": container with ID starting with 742e01d468a50eda9ffdb36eb413234d491fac8f8757606046bc0c1c7193257c not found: ID does not exist" containerID="742e01d468a50eda9ffdb36eb413234d491fac8f8757606046bc0c1c7193257c"
Dec 02 10:22:23 crc kubenswrapper[4781]: I1202 10:22:23.079320 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"742e01d468a50eda9ffdb36eb413234d491fac8f8757606046bc0c1c7193257c"} err="failed to get container status \"742e01d468a50eda9ffdb36eb413234d491fac8f8757606046bc0c1c7193257c\": rpc error: code = NotFound desc = could not find container \"742e01d468a50eda9ffdb36eb413234d491fac8f8757606046bc0c1c7193257c\": container with ID starting with 742e01d468a50eda9ffdb36eb413234d491fac8f8757606046bc0c1c7193257c not found: ID does not exist"
Dec 02 10:22:23 crc kubenswrapper[4781]: I1202 10:22:23.079355 4781 scope.go:117] "RemoveContainer" containerID="73a44426c99220fc3654b7be3e01c8f59dda8097cb4adfe4c31c22cb0bb31dd8"
Dec 02 10:22:23 crc kubenswrapper[4781]: E1202 10:22:23.079753 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73a44426c99220fc3654b7be3e01c8f59dda8097cb4adfe4c31c22cb0bb31dd8\": container with ID starting with 73a44426c99220fc3654b7be3e01c8f59dda8097cb4adfe4c31c22cb0bb31dd8 not found: ID does not exist" containerID="73a44426c99220fc3654b7be3e01c8f59dda8097cb4adfe4c31c22cb0bb31dd8"
Dec 02 10:22:23 crc kubenswrapper[4781]: I1202 10:22:23.079781 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73a44426c99220fc3654b7be3e01c8f59dda8097cb4adfe4c31c22cb0bb31dd8"} err="failed to get container status \"73a44426c99220fc3654b7be3e01c8f59dda8097cb4adfe4c31c22cb0bb31dd8\": rpc error: code = NotFound desc = could not find container \"73a44426c99220fc3654b7be3e01c8f59dda8097cb4adfe4c31c22cb0bb31dd8\": container with ID starting with 73a44426c99220fc3654b7be3e01c8f59dda8097cb4adfe4c31c22cb0bb31dd8 not found: ID does not exist"
Dec 02 10:22:23 crc kubenswrapper[4781]: I1202 10:22:23.519111 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79988a9b-34bb-4395-b2a4-748fa3a546cc" path="/var/lib/kubelet/pods/79988a9b-34bb-4395-b2a4-748fa3a546cc/volumes"
Dec 02 10:22:24 crc kubenswrapper[4781]: I1202 10:22:24.313720 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zqz78"
Dec 02 10:22:24 crc kubenswrapper[4781]: I1202 10:22:24.358007 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zqz78"
Dec 02 10:22:25 crc kubenswrapper[4781]: I1202 10:22:25.349424 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zqz78"]
Dec 02 10:22:25 crc kubenswrapper[4781]: I1202 10:22:25.976566 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zqz78" podUID="e32d887a-3e2b-4699-bfa0-d79379140ef5" containerName="registry-server" containerID="cri-o://cb835ab47244b8ecd9d3227a55c9fd24ca256914fff6b214aacd61cb90e68433" gracePeriod=2
Dec 02 10:22:26 crc kubenswrapper[4781]: I1202 10:22:26.484238 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zqz78"
Dec 02 10:22:26 crc kubenswrapper[4781]: I1202 10:22:26.560724 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e32d887a-3e2b-4699-bfa0-d79379140ef5-utilities\") pod \"e32d887a-3e2b-4699-bfa0-d79379140ef5\" (UID: \"e32d887a-3e2b-4699-bfa0-d79379140ef5\") "
Dec 02 10:22:26 crc kubenswrapper[4781]: I1202 10:22:26.560862 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8d5m\" (UniqueName: \"kubernetes.io/projected/e32d887a-3e2b-4699-bfa0-d79379140ef5-kube-api-access-k8d5m\") pod \"e32d887a-3e2b-4699-bfa0-d79379140ef5\" (UID: \"e32d887a-3e2b-4699-bfa0-d79379140ef5\") "
Dec 02 10:22:26 crc kubenswrapper[4781]: I1202 10:22:26.560901 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e32d887a-3e2b-4699-bfa0-d79379140ef5-catalog-content\") pod \"e32d887a-3e2b-4699-bfa0-d79379140ef5\" (UID: \"e32d887a-3e2b-4699-bfa0-d79379140ef5\") "
Dec 02 10:22:26 crc kubenswrapper[4781]: I1202 10:22:26.561590 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e32d887a-3e2b-4699-bfa0-d79379140ef5-utilities" (OuterVolumeSpecName: "utilities") pod "e32d887a-3e2b-4699-bfa0-d79379140ef5" (UID: "e32d887a-3e2b-4699-bfa0-d79379140ef5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 10:22:26 crc kubenswrapper[4781]: I1202 10:22:26.566391 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e32d887a-3e2b-4699-bfa0-d79379140ef5-kube-api-access-k8d5m" (OuterVolumeSpecName: "kube-api-access-k8d5m") pod "e32d887a-3e2b-4699-bfa0-d79379140ef5" (UID: "e32d887a-3e2b-4699-bfa0-d79379140ef5"). InnerVolumeSpecName "kube-api-access-k8d5m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 10:22:26 crc kubenswrapper[4781]: I1202 10:22:26.664127 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8d5m\" (UniqueName: \"kubernetes.io/projected/e32d887a-3e2b-4699-bfa0-d79379140ef5-kube-api-access-k8d5m\") on node \"crc\" DevicePath \"\""
Dec 02 10:22:26 crc kubenswrapper[4781]: I1202 10:22:26.664189 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e32d887a-3e2b-4699-bfa0-d79379140ef5-utilities\") on node \"crc\" DevicePath \"\""
Dec 02 10:22:26 crc kubenswrapper[4781]: I1202 10:22:26.689569 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e32d887a-3e2b-4699-bfa0-d79379140ef5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e32d887a-3e2b-4699-bfa0-d79379140ef5" (UID: "e32d887a-3e2b-4699-bfa0-d79379140ef5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 10:22:26 crc kubenswrapper[4781]: I1202 10:22:26.766098 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e32d887a-3e2b-4699-bfa0-d79379140ef5-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 02 10:22:26 crc kubenswrapper[4781]: I1202 10:22:26.988478 4781 generic.go:334] "Generic (PLEG): container finished" podID="e32d887a-3e2b-4699-bfa0-d79379140ef5" containerID="cb835ab47244b8ecd9d3227a55c9fd24ca256914fff6b214aacd61cb90e68433" exitCode=0
Dec 02 10:22:26 crc kubenswrapper[4781]: I1202 10:22:26.988526 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zqz78" event={"ID":"e32d887a-3e2b-4699-bfa0-d79379140ef5","Type":"ContainerDied","Data":"cb835ab47244b8ecd9d3227a55c9fd24ca256914fff6b214aacd61cb90e68433"}
Dec 02 10:22:26 crc kubenswrapper[4781]: I1202 10:22:26.988559 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zqz78" event={"ID":"e32d887a-3e2b-4699-bfa0-d79379140ef5","Type":"ContainerDied","Data":"852cb23c25885deb181822c6e7a5619d9dc01bc951a1f661bf74fc6e55a57b46"}
Dec 02 10:22:26 crc kubenswrapper[4781]: I1202 10:22:26.988589 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zqz78"
Dec 02 10:22:26 crc kubenswrapper[4781]: I1202 10:22:26.988595 4781 scope.go:117] "RemoveContainer" containerID="cb835ab47244b8ecd9d3227a55c9fd24ca256914fff6b214aacd61cb90e68433"
Dec 02 10:22:27 crc kubenswrapper[4781]: I1202 10:22:27.018099 4781 scope.go:117] "RemoveContainer" containerID="7e3ebfddf0449d8c72d6519097f79d3b7de4df0b3cb1658b2e3450ee59e0df2c"
Dec 02 10:22:27 crc kubenswrapper[4781]: I1202 10:22:27.037994 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zqz78"]
Dec 02 10:22:27 crc kubenswrapper[4781]: I1202 10:22:27.054747 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zqz78"]
Dec 02 10:22:27 crc kubenswrapper[4781]: I1202 10:22:27.077736 4781 scope.go:117] "RemoveContainer" containerID="bb8558e0e49347600ebe05971e66fb835e7f3c4e3cbd26cfbd5709e9323be6a2"
Dec 02 10:22:27 crc kubenswrapper[4781]: I1202 10:22:27.100098 4781 scope.go:117] "RemoveContainer" containerID="cb835ab47244b8ecd9d3227a55c9fd24ca256914fff6b214aacd61cb90e68433"
Dec 02 10:22:27 crc kubenswrapper[4781]: E1202 10:22:27.100702 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb835ab47244b8ecd9d3227a55c9fd24ca256914fff6b214aacd61cb90e68433\": container with ID starting with cb835ab47244b8ecd9d3227a55c9fd24ca256914fff6b214aacd61cb90e68433 not found: ID does not exist" containerID="cb835ab47244b8ecd9d3227a55c9fd24ca256914fff6b214aacd61cb90e68433"
Dec 02 10:22:27 crc kubenswrapper[4781]: I1202 10:22:27.100751 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb835ab47244b8ecd9d3227a55c9fd24ca256914fff6b214aacd61cb90e68433"} err="failed to get container status \"cb835ab47244b8ecd9d3227a55c9fd24ca256914fff6b214aacd61cb90e68433\": rpc error: code = NotFound desc = could not find container \"cb835ab47244b8ecd9d3227a55c9fd24ca256914fff6b214aacd61cb90e68433\": container with ID starting with cb835ab47244b8ecd9d3227a55c9fd24ca256914fff6b214aacd61cb90e68433 not found: ID does not exist"
Dec 02 10:22:27 crc kubenswrapper[4781]: I1202 10:22:27.100783 4781 scope.go:117] "RemoveContainer" containerID="7e3ebfddf0449d8c72d6519097f79d3b7de4df0b3cb1658b2e3450ee59e0df2c"
Dec 02 10:22:27 crc kubenswrapper[4781]: E1202 10:22:27.101477 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e3ebfddf0449d8c72d6519097f79d3b7de4df0b3cb1658b2e3450ee59e0df2c\": container with ID starting with 7e3ebfddf0449d8c72d6519097f79d3b7de4df0b3cb1658b2e3450ee59e0df2c not found: ID does not exist" containerID="7e3ebfddf0449d8c72d6519097f79d3b7de4df0b3cb1658b2e3450ee59e0df2c"
Dec 02 10:22:27 crc kubenswrapper[4781]: I1202 10:22:27.101516 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e3ebfddf0449d8c72d6519097f79d3b7de4df0b3cb1658b2e3450ee59e0df2c"} err="failed to get container status \"7e3ebfddf0449d8c72d6519097f79d3b7de4df0b3cb1658b2e3450ee59e0df2c\": rpc error: code = NotFound desc = could not find container \"7e3ebfddf0449d8c72d6519097f79d3b7de4df0b3cb1658b2e3450ee59e0df2c\": container with ID starting with 7e3ebfddf0449d8c72d6519097f79d3b7de4df0b3cb1658b2e3450ee59e0df2c not found: ID does not exist"
Dec 02 10:22:27 crc kubenswrapper[4781]: I1202 10:22:27.101540 4781 scope.go:117] "RemoveContainer" containerID="bb8558e0e49347600ebe05971e66fb835e7f3c4e3cbd26cfbd5709e9323be6a2"
Dec 02 10:22:27 crc kubenswrapper[4781]: E1202 10:22:27.102097 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb8558e0e49347600ebe05971e66fb835e7f3c4e3cbd26cfbd5709e9323be6a2\": container with ID starting with bb8558e0e49347600ebe05971e66fb835e7f3c4e3cbd26cfbd5709e9323be6a2 not found: ID does not exist" containerID="bb8558e0e49347600ebe05971e66fb835e7f3c4e3cbd26cfbd5709e9323be6a2"
Dec 02 10:22:27 crc kubenswrapper[4781]: I1202 10:22:27.102141 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb8558e0e49347600ebe05971e66fb835e7f3c4e3cbd26cfbd5709e9323be6a2"} err="failed to get container status \"bb8558e0e49347600ebe05971e66fb835e7f3c4e3cbd26cfbd5709e9323be6a2\": rpc error: code = NotFound desc = could not find container \"bb8558e0e49347600ebe05971e66fb835e7f3c4e3cbd26cfbd5709e9323be6a2\": container with ID starting with bb8558e0e49347600ebe05971e66fb835e7f3c4e3cbd26cfbd5709e9323be6a2 not found: ID does not exist"
Dec 02 10:22:27 crc kubenswrapper[4781]: I1202 10:22:27.517034 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e32d887a-3e2b-4699-bfa0-d79379140ef5" path="/var/lib/kubelet/pods/e32d887a-3e2b-4699-bfa0-d79379140ef5/volumes"
Dec 02 10:22:30 crc kubenswrapper[4781]: I1202 10:22:30.412390 4781 patch_prober.go:28] interesting pod/machine-config-daemon-pzntm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 10:22:30 crc kubenswrapper[4781]: I1202 10:22:30.412791 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 10:22:30 crc kubenswrapper[4781]: I1202 10:22:30.412833 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pzntm"
Dec 02 10:22:30 crc kubenswrapper[4781]: I1202 10:22:30.413472 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3108739d92f37bf346834eaffd456a3eb9cfc18107a661e848468628c375b388"} pod="openshift-machine-config-operator/machine-config-daemon-pzntm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 02 10:22:30 crc kubenswrapper[4781]: I1202 10:22:30.413520 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" containerID="cri-o://3108739d92f37bf346834eaffd456a3eb9cfc18107a661e848468628c375b388" gracePeriod=600
Dec 02 10:22:30 crc kubenswrapper[4781]: E1202 10:22:30.539667 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d"
pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:22:31 crc kubenswrapper[4781]: I1202 10:22:31.053268 4781 generic.go:334] "Generic (PLEG): container finished" podID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerID="3108739d92f37bf346834eaffd456a3eb9cfc18107a661e848468628c375b388" exitCode=0 Dec 02 10:22:31 crc kubenswrapper[4781]: I1202 10:22:31.053354 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" event={"ID":"e10258da-dad3-4df8-82c2-9d9438493a3d","Type":"ContainerDied","Data":"3108739d92f37bf346834eaffd456a3eb9cfc18107a661e848468628c375b388"} Dec 02 10:22:31 crc kubenswrapper[4781]: I1202 10:22:31.053606 4781 scope.go:117] "RemoveContainer" containerID="4e497409fcbb803030a10c1c523917978b588e2d6baad2c22809c97c56aec2e5" Dec 02 10:22:31 crc kubenswrapper[4781]: I1202 10:22:31.054426 4781 scope.go:117] "RemoveContainer" containerID="3108739d92f37bf346834eaffd456a3eb9cfc18107a661e848468628c375b388" Dec 02 10:22:31 crc kubenswrapper[4781]: E1202 10:22:31.055084 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:22:44 crc kubenswrapper[4781]: I1202 10:22:44.500105 4781 scope.go:117] "RemoveContainer" containerID="3108739d92f37bf346834eaffd456a3eb9cfc18107a661e848468628c375b388" Dec 02 10:22:44 crc kubenswrapper[4781]: E1202 10:22:44.500996 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:22:57 crc kubenswrapper[4781]: I1202 10:22:57.506597 4781 scope.go:117] "RemoveContainer" containerID="3108739d92f37bf346834eaffd456a3eb9cfc18107a661e848468628c375b388" Dec 02 10:22:57 crc kubenswrapper[4781]: E1202 10:22:57.507505 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:22:59 crc kubenswrapper[4781]: E1202 10:22:59.028444 4781 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/rpm-ostreed.service\": RecentStats: unable to find data in memory cache]" Dec 02 10:23:08 crc kubenswrapper[4781]: I1202 10:23:08.500437 4781 scope.go:117] "RemoveContainer" containerID="3108739d92f37bf346834eaffd456a3eb9cfc18107a661e848468628c375b388" Dec 02 10:23:08 crc kubenswrapper[4781]: E1202 10:23:08.501394 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:23:19 crc kubenswrapper[4781]: I1202 10:23:19.502580 4781 scope.go:117] "RemoveContainer" containerID="3108739d92f37bf346834eaffd456a3eb9cfc18107a661e848468628c375b388" Dec 02 10:23:19 crc kubenswrapper[4781]: E1202 10:23:19.503282 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:23:32 crc kubenswrapper[4781]: I1202 10:23:32.687047 4781 generic.go:334] "Generic (PLEG): container finished" podID="ac589c1e-71f7-423f-b99b-3ebf175b40f3" containerID="8d172fe6136e87003109bf87428976ab924eb793e112978cde99e3d6975e7f59" exitCode=0 Dec 02 10:23:32 crc kubenswrapper[4781]: I1202 10:23:32.687149 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"ac589c1e-71f7-423f-b99b-3ebf175b40f3","Type":"ContainerDied","Data":"8d172fe6136e87003109bf87428976ab924eb793e112978cde99e3d6975e7f59"} Dec 02 10:23:34 crc kubenswrapper[4781]: I1202 10:23:34.061079 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 02 10:23:34 crc kubenswrapper[4781]: I1202 10:23:34.227812 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ac589c1e-71f7-423f-b99b-3ebf175b40f3-config-data\") pod \"ac589c1e-71f7-423f-b99b-3ebf175b40f3\" (UID: \"ac589c1e-71f7-423f-b99b-3ebf175b40f3\") " Dec 02 10:23:34 crc kubenswrapper[4781]: I1202 10:23:34.227891 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ac589c1e-71f7-423f-b99b-3ebf175b40f3-test-operator-ephemeral-workdir\") pod \"ac589c1e-71f7-423f-b99b-3ebf175b40f3\" (UID: \"ac589c1e-71f7-423f-b99b-3ebf175b40f3\") " Dec 02 10:23:34 crc kubenswrapper[4781]: I1202 10:23:34.227933 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ac589c1e-71f7-423f-b99b-3ebf175b40f3\" (UID: \"ac589c1e-71f7-423f-b99b-3ebf175b40f3\") " Dec 02 10:23:34 crc kubenswrapper[4781]: I1202 10:23:34.227949 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ac589c1e-71f7-423f-b99b-3ebf175b40f3-ssh-key\") pod \"ac589c1e-71f7-423f-b99b-3ebf175b40f3\" (UID: \"ac589c1e-71f7-423f-b99b-3ebf175b40f3\") " Dec 02 10:23:34 crc kubenswrapper[4781]: I1202 10:23:34.227989 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ac589c1e-71f7-423f-b99b-3ebf175b40f3-openstack-config-secret\") pod \"ac589c1e-71f7-423f-b99b-3ebf175b40f3\" (UID: \"ac589c1e-71f7-423f-b99b-3ebf175b40f3\") " Dec 02 10:23:34 crc kubenswrapper[4781]: I1202 10:23:34.228049 4781 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ac589c1e-71f7-423f-b99b-3ebf175b40f3-openstack-config\") pod \"ac589c1e-71f7-423f-b99b-3ebf175b40f3\" (UID: \"ac589c1e-71f7-423f-b99b-3ebf175b40f3\") " Dec 02 10:23:34 crc kubenswrapper[4781]: I1202 10:23:34.228108 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ac589c1e-71f7-423f-b99b-3ebf175b40f3-ca-certs\") pod \"ac589c1e-71f7-423f-b99b-3ebf175b40f3\" (UID: \"ac589c1e-71f7-423f-b99b-3ebf175b40f3\") " Dec 02 10:23:34 crc kubenswrapper[4781]: I1202 10:23:34.228153 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ac589c1e-71f7-423f-b99b-3ebf175b40f3-test-operator-ephemeral-temporary\") pod \"ac589c1e-71f7-423f-b99b-3ebf175b40f3\" (UID: \"ac589c1e-71f7-423f-b99b-3ebf175b40f3\") " Dec 02 10:23:34 crc kubenswrapper[4781]: I1202 10:23:34.228187 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kqsw\" (UniqueName: \"kubernetes.io/projected/ac589c1e-71f7-423f-b99b-3ebf175b40f3-kube-api-access-5kqsw\") pod \"ac589c1e-71f7-423f-b99b-3ebf175b40f3\" (UID: \"ac589c1e-71f7-423f-b99b-3ebf175b40f3\") " Dec 02 10:23:34 crc kubenswrapper[4781]: I1202 10:23:34.229892 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac589c1e-71f7-423f-b99b-3ebf175b40f3-config-data" (OuterVolumeSpecName: "config-data") pod "ac589c1e-71f7-423f-b99b-3ebf175b40f3" (UID: "ac589c1e-71f7-423f-b99b-3ebf175b40f3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 10:23:34 crc kubenswrapper[4781]: I1202 10:23:34.232304 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac589c1e-71f7-423f-b99b-3ebf175b40f3-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "ac589c1e-71f7-423f-b99b-3ebf175b40f3" (UID: "ac589c1e-71f7-423f-b99b-3ebf175b40f3"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 10:23:34 crc kubenswrapper[4781]: I1202 10:23:34.234130 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "test-operator-logs") pod "ac589c1e-71f7-423f-b99b-3ebf175b40f3" (UID: "ac589c1e-71f7-423f-b99b-3ebf175b40f3"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 10:23:34 crc kubenswrapper[4781]: I1202 10:23:34.234199 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac589c1e-71f7-423f-b99b-3ebf175b40f3-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "ac589c1e-71f7-423f-b99b-3ebf175b40f3" (UID: "ac589c1e-71f7-423f-b99b-3ebf175b40f3"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 10:23:34 crc kubenswrapper[4781]: I1202 10:23:34.237844 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac589c1e-71f7-423f-b99b-3ebf175b40f3-kube-api-access-5kqsw" (OuterVolumeSpecName: "kube-api-access-5kqsw") pod "ac589c1e-71f7-423f-b99b-3ebf175b40f3" (UID: "ac589c1e-71f7-423f-b99b-3ebf175b40f3"). InnerVolumeSpecName "kube-api-access-5kqsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 10:23:34 crc kubenswrapper[4781]: I1202 10:23:34.254484 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac589c1e-71f7-423f-b99b-3ebf175b40f3-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "ac589c1e-71f7-423f-b99b-3ebf175b40f3" (UID: "ac589c1e-71f7-423f-b99b-3ebf175b40f3"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 10:23:34 crc kubenswrapper[4781]: I1202 10:23:34.255472 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac589c1e-71f7-423f-b99b-3ebf175b40f3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ac589c1e-71f7-423f-b99b-3ebf175b40f3" (UID: "ac589c1e-71f7-423f-b99b-3ebf175b40f3"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 10:23:34 crc kubenswrapper[4781]: I1202 10:23:34.258472 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac589c1e-71f7-423f-b99b-3ebf175b40f3-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "ac589c1e-71f7-423f-b99b-3ebf175b40f3" (UID: "ac589c1e-71f7-423f-b99b-3ebf175b40f3"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 10:23:34 crc kubenswrapper[4781]: I1202 10:23:34.280970 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac589c1e-71f7-423f-b99b-3ebf175b40f3-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "ac589c1e-71f7-423f-b99b-3ebf175b40f3" (UID: "ac589c1e-71f7-423f-b99b-3ebf175b40f3"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 10:23:34 crc kubenswrapper[4781]: I1202 10:23:34.330103 4781 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ac589c1e-71f7-423f-b99b-3ebf175b40f3-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Dec 02 10:23:34 crc kubenswrapper[4781]: I1202 10:23:34.330402 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kqsw\" (UniqueName: \"kubernetes.io/projected/ac589c1e-71f7-423f-b99b-3ebf175b40f3-kube-api-access-5kqsw\") on node \"crc\" DevicePath \"\"" Dec 02 10:23:34 crc kubenswrapper[4781]: I1202 10:23:34.330498 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ac589c1e-71f7-423f-b99b-3ebf175b40f3-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 10:23:34 crc kubenswrapper[4781]: I1202 10:23:34.330582 4781 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ac589c1e-71f7-423f-b99b-3ebf175b40f3-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Dec 02 10:23:34 crc kubenswrapper[4781]: I1202 10:23:34.330737 4781 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Dec 02 10:23:34 crc kubenswrapper[4781]: I1202 10:23:34.330848 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ac589c1e-71f7-423f-b99b-3ebf175b40f3-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 10:23:34 crc kubenswrapper[4781]: I1202 10:23:34.330952 4781 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ac589c1e-71f7-423f-b99b-3ebf175b40f3-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 02 10:23:34 crc kubenswrapper[4781]: I1202 10:23:34.331041 4781 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ac589c1e-71f7-423f-b99b-3ebf175b40f3-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 02 10:23:34 crc kubenswrapper[4781]: I1202 10:23:34.331125 4781 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ac589c1e-71f7-423f-b99b-3ebf175b40f3-ca-certs\") on node \"crc\" DevicePath \"\"" Dec 02 10:23:34 crc kubenswrapper[4781]: I1202 10:23:34.363808 4781 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Dec 02 10:23:34 crc kubenswrapper[4781]: I1202 10:23:34.433080 4781 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Dec 02 10:23:34 crc kubenswrapper[4781]: I1202 10:23:34.499349 4781 scope.go:117] "RemoveContainer" containerID="3108739d92f37bf346834eaffd456a3eb9cfc18107a661e848468628c375b388" Dec 02 10:23:34 crc kubenswrapper[4781]: E1202 10:23:34.499719 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:23:34 crc kubenswrapper[4781]: I1202 10:23:34.706808 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"ac589c1e-71f7-423f-b99b-3ebf175b40f3","Type":"ContainerDied","Data":"a363c6a8372cbb779cdbc71f0b292476a80cd24b1cd808e08f5e6adaf764366f"} Dec 02 10:23:34 crc kubenswrapper[4781]: I1202 10:23:34.707200 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a363c6a8372cbb779cdbc71f0b292476a80cd24b1cd808e08f5e6adaf764366f" Dec 02 10:23:34 crc kubenswrapper[4781]: I1202 10:23:34.707084 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 02 10:23:44 crc kubenswrapper[4781]: I1202 10:23:44.839477 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 02 10:23:44 crc kubenswrapper[4781]: E1202 10:23:44.840579 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79988a9b-34bb-4395-b2a4-748fa3a546cc" containerName="extract-content" Dec 02 10:23:44 crc kubenswrapper[4781]: I1202 10:23:44.840597 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="79988a9b-34bb-4395-b2a4-748fa3a546cc" containerName="extract-content" Dec 02 10:23:44 crc kubenswrapper[4781]: E1202 10:23:44.840614 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e32d887a-3e2b-4699-bfa0-d79379140ef5" containerName="extract-utilities" Dec 02 10:23:44 crc kubenswrapper[4781]: I1202 10:23:44.840624 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="e32d887a-3e2b-4699-bfa0-d79379140ef5" containerName="extract-utilities" Dec 02 10:23:44 crc kubenswrapper[4781]: E1202 10:23:44.840644 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac589c1e-71f7-423f-b99b-3ebf175b40f3" containerName="tempest-tests-tempest-tests-runner" Dec 02 10:23:44 crc kubenswrapper[4781]: I1202 10:23:44.840696 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac589c1e-71f7-423f-b99b-3ebf175b40f3" containerName="tempest-tests-tempest-tests-runner" Dec 02 10:23:44 crc kubenswrapper[4781]: E1202 10:23:44.840715 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79988a9b-34bb-4395-b2a4-748fa3a546cc" containerName="registry-server" Dec 02 10:23:44 crc kubenswrapper[4781]: I1202 10:23:44.840723 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="79988a9b-34bb-4395-b2a4-748fa3a546cc" containerName="registry-server" Dec 02 10:23:44 crc kubenswrapper[4781]: E1202 10:23:44.840748 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79988a9b-34bb-4395-b2a4-748fa3a546cc" containerName="extract-utilities" Dec 02 10:23:44 crc kubenswrapper[4781]: I1202 10:23:44.840756 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="79988a9b-34bb-4395-b2a4-748fa3a546cc" containerName="extract-utilities" Dec 02 10:23:44 crc kubenswrapper[4781]: E1202 10:23:44.840778 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e32d887a-3e2b-4699-bfa0-d79379140ef5" containerName="registry-server" Dec 02 10:23:44 crc kubenswrapper[4781]: I1202 10:23:44.840785 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="e32d887a-3e2b-4699-bfa0-d79379140ef5" containerName="registry-server" Dec 02 10:23:44 crc kubenswrapper[4781]: E1202 10:23:44.840805 4781 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="e32d887a-3e2b-4699-bfa0-d79379140ef5" containerName="extract-content" Dec 02 10:23:44 crc kubenswrapper[4781]: I1202 10:23:44.840814 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="e32d887a-3e2b-4699-bfa0-d79379140ef5" containerName="extract-content" Dec 02 10:23:44 crc kubenswrapper[4781]: I1202 10:23:44.841067 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="79988a9b-34bb-4395-b2a4-748fa3a546cc" containerName="registry-server" Dec 02 10:23:44 crc kubenswrapper[4781]: I1202 10:23:44.841080 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac589c1e-71f7-423f-b99b-3ebf175b40f3" containerName="tempest-tests-tempest-tests-runner" Dec 02 10:23:44 crc kubenswrapper[4781]: I1202 10:23:44.841094 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="e32d887a-3e2b-4699-bfa0-d79379140ef5" containerName="registry-server" Dec 02 10:23:44 crc kubenswrapper[4781]: I1202 10:23:44.841833 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 02 10:23:44 crc kubenswrapper[4781]: I1202 10:23:44.843914 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-zm7cr" Dec 02 10:23:44 crc kubenswrapper[4781]: I1202 10:23:44.855007 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 02 10:23:45 crc kubenswrapper[4781]: I1202 10:23:45.023686 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"45f4fae7-b5a7-4c3c-845e-4f2daed0a787\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 02 10:23:45 crc kubenswrapper[4781]: I1202 10:23:45.024392 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcq2c\" (UniqueName: \"kubernetes.io/projected/45f4fae7-b5a7-4c3c-845e-4f2daed0a787-kube-api-access-xcq2c\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"45f4fae7-b5a7-4c3c-845e-4f2daed0a787\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 02 10:23:45 crc kubenswrapper[4781]: I1202 10:23:45.127133 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"45f4fae7-b5a7-4c3c-845e-4f2daed0a787\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 02 10:23:45 crc kubenswrapper[4781]: I1202 10:23:45.127209 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcq2c\" (UniqueName: \"kubernetes.io/projected/45f4fae7-b5a7-4c3c-845e-4f2daed0a787-kube-api-access-xcq2c\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"45f4fae7-b5a7-4c3c-845e-4f2daed0a787\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 02 10:23:45 crc kubenswrapper[4781]: I1202 10:23:45.127860 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: 
\"45f4fae7-b5a7-4c3c-845e-4f2daed0a787\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 02 10:23:45 crc kubenswrapper[4781]: I1202 10:23:45.153973 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcq2c\" (UniqueName: \"kubernetes.io/projected/45f4fae7-b5a7-4c3c-845e-4f2daed0a787-kube-api-access-xcq2c\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"45f4fae7-b5a7-4c3c-845e-4f2daed0a787\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 02 10:23:45 crc kubenswrapper[4781]: I1202 10:23:45.157731 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"45f4fae7-b5a7-4c3c-845e-4f2daed0a787\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 02 10:23:45 crc kubenswrapper[4781]: I1202 10:23:45.167173 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 02 10:23:45 crc kubenswrapper[4781]: I1202 10:23:45.669207 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 02 10:23:45 crc kubenswrapper[4781]: I1202 10:23:45.805314 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"45f4fae7-b5a7-4c3c-845e-4f2daed0a787","Type":"ContainerStarted","Data":"ebf6fa8188a4510ca5c0497d9249d8389cb8f4278ca9208feaf166da3f64855e"} Dec 02 10:23:47 crc kubenswrapper[4781]: I1202 10:23:47.838597 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"45f4fae7-b5a7-4c3c-845e-4f2daed0a787","Type":"ContainerStarted","Data":"7341a3001de04cc873fb97e8bb55a894b6c1d8c5492bb8bf36afc9a07fe0076b"} Dec 02 10:23:47 crc kubenswrapper[4781]: I1202 10:23:47.865334 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.893172173 podStartE2EDuration="3.865316783s" podCreationTimestamp="2025-12-02 10:23:44 +0000 UTC" firstStartedPulling="2025-12-02 10:23:45.676338719 +0000 UTC m=+3788.500212608" lastFinishedPulling="2025-12-02 10:23:46.648483329 +0000 UTC m=+3789.472357218" observedRunningTime="2025-12-02 10:23:47.858369236 +0000 UTC m=+3790.682243135" watchObservedRunningTime="2025-12-02 10:23:47.865316783 +0000 UTC m=+3790.689190662" Dec 02 10:23:49 crc kubenswrapper[4781]: I1202 10:23:49.500358 4781 scope.go:117] "RemoveContainer" containerID="3108739d92f37bf346834eaffd456a3eb9cfc18107a661e848468628c375b388" Dec 02 10:23:49 crc kubenswrapper[4781]: E1202 10:23:49.501098 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:24:02 crc kubenswrapper[4781]: I1202 10:24:02.499844 4781 scope.go:117] "RemoveContainer" 
containerID="3108739d92f37bf346834eaffd456a3eb9cfc18107a661e848468628c375b388" Dec 02 10:24:02 crc kubenswrapper[4781]: E1202 10:24:02.501016 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:24:10 crc kubenswrapper[4781]: I1202 10:24:10.513735 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-n5ldq/must-gather-m5gsv"] Dec 02 10:24:10 crc kubenswrapper[4781]: I1202 10:24:10.516173 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n5ldq/must-gather-m5gsv" Dec 02 10:24:10 crc kubenswrapper[4781]: I1202 10:24:10.518022 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-n5ldq"/"kube-root-ca.crt" Dec 02 10:24:10 crc kubenswrapper[4781]: I1202 10:24:10.518292 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-n5ldq"/"default-dockercfg-nw9vk" Dec 02 10:24:10 crc kubenswrapper[4781]: I1202 10:24:10.518528 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-n5ldq"/"openshift-service-ca.crt" Dec 02 10:24:10 crc kubenswrapper[4781]: I1202 10:24:10.522168 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-n5ldq/must-gather-m5gsv"] Dec 02 10:24:10 crc kubenswrapper[4781]: I1202 10:24:10.660439 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz5g9\" (UniqueName: \"kubernetes.io/projected/f89edb07-26fd-490c-889b-8612f8d4ed68-kube-api-access-bz5g9\") pod \"must-gather-m5gsv\" (UID: \"f89edb07-26fd-490c-889b-8612f8d4ed68\") " pod="openshift-must-gather-n5ldq/must-gather-m5gsv" Dec 02 10:24:10 crc kubenswrapper[4781]: I1202 10:24:10.660516 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f89edb07-26fd-490c-889b-8612f8d4ed68-must-gather-output\") pod \"must-gather-m5gsv\" (UID: \"f89edb07-26fd-490c-889b-8612f8d4ed68\") " pod="openshift-must-gather-n5ldq/must-gather-m5gsv" Dec 02 10:24:10 crc kubenswrapper[4781]: I1202 10:24:10.762718 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz5g9\" (UniqueName: \"kubernetes.io/projected/f89edb07-26fd-490c-889b-8612f8d4ed68-kube-api-access-bz5g9\") pod \"must-gather-m5gsv\" (UID: \"f89edb07-26fd-490c-889b-8612f8d4ed68\") " pod="openshift-must-gather-n5ldq/must-gather-m5gsv" Dec 02 10:24:10 crc kubenswrapper[4781]: I1202 10:24:10.763049 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f89edb07-26fd-490c-889b-8612f8d4ed68-must-gather-output\") pod \"must-gather-m5gsv\" (UID: \"f89edb07-26fd-490c-889b-8612f8d4ed68\") " pod="openshift-must-gather-n5ldq/must-gather-m5gsv" Dec 02 10:24:10 crc kubenswrapper[4781]: I1202 10:24:10.763457 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f89edb07-26fd-490c-889b-8612f8d4ed68-must-gather-output\") pod 
\"must-gather-m5gsv\" (UID: \"f89edb07-26fd-490c-889b-8612f8d4ed68\") " pod="openshift-must-gather-n5ldq/must-gather-m5gsv" Dec 02 10:24:10 crc kubenswrapper[4781]: I1202 10:24:10.799323 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz5g9\" (UniqueName: \"kubernetes.io/projected/f89edb07-26fd-490c-889b-8612f8d4ed68-kube-api-access-bz5g9\") pod \"must-gather-m5gsv\" (UID: \"f89edb07-26fd-490c-889b-8612f8d4ed68\") " pod="openshift-must-gather-n5ldq/must-gather-m5gsv" Dec 02 10:24:10 crc kubenswrapper[4781]: I1202 10:24:10.841333 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n5ldq/must-gather-m5gsv" Dec 02 10:24:11 crc kubenswrapper[4781]: I1202 10:24:11.272162 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-n5ldq/must-gather-m5gsv"] Dec 02 10:24:12 crc kubenswrapper[4781]: I1202 10:24:12.083649 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n5ldq/must-gather-m5gsv" event={"ID":"f89edb07-26fd-490c-889b-8612f8d4ed68","Type":"ContainerStarted","Data":"9a77d2d22a023fc36c927560866fddca20345dc32d53ac427ce06fdb31ece9db"} Dec 02 10:24:15 crc kubenswrapper[4781]: I1202 10:24:15.500059 4781 scope.go:117] "RemoveContainer" containerID="3108739d92f37bf346834eaffd456a3eb9cfc18107a661e848468628c375b388" Dec 02 10:24:15 crc kubenswrapper[4781]: E1202 10:24:15.500791 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:24:16 crc kubenswrapper[4781]: I1202 10:24:16.138091 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n5ldq/must-gather-m5gsv" event={"ID":"f89edb07-26fd-490c-889b-8612f8d4ed68","Type":"ContainerStarted","Data":"c1c90d5e0a938a9becaf86a86ed99d115a2d9867fe14ddcc6a6bbca5c2b98bc7"} Dec 02 10:24:16 crc kubenswrapper[4781]: I1202 10:24:16.138829 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n5ldq/must-gather-m5gsv" event={"ID":"f89edb07-26fd-490c-889b-8612f8d4ed68","Type":"ContainerStarted","Data":"29006a3908e026567d81359f3a6b4653f56a2b36a7ce76e7225ec1fda35bbcc8"} Dec 02 10:24:16 crc kubenswrapper[4781]: I1202 10:24:16.159385 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-n5ldq/must-gather-m5gsv" podStartSLOduration=2.431882379 podStartE2EDuration="6.159366212s" podCreationTimestamp="2025-12-02 10:24:10 +0000 UTC" firstStartedPulling="2025-12-02 10:24:11.267033754 +0000 UTC m=+3814.090907643" lastFinishedPulling="2025-12-02 10:24:14.994517597 +0000 UTC m=+3817.818391476" observedRunningTime="2025-12-02 10:24:16.152826256 +0000 UTC m=+3818.976700155" watchObservedRunningTime="2025-12-02 10:24:16.159366212 +0000 UTC m=+3818.983240091" Dec 02 10:24:18 crc kubenswrapper[4781]: E1202 10:24:18.707114 4781 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.194:34558->38.102.83.194:45901: write tcp 38.102.83.194:34558->38.102.83.194:45901: write: broken pipe Dec 02 10:24:19 crc kubenswrapper[4781]: I1202 10:24:19.134555 4781 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-n5ldq/crc-debug-zzn76"] Dec 02 10:24:19 crc kubenswrapper[4781]: I1202 10:24:19.135817 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n5ldq/crc-debug-zzn76" Dec 02 10:24:19 crc kubenswrapper[4781]: I1202 10:24:19.316158 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjqh5\" (UniqueName: \"kubernetes.io/projected/f42bd2c8-6f92-462a-831c-7a55e2aee8ee-kube-api-access-xjqh5\") pod \"crc-debug-zzn76\" (UID: \"f42bd2c8-6f92-462a-831c-7a55e2aee8ee\") " pod="openshift-must-gather-n5ldq/crc-debug-zzn76" Dec 02 10:24:19 crc kubenswrapper[4781]: I1202 10:24:19.316322 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f42bd2c8-6f92-462a-831c-7a55e2aee8ee-host\") pod \"crc-debug-zzn76\" (UID: \"f42bd2c8-6f92-462a-831c-7a55e2aee8ee\") " pod="openshift-must-gather-n5ldq/crc-debug-zzn76" Dec 02 10:24:19 crc kubenswrapper[4781]: I1202 10:24:19.418048 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f42bd2c8-6f92-462a-831c-7a55e2aee8ee-host\") pod \"crc-debug-zzn76\" (UID: \"f42bd2c8-6f92-462a-831c-7a55e2aee8ee\") " pod="openshift-must-gather-n5ldq/crc-debug-zzn76" Dec 02 10:24:19 crc kubenswrapper[4781]: I1202 10:24:19.418161 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjqh5\" (UniqueName: \"kubernetes.io/projected/f42bd2c8-6f92-462a-831c-7a55e2aee8ee-kube-api-access-xjqh5\") pod \"crc-debug-zzn76\" (UID: \"f42bd2c8-6f92-462a-831c-7a55e2aee8ee\") " pod="openshift-must-gather-n5ldq/crc-debug-zzn76" Dec 02 10:24:19 crc kubenswrapper[4781]: I1202 10:24:19.418222 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f42bd2c8-6f92-462a-831c-7a55e2aee8ee-host\") pod \"crc-debug-zzn76\" (UID: \"f42bd2c8-6f92-462a-831c-7a55e2aee8ee\") " pod="openshift-must-gather-n5ldq/crc-debug-zzn76" Dec 02 10:24:19 crc kubenswrapper[4781]: I1202 10:24:19.449475 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjqh5\" (UniqueName: \"kubernetes.io/projected/f42bd2c8-6f92-462a-831c-7a55e2aee8ee-kube-api-access-xjqh5\") pod \"crc-debug-zzn76\" (UID: \"f42bd2c8-6f92-462a-831c-7a55e2aee8ee\") " pod="openshift-must-gather-n5ldq/crc-debug-zzn76" Dec 02 10:24:19 crc kubenswrapper[4781]: I1202 10:24:19.456726 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n5ldq/crc-debug-zzn76" Dec 02 10:24:20 crc kubenswrapper[4781]: I1202 10:24:20.175144 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n5ldq/crc-debug-zzn76" event={"ID":"f42bd2c8-6f92-462a-831c-7a55e2aee8ee","Type":"ContainerStarted","Data":"a68743bf4c186c98baa036dfde05d42a3aaf728dc24e4e321333acffde95c428"} Dec 02 10:24:29 crc kubenswrapper[4781]: I1202 10:24:29.499854 4781 scope.go:117] "RemoveContainer" containerID="3108739d92f37bf346834eaffd456a3eb9cfc18107a661e848468628c375b388" Dec 02 10:24:29 crc kubenswrapper[4781]: E1202 10:24:29.500656 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:24:31 crc kubenswrapper[4781]: I1202 10:24:31.303589 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n5ldq/crc-debug-zzn76" event={"ID":"f42bd2c8-6f92-462a-831c-7a55e2aee8ee","Type":"ContainerStarted","Data":"48e33f1a84dc6ab5bd055c0a23da99040bff417919880b46851280f6d42b4395"} Dec 02 10:24:31 crc kubenswrapper[4781]: I1202 10:24:31.327209 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-n5ldq/crc-debug-zzn76" podStartSLOduration=1.398930285 podStartE2EDuration="12.327191492s" podCreationTimestamp="2025-12-02 10:24:19 +0000 UTC" firstStartedPulling="2025-12-02 10:24:19.500257722 +0000 UTC m=+3822.324131601" lastFinishedPulling="2025-12-02 10:24:30.428518929 +0000 UTC m=+3833.252392808" observedRunningTime="2025-12-02 10:24:31.323258146 +0000 UTC m=+3834.147132025" watchObservedRunningTime="2025-12-02 10:24:31.327191492 +0000 UTC m=+3834.151065371" Dec 02 10:24:40 crc kubenswrapper[4781]: I1202 10:24:40.500915 4781 scope.go:117] "RemoveContainer" containerID="3108739d92f37bf346834eaffd456a3eb9cfc18107a661e848468628c375b388" Dec 02 10:24:40 crc kubenswrapper[4781]: E1202 10:24:40.501629 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:24:52 crc kubenswrapper[4781]: I1202 10:24:52.499795 4781 scope.go:117] "RemoveContainer" containerID="3108739d92f37bf346834eaffd456a3eb9cfc18107a661e848468628c375b388" Dec 02 10:24:52 crc kubenswrapper[4781]: E1202 10:24:52.500583 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:25:04 crc kubenswrapper[4781]: I1202 10:25:04.501050 4781 scope.go:117] "RemoveContainer" containerID="3108739d92f37bf346834eaffd456a3eb9cfc18107a661e848468628c375b388" Dec 02 
10:25:04 crc kubenswrapper[4781]: E1202 10:25:04.503784 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:25:08 crc kubenswrapper[4781]: I1202 10:25:08.626069 4781 generic.go:334] "Generic (PLEG): container finished" podID="f42bd2c8-6f92-462a-831c-7a55e2aee8ee" containerID="48e33f1a84dc6ab5bd055c0a23da99040bff417919880b46851280f6d42b4395" exitCode=0 Dec 02 10:25:08 crc kubenswrapper[4781]: I1202 10:25:08.627247 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n5ldq/crc-debug-zzn76" event={"ID":"f42bd2c8-6f92-462a-831c-7a55e2aee8ee","Type":"ContainerDied","Data":"48e33f1a84dc6ab5bd055c0a23da99040bff417919880b46851280f6d42b4395"} Dec 02 10:25:09 crc kubenswrapper[4781]: I1202 10:25:09.745187 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n5ldq/crc-debug-zzn76" Dec 02 10:25:09 crc kubenswrapper[4781]: I1202 10:25:09.777004 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-n5ldq/crc-debug-zzn76"] Dec 02 10:25:09 crc kubenswrapper[4781]: I1202 10:25:09.786436 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-n5ldq/crc-debug-zzn76"] Dec 02 10:25:09 crc kubenswrapper[4781]: I1202 10:25:09.859813 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f42bd2c8-6f92-462a-831c-7a55e2aee8ee-host\") pod \"f42bd2c8-6f92-462a-831c-7a55e2aee8ee\" (UID: \"f42bd2c8-6f92-462a-831c-7a55e2aee8ee\") " Dec 02 10:25:09 crc kubenswrapper[4781]: I1202 10:25:09.859906 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f42bd2c8-6f92-462a-831c-7a55e2aee8ee-host" (OuterVolumeSpecName: "host") pod "f42bd2c8-6f92-462a-831c-7a55e2aee8ee" (UID: "f42bd2c8-6f92-462a-831c-7a55e2aee8ee"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 10:25:09 crc kubenswrapper[4781]: I1202 10:25:09.860016 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjqh5\" (UniqueName: \"kubernetes.io/projected/f42bd2c8-6f92-462a-831c-7a55e2aee8ee-kube-api-access-xjqh5\") pod \"f42bd2c8-6f92-462a-831c-7a55e2aee8ee\" (UID: \"f42bd2c8-6f92-462a-831c-7a55e2aee8ee\") " Dec 02 10:25:09 crc kubenswrapper[4781]: I1202 10:25:09.861335 4781 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f42bd2c8-6f92-462a-831c-7a55e2aee8ee-host\") on node \"crc\" DevicePath \"\"" Dec 02 10:25:09 crc kubenswrapper[4781]: I1202 10:25:09.866681 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f42bd2c8-6f92-462a-831c-7a55e2aee8ee-kube-api-access-xjqh5" (OuterVolumeSpecName: "kube-api-access-xjqh5") pod "f42bd2c8-6f92-462a-831c-7a55e2aee8ee" (UID: "f42bd2c8-6f92-462a-831c-7a55e2aee8ee"). InnerVolumeSpecName "kube-api-access-xjqh5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 10:25:09 crc kubenswrapper[4781]: I1202 10:25:09.963262 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjqh5\" (UniqueName: \"kubernetes.io/projected/f42bd2c8-6f92-462a-831c-7a55e2aee8ee-kube-api-access-xjqh5\") on node \"crc\" DevicePath \"\"" Dec 02 10:25:10 crc kubenswrapper[4781]: I1202 10:25:10.644589 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a68743bf4c186c98baa036dfde05d42a3aaf728dc24e4e321333acffde95c428" Dec 02 10:25:10 crc kubenswrapper[4781]: I1202 10:25:10.644696 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n5ldq/crc-debug-zzn76" Dec 02 10:25:10 crc kubenswrapper[4781]: I1202 10:25:10.997483 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-n5ldq/crc-debug-7ctvh"] Dec 02 10:25:10 crc kubenswrapper[4781]: E1202 10:25:10.998187 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f42bd2c8-6f92-462a-831c-7a55e2aee8ee" containerName="container-00" Dec 02 10:25:10 crc kubenswrapper[4781]: I1202 10:25:10.998220 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f42bd2c8-6f92-462a-831c-7a55e2aee8ee" containerName="container-00" Dec 02 10:25:10 crc kubenswrapper[4781]: I1202 10:25:10.998516 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f42bd2c8-6f92-462a-831c-7a55e2aee8ee" containerName="container-00" Dec 02 10:25:10 crc kubenswrapper[4781]: I1202 10:25:10.999359 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n5ldq/crc-debug-7ctvh" Dec 02 10:25:11 crc kubenswrapper[4781]: I1202 10:25:11.082126 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfxf7\" (UniqueName: \"kubernetes.io/projected/0a59fb00-6459-4d15-8a83-25d1c535d0e5-kube-api-access-hfxf7\") pod \"crc-debug-7ctvh\" (UID: \"0a59fb00-6459-4d15-8a83-25d1c535d0e5\") " pod="openshift-must-gather-n5ldq/crc-debug-7ctvh" Dec 02 10:25:11 crc kubenswrapper[4781]: I1202 10:25:11.082198 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0a59fb00-6459-4d15-8a83-25d1c535d0e5-host\") pod \"crc-debug-7ctvh\" (UID: \"0a59fb00-6459-4d15-8a83-25d1c535d0e5\") " pod="openshift-must-gather-n5ldq/crc-debug-7ctvh" Dec 02 10:25:11 crc kubenswrapper[4781]: I1202 10:25:11.184540 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfxf7\" (UniqueName: \"kubernetes.io/projected/0a59fb00-6459-4d15-8a83-25d1c535d0e5-kube-api-access-hfxf7\") pod \"crc-debug-7ctvh\" (UID: \"0a59fb00-6459-4d15-8a83-25d1c535d0e5\") " pod="openshift-must-gather-n5ldq/crc-debug-7ctvh" Dec 02 10:25:11 crc kubenswrapper[4781]: I1202 10:25:11.184587 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0a59fb00-6459-4d15-8a83-25d1c535d0e5-host\") pod \"crc-debug-7ctvh\" (UID: \"0a59fb00-6459-4d15-8a83-25d1c535d0e5\") " pod="openshift-must-gather-n5ldq/crc-debug-7ctvh" Dec 02 10:25:11 crc kubenswrapper[4781]: I1202 10:25:11.184654 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0a59fb00-6459-4d15-8a83-25d1c535d0e5-host\") pod \"crc-debug-7ctvh\" (UID: \"0a59fb00-6459-4d15-8a83-25d1c535d0e5\") " 
pod="openshift-must-gather-n5ldq/crc-debug-7ctvh" Dec 02 10:25:11 crc kubenswrapper[4781]: I1202 10:25:11.206515 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfxf7\" (UniqueName: \"kubernetes.io/projected/0a59fb00-6459-4d15-8a83-25d1c535d0e5-kube-api-access-hfxf7\") pod \"crc-debug-7ctvh\" (UID: \"0a59fb00-6459-4d15-8a83-25d1c535d0e5\") " pod="openshift-must-gather-n5ldq/crc-debug-7ctvh" Dec 02 10:25:11 crc kubenswrapper[4781]: I1202 10:25:11.317486 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n5ldq/crc-debug-7ctvh" Dec 02 10:25:11 crc kubenswrapper[4781]: I1202 10:25:11.513025 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f42bd2c8-6f92-462a-831c-7a55e2aee8ee" path="/var/lib/kubelet/pods/f42bd2c8-6f92-462a-831c-7a55e2aee8ee/volumes" Dec 02 10:25:11 crc kubenswrapper[4781]: I1202 10:25:11.653452 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n5ldq/crc-debug-7ctvh" event={"ID":"0a59fb00-6459-4d15-8a83-25d1c535d0e5","Type":"ContainerStarted","Data":"04e82ab4da5d7debaaa786be9b3fecb66ff1e0bd209fcdb4ad3a26029d205fc5"} Dec 02 10:25:12 crc kubenswrapper[4781]: I1202 10:25:12.665306 4781 generic.go:334] "Generic (PLEG): container finished" podID="0a59fb00-6459-4d15-8a83-25d1c535d0e5" containerID="fe64462a9cfdf013a419118b30b9c144226bf54aed424c34d83b58309ebe15ce" exitCode=0 Dec 02 10:25:12 crc kubenswrapper[4781]: I1202 10:25:12.665355 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n5ldq/crc-debug-7ctvh" event={"ID":"0a59fb00-6459-4d15-8a83-25d1c535d0e5","Type":"ContainerDied","Data":"fe64462a9cfdf013a419118b30b9c144226bf54aed424c34d83b58309ebe15ce"} Dec 02 10:25:13 crc kubenswrapper[4781]: I1202 10:25:13.119169 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-n5ldq/crc-debug-7ctvh"] Dec 02 10:25:13 crc kubenswrapper[4781]: I1202 10:25:13.127475 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-n5ldq/crc-debug-7ctvh"] Dec 02 10:25:13 crc kubenswrapper[4781]: I1202 10:25:13.796964 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n5ldq/crc-debug-7ctvh" Dec 02 10:25:13 crc kubenswrapper[4781]: I1202 10:25:13.935191 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfxf7\" (UniqueName: \"kubernetes.io/projected/0a59fb00-6459-4d15-8a83-25d1c535d0e5-kube-api-access-hfxf7\") pod \"0a59fb00-6459-4d15-8a83-25d1c535d0e5\" (UID: \"0a59fb00-6459-4d15-8a83-25d1c535d0e5\") " Dec 02 10:25:13 crc kubenswrapper[4781]: I1202 10:25:13.935759 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0a59fb00-6459-4d15-8a83-25d1c535d0e5-host\") pod \"0a59fb00-6459-4d15-8a83-25d1c535d0e5\" (UID: \"0a59fb00-6459-4d15-8a83-25d1c535d0e5\") " Dec 02 10:25:13 crc kubenswrapper[4781]: I1202 10:25:13.935869 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0a59fb00-6459-4d15-8a83-25d1c535d0e5-host" (OuterVolumeSpecName: "host") pod "0a59fb00-6459-4d15-8a83-25d1c535d0e5" (UID: "0a59fb00-6459-4d15-8a83-25d1c535d0e5"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 10:25:13 crc kubenswrapper[4781]: I1202 10:25:13.936453 4781 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0a59fb00-6459-4d15-8a83-25d1c535d0e5-host\") on node \"crc\" DevicePath \"\"" Dec 02 10:25:13 crc kubenswrapper[4781]: I1202 10:25:13.946303 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a59fb00-6459-4d15-8a83-25d1c535d0e5-kube-api-access-hfxf7" (OuterVolumeSpecName: "kube-api-access-hfxf7") pod "0a59fb00-6459-4d15-8a83-25d1c535d0e5" (UID: "0a59fb00-6459-4d15-8a83-25d1c535d0e5"). InnerVolumeSpecName "kube-api-access-hfxf7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 10:25:14 crc kubenswrapper[4781]: I1202 10:25:14.038852 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfxf7\" (UniqueName: \"kubernetes.io/projected/0a59fb00-6459-4d15-8a83-25d1c535d0e5-kube-api-access-hfxf7\") on node \"crc\" DevicePath \"\"" Dec 02 10:25:14 crc kubenswrapper[4781]: I1202 10:25:14.325959 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-n5ldq/crc-debug-pdh44"] Dec 02 10:25:14 crc kubenswrapper[4781]: E1202 10:25:14.327226 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a59fb00-6459-4d15-8a83-25d1c535d0e5" containerName="container-00" Dec 02 10:25:14 crc kubenswrapper[4781]: I1202 10:25:14.327270 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a59fb00-6459-4d15-8a83-25d1c535d0e5" containerName="container-00" Dec 02 10:25:14 crc kubenswrapper[4781]: I1202 10:25:14.327801 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a59fb00-6459-4d15-8a83-25d1c535d0e5" containerName="container-00" Dec 02 10:25:14 crc kubenswrapper[4781]: I1202 10:25:14.329180 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n5ldq/crc-debug-pdh44" Dec 02 10:25:14 crc kubenswrapper[4781]: I1202 10:25:14.348390 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bbc75166-d9b3-4162-9511-b8df894bb552-host\") pod \"crc-debug-pdh44\" (UID: \"bbc75166-d9b3-4162-9511-b8df894bb552\") " pod="openshift-must-gather-n5ldq/crc-debug-pdh44" Dec 02 10:25:14 crc kubenswrapper[4781]: I1202 10:25:14.348641 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snsvg\" (UniqueName: \"kubernetes.io/projected/bbc75166-d9b3-4162-9511-b8df894bb552-kube-api-access-snsvg\") pod \"crc-debug-pdh44\" (UID: \"bbc75166-d9b3-4162-9511-b8df894bb552\") " pod="openshift-must-gather-n5ldq/crc-debug-pdh44" Dec 02 10:25:14 crc kubenswrapper[4781]: I1202 10:25:14.451016 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bbc75166-d9b3-4162-9511-b8df894bb552-host\") pod \"crc-debug-pdh44\" (UID: \"bbc75166-d9b3-4162-9511-b8df894bb552\") " pod="openshift-must-gather-n5ldq/crc-debug-pdh44" Dec 02 10:25:14 crc kubenswrapper[4781]: I1202 10:25:14.451154 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bbc75166-d9b3-4162-9511-b8df894bb552-host\") pod \"crc-debug-pdh44\" (UID: \"bbc75166-d9b3-4162-9511-b8df894bb552\") " pod="openshift-must-gather-n5ldq/crc-debug-pdh44" Dec 02 10:25:14 crc kubenswrapper[4781]: I1202 10:25:14.451171 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snsvg\" (UniqueName: \"kubernetes.io/projected/bbc75166-d9b3-4162-9511-b8df894bb552-kube-api-access-snsvg\") pod \"crc-debug-pdh44\" (UID: \"bbc75166-d9b3-4162-9511-b8df894bb552\") " pod="openshift-must-gather-n5ldq/crc-debug-pdh44" Dec 02 10:25:14 crc kubenswrapper[4781]: I1202 10:25:14.480462 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snsvg\" (UniqueName: \"kubernetes.io/projected/bbc75166-d9b3-4162-9511-b8df894bb552-kube-api-access-snsvg\") pod \"crc-debug-pdh44\" (UID: \"bbc75166-d9b3-4162-9511-b8df894bb552\") " pod="openshift-must-gather-n5ldq/crc-debug-pdh44" Dec 02 10:25:14 crc kubenswrapper[4781]: I1202 10:25:14.654641 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n5ldq/crc-debug-pdh44" Dec 02 10:25:14 crc kubenswrapper[4781]: I1202 10:25:14.684267 4781 scope.go:117] "RemoveContainer" containerID="fe64462a9cfdf013a419118b30b9c144226bf54aed424c34d83b58309ebe15ce" Dec 02 10:25:14 crc kubenswrapper[4781]: I1202 10:25:14.684499 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n5ldq/crc-debug-7ctvh" Dec 02 10:25:15 crc kubenswrapper[4781]: I1202 10:25:15.499302 4781 scope.go:117] "RemoveContainer" containerID="3108739d92f37bf346834eaffd456a3eb9cfc18107a661e848468628c375b388" Dec 02 10:25:15 crc kubenswrapper[4781]: E1202 10:25:15.499874 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:25:15 crc kubenswrapper[4781]: I1202 10:25:15.509709 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a59fb00-6459-4d15-8a83-25d1c535d0e5" path="/var/lib/kubelet/pods/0a59fb00-6459-4d15-8a83-25d1c535d0e5/volumes" Dec 02 10:25:15 crc kubenswrapper[4781]: I1202 10:25:15.701586 4781 generic.go:334] "Generic (PLEG): container finished" podID="bbc75166-d9b3-4162-9511-b8df894bb552" containerID="e9968c9d67c15f086170dfe6ba2d3575fb7a4c9d654a202602e6375331371ff2" exitCode=0 Dec 02 10:25:15 crc kubenswrapper[4781]: I1202 10:25:15.701627 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n5ldq/crc-debug-pdh44" event={"ID":"bbc75166-d9b3-4162-9511-b8df894bb552","Type":"ContainerDied","Data":"e9968c9d67c15f086170dfe6ba2d3575fb7a4c9d654a202602e6375331371ff2"} Dec 02 10:25:15 crc kubenswrapper[4781]: I1202 10:25:15.701650 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n5ldq/crc-debug-pdh44" event={"ID":"bbc75166-d9b3-4162-9511-b8df894bb552","Type":"ContainerStarted","Data":"f29255a3405fdbd44013c0d7b67a2ed70bd345076974825ee1604ea2eb482fa3"} Dec 02 10:25:15 crc kubenswrapper[4781]: I1202 10:25:15.753549 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-n5ldq/crc-debug-pdh44"] Dec 02 10:25:15 crc kubenswrapper[4781]: I1202 10:25:15.765053 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-n5ldq/crc-debug-pdh44"] Dec 02 10:25:16 crc kubenswrapper[4781]: I1202 10:25:16.816449 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n5ldq/crc-debug-pdh44" Dec 02 10:25:16 crc kubenswrapper[4781]: I1202 10:25:16.995385 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bbc75166-d9b3-4162-9511-b8df894bb552-host\") pod \"bbc75166-d9b3-4162-9511-b8df894bb552\" (UID: \"bbc75166-d9b3-4162-9511-b8df894bb552\") " Dec 02 10:25:16 crc kubenswrapper[4781]: I1202 10:25:16.995530 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bbc75166-d9b3-4162-9511-b8df894bb552-host" (OuterVolumeSpecName: "host") pod "bbc75166-d9b3-4162-9511-b8df894bb552" (UID: "bbc75166-d9b3-4162-9511-b8df894bb552"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 10:25:16 crc kubenswrapper[4781]: I1202 10:25:16.995687 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snsvg\" (UniqueName: \"kubernetes.io/projected/bbc75166-d9b3-4162-9511-b8df894bb552-kube-api-access-snsvg\") pod \"bbc75166-d9b3-4162-9511-b8df894bb552\" (UID: \"bbc75166-d9b3-4162-9511-b8df894bb552\") " Dec 02 10:25:16 crc kubenswrapper[4781]: I1202 10:25:16.996093 4781 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bbc75166-d9b3-4162-9511-b8df894bb552-host\") on node \"crc\" DevicePath \"\"" Dec 02 10:25:17 crc kubenswrapper[4781]: I1202 10:25:17.002734 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbc75166-d9b3-4162-9511-b8df894bb552-kube-api-access-snsvg" (OuterVolumeSpecName: "kube-api-access-snsvg") pod "bbc75166-d9b3-4162-9511-b8df894bb552" (UID: "bbc75166-d9b3-4162-9511-b8df894bb552"). InnerVolumeSpecName "kube-api-access-snsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 10:25:17 crc kubenswrapper[4781]: I1202 10:25:17.097463 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snsvg\" (UniqueName: \"kubernetes.io/projected/bbc75166-d9b3-4162-9511-b8df894bb552-kube-api-access-snsvg\") on node \"crc\" DevicePath \"\"" Dec 02 10:25:17 crc kubenswrapper[4781]: I1202 10:25:17.510796 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbc75166-d9b3-4162-9511-b8df894bb552" path="/var/lib/kubelet/pods/bbc75166-d9b3-4162-9511-b8df894bb552/volumes" Dec 02 10:25:17 crc kubenswrapper[4781]: I1202 10:25:17.730067 4781 scope.go:117] "RemoveContainer" containerID="e9968c9d67c15f086170dfe6ba2d3575fb7a4c9d654a202602e6375331371ff2" Dec 02 10:25:17 crc kubenswrapper[4781]: I1202 10:25:17.730087 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n5ldq/crc-debug-pdh44" Dec 02 10:25:28 crc kubenswrapper[4781]: I1202 10:25:28.499690 4781 scope.go:117] "RemoveContainer" containerID="3108739d92f37bf346834eaffd456a3eb9cfc18107a661e848468628c375b388" Dec 02 10:25:28 crc kubenswrapper[4781]: E1202 10:25:28.500684 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:25:31 crc kubenswrapper[4781]: I1202 10:25:31.261110 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-f677cfc78-nzcxt_17137e34-c042-4c3b-b11b-3e743e2a00b5/barbican-api/0.log" Dec 02 10:25:31 crc kubenswrapper[4781]: I1202 10:25:31.304843 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-f677cfc78-nzcxt_17137e34-c042-4c3b-b11b-3e743e2a00b5/barbican-api-log/0.log" Dec 02 10:25:31 crc kubenswrapper[4781]: I1202 10:25:31.442170 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5d776c757d-qgkmw_f86e03c1-1ef5-4d0a-9875-7bd123ebe7a1/barbican-keystone-listener/0.log" Dec 02 10:25:31 crc kubenswrapper[4781]: I1202 10:25:31.500695 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5d776c757d-qgkmw_f86e03c1-1ef5-4d0a-9875-7bd123ebe7a1/barbican-keystone-listener-log/0.log" Dec 02 10:25:31 crc kubenswrapper[4781]: I1202 10:25:31.653009 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-65bbfbf867-4wxpf_bac97a41-a2f4-46ee-b48c-216aeee03abc/barbican-worker/0.log" Dec 02 10:25:31 crc kubenswrapper[4781]: I1202 10:25:31.654501 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-65bbfbf867-4wxpf_bac97a41-a2f4-46ee-b48c-216aeee03abc/barbican-worker-log/0.log" Dec 02 10:25:31 crc kubenswrapper[4781]: I1202 10:25:31.862601 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_bbf6f862-1ee5-4bc3-83e5-71c1f72d526c/ceilometer-central-agent/0.log" Dec 02 10:25:31 crc kubenswrapper[4781]: I1202 10:25:31.920635 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-k8ch4_ceabe2f6-ee27-456b-9031-9ebc39e032eb/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 10:25:31 crc kubenswrapper[4781]: I1202 10:25:31.955275 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_bbf6f862-1ee5-4bc3-83e5-71c1f72d526c/ceilometer-notification-agent/0.log" Dec 02 10:25:32 crc kubenswrapper[4781]: I1202 10:25:32.472069 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_bbf6f862-1ee5-4bc3-83e5-71c1f72d526c/proxy-httpd/0.log" Dec 02 10:25:32 crc kubenswrapper[4781]: I1202 10:25:32.538333 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_bbf6f862-1ee5-4bc3-83e5-71c1f72d526c/sg-core/0.log" Dec 02 10:25:32 crc kubenswrapper[4781]: I1202 10:25:32.573135 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_4e2d3890-4772-476e-9850-fdb32111b87a/cinder-api/0.log" Dec 02 10:25:32 crc kubenswrapper[4781]: I1202 
10:25:32.721730 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_4e2d3890-4772-476e-9850-fdb32111b87a/cinder-api-log/0.log" Dec 02 10:25:32 crc kubenswrapper[4781]: I1202 10:25:32.836568 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_8457fc90-04ad-45ed-b898-ddf4d7b645b4/cinder-scheduler/0.log" Dec 02 10:25:32 crc kubenswrapper[4781]: I1202 10:25:32.893616 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_8457fc90-04ad-45ed-b898-ddf4d7b645b4/probe/0.log" Dec 02 10:25:32 crc kubenswrapper[4781]: I1202 10:25:32.966049 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-95445_321f18fc-759a-4eb7-bbb0-f230b7002932/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 10:25:33 crc kubenswrapper[4781]: I1202 10:25:33.112455 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-p6f2z_0fcef48b-9cfd-4f26-9964-3a083b035119/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 10:25:33 crc kubenswrapper[4781]: I1202 10:25:33.222665 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-x5657_6d69f627-f361-4060-91c8-dc5763a00f16/init/0.log" Dec 02 10:25:33 crc kubenswrapper[4781]: I1202 10:25:33.400173 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-x5657_6d69f627-f361-4060-91c8-dc5763a00f16/init/0.log" Dec 02 10:25:33 crc kubenswrapper[4781]: I1202 10:25:33.425148 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-x5657_6d69f627-f361-4060-91c8-dc5763a00f16/dnsmasq-dns/0.log" Dec 02 10:25:33 crc kubenswrapper[4781]: I1202 10:25:33.469107 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-wbgtf_8804a3f3-f23a-4a85-8e45-9f92f90c5e9b/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 10:25:33 crc kubenswrapper[4781]: I1202 10:25:33.649925 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34/glance-httpd/0.log" Dec 02 10:25:33 crc kubenswrapper[4781]: I1202 10:25:33.714234 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34/glance-log/0.log" Dec 02 10:25:33 crc kubenswrapper[4781]: I1202 10:25:33.869340 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_5e057772-e9bc-4fce-90c9-be91978362fe/glance-log/0.log" Dec 02 10:25:33 crc kubenswrapper[4781]: I1202 10:25:33.884307 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_5e057772-e9bc-4fce-90c9-be91978362fe/glance-httpd/0.log" Dec 02 10:25:34 crc kubenswrapper[4781]: I1202 10:25:34.035572 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6856678494-4cprv_226317c7-a6f4-43c5-a3df-c9cb18b3afa5/horizon/0.log" Dec 02 10:25:34 crc kubenswrapper[4781]: I1202 10:25:34.236915 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j_82e18c11-6a85-45d3-8794-c7d7d02aaa2d/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 10:25:34 crc kubenswrapper[4781]: I1202 
10:25:34.322721 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6856678494-4cprv_226317c7-a6f4-43c5-a3df-c9cb18b3afa5/horizon-log/0.log" Dec 02 10:25:34 crc kubenswrapper[4781]: I1202 10:25:34.337061 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-2p8f8_ea0d0dc8-72c3-42de-92b5-a98ad0417f6d/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 10:25:34 crc kubenswrapper[4781]: I1202 10:25:34.528102 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29411161-pl6b8_da3b7144-110c-45af-a358-804809a89670/keystone-cron/0.log" Dec 02 10:25:34 crc kubenswrapper[4781]: I1202 10:25:34.581044 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cbfc4ddfb-kljg5_e2f1c0db-2cf8-4e49-b1cf-8cb27f997927/keystone-api/0.log" Dec 02 10:25:34 crc kubenswrapper[4781]: I1202 10:25:34.671613 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_2546f353-d520-44ea-8040-c41223665f1f/kube-state-metrics/0.log" Dec 02 10:25:34 crc kubenswrapper[4781]: I1202 10:25:34.801279 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-s8tx9_8db18e94-bcea-4e3a-8759-65fb8084cd43/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 10:25:35 crc kubenswrapper[4781]: I1202 10:25:35.065940 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-9cbdc4d89-pkh64_f074040a-1272-492e-b149-3a0a6cc89efd/neutron-httpd/0.log" Dec 02 10:25:35 crc kubenswrapper[4781]: I1202 10:25:35.127721 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-9cbdc4d89-pkh64_f074040a-1272-492e-b149-3a0a6cc89efd/neutron-api/0.log" Dec 02 10:25:35 crc kubenswrapper[4781]: I1202 10:25:35.131546 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-5ktkd_c2171979-7791-4850-a4cf-99ac7e62d054/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 10:25:35 crc kubenswrapper[4781]: I1202 10:25:35.699369 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_18fa078a-6d45-40cf-a39e-139d84f86f76/nova-api-log/0.log" Dec 02 10:25:35 crc kubenswrapper[4781]: I1202 10:25:35.701730 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_56b3a068-08e6-4567-b09b-4050ad8f1a65/nova-cell0-conductor-conductor/0.log" Dec 02 10:25:35 crc kubenswrapper[4781]: I1202 10:25:35.821853 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_18fa078a-6d45-40cf-a39e-139d84f86f76/nova-api-api/0.log" Dec 02 10:25:35 crc kubenswrapper[4781]: I1202 10:25:35.913299 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_cde74fe6-799b-4da8-974d-3fefd2af69aa/nova-cell1-conductor-conductor/0.log" Dec 02 10:25:35 crc kubenswrapper[4781]: I1202 10:25:35.948947 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_e9f84c1d-0c33-4c5a-8e53-a5cf635e4a68/nova-cell1-novncproxy-novncproxy/0.log" Dec 02 10:25:36 crc kubenswrapper[4781]: I1202 10:25:36.249265 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-g88rg_dacd1d88-ff6e-4719-a24f-4feb0559f463/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 10:25:36 crc kubenswrapper[4781]: I1202 
10:25:36.444678 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_b701c328-f693-4c11-96a7-5ff5b9bff2c1/nova-metadata-log/0.log" Dec 02 10:25:36 crc kubenswrapper[4781]: I1202 10:25:36.703844 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_05cec06e-b8e4-487d-b4f8-0691aaf1f997/mysql-bootstrap/0.log" Dec 02 10:25:36 crc kubenswrapper[4781]: I1202 10:25:36.776457 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_a4460f62-792a-4297-b92b-3fe1081bc006/nova-scheduler-scheduler/0.log" Dec 02 10:25:36 crc kubenswrapper[4781]: I1202 10:25:36.917423 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_05cec06e-b8e4-487d-b4f8-0691aaf1f997/mysql-bootstrap/0.log" Dec 02 10:25:36 crc kubenswrapper[4781]: I1202 10:25:36.973482 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_05cec06e-b8e4-487d-b4f8-0691aaf1f997/galera/0.log" Dec 02 10:25:37 crc kubenswrapper[4781]: I1202 10:25:37.130071 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_39428d69-6b86-41ea-8c4f-5532a5283a91/mysql-bootstrap/0.log" Dec 02 10:25:37 crc kubenswrapper[4781]: I1202 10:25:37.405164 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_39428d69-6b86-41ea-8c4f-5532a5283a91/mysql-bootstrap/0.log" Dec 02 10:25:37 crc kubenswrapper[4781]: I1202 10:25:37.551228 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_39428d69-6b86-41ea-8c4f-5532a5283a91/galera/0.log" Dec 02 10:25:37 crc kubenswrapper[4781]: I1202 10:25:37.623613 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_b701c328-f693-4c11-96a7-5ff5b9bff2c1/nova-metadata-metadata/0.log" Dec 02 10:25:37 crc kubenswrapper[4781]: I1202 10:25:37.647130 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_7a424309-12e4-42f9-ba35-d61f1f6c7b44/openstackclient/0.log" Dec 02 10:25:37 crc kubenswrapper[4781]: I1202 10:25:37.772091 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-7c2h7_7bc2040b-e9d5-4a1a-9d46-6b50dbc71061/openstack-network-exporter/0.log" Dec 02 10:25:37 crc kubenswrapper[4781]: I1202 10:25:37.911309 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-mmhrv_7cf924dd-8243-4263-85a2-68ac01fd5346/ovn-controller/0.log" Dec 02 10:25:38 crc kubenswrapper[4781]: I1202 10:25:38.018499 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rh5wv_77d1da39-d5b0-4ec8-8196-f4f1025291f8/ovsdb-server-init/0.log" Dec 02 10:25:38 crc kubenswrapper[4781]: I1202 10:25:38.183099 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rh5wv_77d1da39-d5b0-4ec8-8196-f4f1025291f8/ovs-vswitchd/0.log" Dec 02 10:25:38 crc kubenswrapper[4781]: I1202 10:25:38.228612 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rh5wv_77d1da39-d5b0-4ec8-8196-f4f1025291f8/ovsdb-server-init/0.log" Dec 02 10:25:38 crc kubenswrapper[4781]: I1202 10:25:38.304470 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rh5wv_77d1da39-d5b0-4ec8-8196-f4f1025291f8/ovsdb-server/0.log" Dec 02 10:25:38 crc kubenswrapper[4781]: I1202 10:25:38.407593 4781 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-nf4nh_809035c6-50b2-4492-898e-1f2917e62a5c/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 10:25:38 crc kubenswrapper[4781]: I1202 10:25:38.479684 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_1d0908af-30ab-4017-8911-b10c3742336e/ovn-northd/0.log" Dec 02 10:25:38 crc kubenswrapper[4781]: I1202 10:25:38.530045 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_1d0908af-30ab-4017-8911-b10c3742336e/openstack-network-exporter/0.log" Dec 02 10:25:38 crc kubenswrapper[4781]: I1202 10:25:38.684048 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_6dea0cb6-7707-46ba-bd47-89ce579fdad9/openstack-network-exporter/0.log" Dec 02 10:25:38 crc kubenswrapper[4781]: I1202 10:25:38.747611 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_6dea0cb6-7707-46ba-bd47-89ce579fdad9/ovsdbserver-nb/0.log" Dec 02 10:25:38 crc kubenswrapper[4781]: I1202 10:25:38.822394 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_8c135676-b0d9-469f-82b2-59483c9712f1/openstack-network-exporter/0.log" Dec 02 10:25:38 crc kubenswrapper[4781]: I1202 10:25:38.897733 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_8c135676-b0d9-469f-82b2-59483c9712f1/ovsdbserver-sb/0.log" Dec 02 10:25:39 crc kubenswrapper[4781]: I1202 10:25:39.090713 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-69459794d8-ph7dr_abd38c16-6eab-4f3e-9c4d-294b240fa154/placement-api/0.log" Dec 02 10:25:39 crc kubenswrapper[4781]: I1202 10:25:39.113761 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-69459794d8-ph7dr_abd38c16-6eab-4f3e-9c4d-294b240fa154/placement-log/0.log" Dec 02 10:25:39 crc kubenswrapper[4781]: I1202 10:25:39.185857 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6f7f1b54-7b32-4063-af85-f97785416d26/setup-container/0.log" Dec 02 10:25:39 crc kubenswrapper[4781]: I1202 10:25:39.391013 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6f7f1b54-7b32-4063-af85-f97785416d26/rabbitmq/0.log" Dec 02 10:25:39 crc kubenswrapper[4781]: I1202 10:25:39.394258 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6f7f1b54-7b32-4063-af85-f97785416d26/setup-container/0.log" Dec 02 10:25:39 crc kubenswrapper[4781]: I1202 10:25:39.419598 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4c72516d-29b4-4932-8a47-8838d686b176/setup-container/0.log" Dec 02 10:25:39 crc kubenswrapper[4781]: I1202 10:25:39.678512 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4c72516d-29b4-4932-8a47-8838d686b176/setup-container/0.log" Dec 02 10:25:39 crc kubenswrapper[4781]: I1202 10:25:39.706675 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-pkdhd_e76bbee2-a10a-45f8-9767-4018dfa3836e/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 10:25:39 crc kubenswrapper[4781]: I1202 10:25:39.835769 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4c72516d-29b4-4932-8a47-8838d686b176/rabbitmq/0.log" Dec 02 10:25:39 crc kubenswrapper[4781]: I1202 
10:25:39.900830 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-5lb5h_b84858d2-0cb7-4c83-ab7c-9ffc5aef41a6/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 10:25:40 crc kubenswrapper[4781]: I1202 10:25:40.061659 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-wbhzk_03ba130c-2463-4317-a907-e13c23657ae9/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 10:25:40 crc kubenswrapper[4781]: I1202 10:25:40.104945 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-gbx4b_7687840f-e133-4b18-b37c-74664863276b/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 10:25:40 crc kubenswrapper[4781]: I1202 10:25:40.321115 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-cxdx6_6fe7b21a-17d2-432b-9045-e64643581770/ssh-known-hosts-edpm-deployment/0.log" Dec 02 10:25:40 crc kubenswrapper[4781]: I1202 10:25:40.461627 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7c5b6578c5-9cvpk_8666ba67-095e-4634-8975-e54bd7a0f0cb/proxy-server/0.log" Dec 02 10:25:40 crc kubenswrapper[4781]: I1202 10:25:40.533857 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7c5b6578c5-9cvpk_8666ba67-095e-4634-8975-e54bd7a0f0cb/proxy-httpd/0.log" Dec 02 10:25:40 crc kubenswrapper[4781]: I1202 10:25:40.557848 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-p9svs_d15a32bb-043f-4019-8ddb-4bcc54b243a0/swift-ring-rebalance/0.log" Dec 02 10:25:40 crc kubenswrapper[4781]: I1202 10:25:40.712120 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d8cf953d-c4c9-457e-956c-d2942b56499b/account-reaper/0.log" Dec 02 10:25:40 crc kubenswrapper[4781]: I1202 10:25:40.715003 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d8cf953d-c4c9-457e-956c-d2942b56499b/account-auditor/0.log" Dec 02 10:25:40 crc kubenswrapper[4781]: I1202 10:25:40.828823 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d8cf953d-c4c9-457e-956c-d2942b56499b/account-replicator/0.log" Dec 02 10:25:40 crc kubenswrapper[4781]: I1202 10:25:40.872461 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d8cf953d-c4c9-457e-956c-d2942b56499b/account-server/0.log" Dec 02 10:25:40 crc kubenswrapper[4781]: I1202 10:25:40.945173 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d8cf953d-c4c9-457e-956c-d2942b56499b/container-auditor/0.log" Dec 02 10:25:40 crc kubenswrapper[4781]: I1202 10:25:40.968992 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d8cf953d-c4c9-457e-956c-d2942b56499b/container-replicator/0.log" Dec 02 10:25:41 crc kubenswrapper[4781]: I1202 10:25:41.051006 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d8cf953d-c4c9-457e-956c-d2942b56499b/container-server/0.log" Dec 02 10:25:41 crc kubenswrapper[4781]: I1202 10:25:41.054846 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d8cf953d-c4c9-457e-956c-d2942b56499b/container-updater/0.log" Dec 02 10:25:41 crc kubenswrapper[4781]: I1202 10:25:41.155883 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_d8cf953d-c4c9-457e-956c-d2942b56499b/object-auditor/0.log" Dec 02 10:25:41 crc kubenswrapper[4781]: I1202 10:25:41.211359 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d8cf953d-c4c9-457e-956c-d2942b56499b/object-expirer/0.log" Dec 02 10:25:41 crc kubenswrapper[4781]: I1202 10:25:41.278213 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d8cf953d-c4c9-457e-956c-d2942b56499b/object-replicator/0.log" Dec 02 10:25:41 crc kubenswrapper[4781]: I1202 10:25:41.280081 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d8cf953d-c4c9-457e-956c-d2942b56499b/object-server/0.log" Dec 02 10:25:41 crc kubenswrapper[4781]: I1202 10:25:41.334869 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d8cf953d-c4c9-457e-956c-d2942b56499b/object-updater/0.log" Dec 02 10:25:41 crc kubenswrapper[4781]: I1202 10:25:41.471415 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d8cf953d-c4c9-457e-956c-d2942b56499b/rsync/0.log" Dec 02 10:25:41 crc kubenswrapper[4781]: I1202 10:25:41.486886 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d8cf953d-c4c9-457e-956c-d2942b56499b/swift-recon-cron/0.log" Dec 02 10:25:41 crc kubenswrapper[4781]: I1202 10:25:41.566539 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n_82d7f2c1-dd68-42f7-be56-30bc507b2bf5/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 10:25:41 crc kubenswrapper[4781]: I1202 10:25:41.753340 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_ac589c1e-71f7-423f-b99b-3ebf175b40f3/tempest-tests-tempest-tests-runner/0.log" Dec 02 10:25:41 crc kubenswrapper[4781]: I1202 10:25:41.781164 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_45f4fae7-b5a7-4c3c-845e-4f2daed0a787/test-operator-logs-container/0.log" Dec 02 10:25:41 crc kubenswrapper[4781]: I1202 10:25:41.941960 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-gwpv6_1a663620-4120-46ee-9676-3eaac5534b99/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 10:25:43 crc kubenswrapper[4781]: I1202 10:25:43.502725 4781 scope.go:117] "RemoveContainer" containerID="3108739d92f37bf346834eaffd456a3eb9cfc18107a661e848468628c375b388" Dec 02 10:25:43 crc kubenswrapper[4781]: E1202 10:25:43.503455 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:25:52 crc kubenswrapper[4781]: I1202 10:25:52.896604 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_2fd68438-412a-4745-9e59-f4c9374f2444/memcached/0.log" Dec 02 10:25:57 crc kubenswrapper[4781]: I1202 10:25:57.510494 4781 scope.go:117] "RemoveContainer" containerID="3108739d92f37bf346834eaffd456a3eb9cfc18107a661e848468628c375b388" Dec 02 10:25:57 crc kubenswrapper[4781]: E1202 
10:25:57.511298 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:26:06 crc kubenswrapper[4781]: I1202 10:26:06.636475 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-8tkh8_f72fc870-291d-4800-a316-22de56b2ebbd/kube-rbac-proxy/0.log" Dec 02 10:26:06 crc kubenswrapper[4781]: I1202 10:26:06.714667 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-8tkh8_f72fc870-291d-4800-a316-22de56b2ebbd/manager/0.log" Dec 02 10:26:06 crc kubenswrapper[4781]: I1202 10:26:06.857783 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-jc6cp_812edfc0-b0b7-40c7-913d-b176bd6817f3/kube-rbac-proxy/0.log" Dec 02 10:26:06 crc kubenswrapper[4781]: I1202 10:26:06.948007 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-jc6cp_812edfc0-b0b7-40c7-913d-b176bd6817f3/manager/0.log" Dec 02 10:26:06 crc kubenswrapper[4781]: I1202 10:26:06.963082 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-7q6gr_eb3d207d-118c-42b8-9e9a-103a041a44b3/kube-rbac-proxy/0.log" Dec 02 10:26:07 crc kubenswrapper[4781]: I1202 10:26:07.074814 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-7q6gr_eb3d207d-118c-42b8-9e9a-103a041a44b3/manager/0.log" Dec 02 10:26:07 crc kubenswrapper[4781]: I1202 10:26:07.164701 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fdef1de09061a943cf0eb5556b972c2f56b8ba493ca7b87b57c4549318f7mcl_5da922a2-736d-4e49-b3f3-68adcbfc8d0b/util/0.log" Dec 02 10:26:07 crc kubenswrapper[4781]: I1202 10:26:07.347342 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fdef1de09061a943cf0eb5556b972c2f56b8ba493ca7b87b57c4549318f7mcl_5da922a2-736d-4e49-b3f3-68adcbfc8d0b/util/0.log" Dec 02 10:26:07 crc kubenswrapper[4781]: I1202 10:26:07.348069 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fdef1de09061a943cf0eb5556b972c2f56b8ba493ca7b87b57c4549318f7mcl_5da922a2-736d-4e49-b3f3-68adcbfc8d0b/pull/0.log" Dec 02 10:26:07 crc kubenswrapper[4781]: I1202 10:26:07.364341 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fdef1de09061a943cf0eb5556b972c2f56b8ba493ca7b87b57c4549318f7mcl_5da922a2-736d-4e49-b3f3-68adcbfc8d0b/pull/0.log" Dec 02 10:26:07 crc kubenswrapper[4781]: I1202 10:26:07.540731 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fdef1de09061a943cf0eb5556b972c2f56b8ba493ca7b87b57c4549318f7mcl_5da922a2-736d-4e49-b3f3-68adcbfc8d0b/util/0.log" Dec 02 10:26:07 crc kubenswrapper[4781]: I1202 10:26:07.549112 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fdef1de09061a943cf0eb5556b972c2f56b8ba493ca7b87b57c4549318f7mcl_5da922a2-736d-4e49-b3f3-68adcbfc8d0b/pull/0.log" Dec 02 
10:26:07 crc kubenswrapper[4781]: I1202 10:26:07.565431 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fdef1de09061a943cf0eb5556b972c2f56b8ba493ca7b87b57c4549318f7mcl_5da922a2-736d-4e49-b3f3-68adcbfc8d0b/extract/0.log" Dec 02 10:26:08 crc kubenswrapper[4781]: I1202 10:26:08.128963 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-dz7tw_de57f174-daf9-483d-bac6-e735d25f9d64/kube-rbac-proxy/0.log" Dec 02 10:26:08 crc kubenswrapper[4781]: I1202 10:26:08.270384 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-wcvq5_0a36fb64-e101-44af-a6f9-91fb68fc1e7a/kube-rbac-proxy/0.log" Dec 02 10:26:08 crc kubenswrapper[4781]: I1202 10:26:08.281931 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-dz7tw_de57f174-daf9-483d-bac6-e735d25f9d64/manager/0.log" Dec 02 10:26:08 crc kubenswrapper[4781]: I1202 10:26:08.327724 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-wcvq5_0a36fb64-e101-44af-a6f9-91fb68fc1e7a/manager/0.log" Dec 02 10:26:08 crc kubenswrapper[4781]: I1202 10:26:08.467681 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-gplp6_15c8fb6a-82b4-4e87-ae9c-6efc2afc5c89/kube-rbac-proxy/0.log" Dec 02 10:26:08 crc kubenswrapper[4781]: I1202 10:26:08.522587 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-gplp6_15c8fb6a-82b4-4e87-ae9c-6efc2afc5c89/manager/0.log" Dec 02 10:26:08 crc kubenswrapper[4781]: I1202 10:26:08.637989 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-lwgs7_cf3e832e-6140-4880-9efd-017837fc9990/kube-rbac-proxy/0.log" Dec 02 10:26:08 crc kubenswrapper[4781]: I1202 10:26:08.767892 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-qpdqt_428e69aa-23f5-4d45-8c18-65ac62c6756c/kube-rbac-proxy/0.log" Dec 02 10:26:08 crc kubenswrapper[4781]: I1202 10:26:08.850521 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-lwgs7_cf3e832e-6140-4880-9efd-017837fc9990/manager/0.log" Dec 02 10:26:08 crc kubenswrapper[4781]: I1202 10:26:08.876478 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-qpdqt_428e69aa-23f5-4d45-8c18-65ac62c6756c/manager/0.log" Dec 02 10:26:08 crc kubenswrapper[4781]: I1202 10:26:08.987063 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-8mm6f_9d08e6b1-b9b9-4a7e-a859-e98f904e2588/kube-rbac-proxy/0.log" Dec 02 10:26:09 crc kubenswrapper[4781]: I1202 10:26:09.109296 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-8mm6f_9d08e6b1-b9b9-4a7e-a859-e98f904e2588/manager/0.log" Dec 02 10:26:09 crc kubenswrapper[4781]: I1202 10:26:09.194441 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-5lhcx_f1be99ff-4068-4454-b75d-770951e9fedd/kube-rbac-proxy/0.log" Dec 02 10:26:09 crc kubenswrapper[4781]: I1202 10:26:09.227868 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-5lhcx_f1be99ff-4068-4454-b75d-770951e9fedd/manager/0.log" Dec 02 10:26:09 crc kubenswrapper[4781]: I1202 10:26:09.344967 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-8xclz_c917a8ec-2bd5-4f7b-8948-a4bed859e01f/kube-rbac-proxy/0.log" Dec 02 10:26:09 crc kubenswrapper[4781]: I1202 10:26:09.488460 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-8xclz_c917a8ec-2bd5-4f7b-8948-a4bed859e01f/manager/0.log" Dec 02 10:26:09 crc kubenswrapper[4781]: I1202 10:26:09.496222 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-rf4bw_7b8c261d-133c-4a73-9424-3233e6701fff/kube-rbac-proxy/0.log" Dec 02 10:26:09 crc kubenswrapper[4781]: I1202 10:26:09.581062 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-rf4bw_7b8c261d-133c-4a73-9424-3233e6701fff/manager/0.log" Dec 02 10:26:09 crc kubenswrapper[4781]: I1202 10:26:09.743030 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-chbnm_71cfd08b-278c-4f9c-b0fd-198c662ef00d/manager/0.log" Dec 02 10:26:09 crc kubenswrapper[4781]: I1202 10:26:09.782880 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-chbnm_71cfd08b-278c-4f9c-b0fd-198c662ef00d/kube-rbac-proxy/0.log" Dec 02 10:26:09 crc kubenswrapper[4781]: I1202 10:26:09.791967 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-pkkst_bbcf3ac8-e087-4a1c-b9f8-2263e15a73ff/kube-rbac-proxy/0.log" Dec 02 10:26:09 crc kubenswrapper[4781]: I1202 10:26:09.847470 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-pkkst_bbcf3ac8-e087-4a1c-b9f8-2263e15a73ff/manager/0.log" Dec 02 10:26:09 crc kubenswrapper[4781]: I1202 10:26:09.982371 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4rgl7t_cda1cc86-51ab-4070-96e9-98adba5d51c3/manager/0.log" Dec 02 10:26:09 crc kubenswrapper[4781]: I1202 10:26:09.984477 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4rgl7t_cda1cc86-51ab-4070-96e9-98adba5d51c3/kube-rbac-proxy/0.log" Dec 02 10:26:10 crc kubenswrapper[4781]: I1202 10:26:10.293617 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-xg78j_0ac791f1-2459-4266-a082-498b66e549b4/registry-server/0.log" Dec 02 10:26:10 crc kubenswrapper[4781]: I1202 10:26:10.345243 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-65bb796c9b-zrllk_09f22b82-ec27-4398-b843-8be7661ed03a/operator/0.log" Dec 02 10:26:10 crc kubenswrapper[4781]: I1202 10:26:10.614982 4781 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-fq4g5_15aa56a1-c9d3-4e48-a0fe-19e593320728/kube-rbac-proxy/0.log" Dec 02 10:26:10 crc kubenswrapper[4781]: I1202 10:26:10.652273 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-fq4g5_15aa56a1-c9d3-4e48-a0fe-19e593320728/manager/0.log" Dec 02 10:26:10 crc kubenswrapper[4781]: I1202 10:26:10.791251 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-8rtwl_ca4903e5-bed6-47c2-82a5-4376b162ec96/kube-rbac-proxy/0.log" Dec 02 10:26:10 crc kubenswrapper[4781]: I1202 10:26:10.901300 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-8rtwl_ca4903e5-bed6-47c2-82a5-4376b162ec96/manager/0.log" Dec 02 10:26:10 crc kubenswrapper[4781]: I1202 10:26:10.922084 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-7gsfd_8c93d2b8-28d1-4ea7-86aa-fe3f8ac35e61/operator/0.log" Dec 02 10:26:11 crc kubenswrapper[4781]: I1202 10:26:11.047415 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-68mk9_22dc2882-6dfd-4f1c-90ee-06ac4c9e0aa5/kube-rbac-proxy/0.log" Dec 02 10:26:11 crc kubenswrapper[4781]: I1202 10:26:11.123672 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-68mk9_22dc2882-6dfd-4f1c-90ee-06ac4c9e0aa5/manager/0.log" Dec 02 10:26:11 crc kubenswrapper[4781]: I1202 10:26:11.173427 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-686764c46-54r7r_090bc2d1-e1c5-4721-80ab-e20d4f3942c6/manager/0.log" Dec 02 10:26:11 crc kubenswrapper[4781]: I1202 10:26:11.226796 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-xtks6_cfd47a1e-a773-4479-9656-abb353f87fe9/kube-rbac-proxy/0.log" Dec 02 10:26:11 crc kubenswrapper[4781]: I1202 10:26:11.330934 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-xtks6_cfd47a1e-a773-4479-9656-abb353f87fe9/manager/0.log" Dec 02 10:26:11 crc kubenswrapper[4781]: I1202 10:26:11.383183 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-5d9tj_1c5a954b-d61b-4d33-a043-407f8de059a6/kube-rbac-proxy/0.log" Dec 02 10:26:11 crc kubenswrapper[4781]: I1202 10:26:11.403784 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-5d9tj_1c5a954b-d61b-4d33-a043-407f8de059a6/manager/0.log" Dec 02 10:26:11 crc kubenswrapper[4781]: I1202 10:26:11.499703 4781 scope.go:117] "RemoveContainer" containerID="3108739d92f37bf346834eaffd456a3eb9cfc18107a661e848468628c375b388" Dec 02 10:26:11 crc kubenswrapper[4781]: E1202 10:26:11.499975 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:26:11 crc kubenswrapper[4781]: I1202 10:26:11.501655 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-4dtmh_7f59a11b-4502-4c7b-94e7-3fbb6bac2222/kube-rbac-proxy/0.log" Dec 02 10:26:11 crc kubenswrapper[4781]: I1202 10:26:11.547226 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-4dtmh_7f59a11b-4502-4c7b-94e7-3fbb6bac2222/manager/0.log" Dec 02 10:26:23 crc kubenswrapper[4781]: I1202 10:26:23.500006 4781 scope.go:117] "RemoveContainer" containerID="3108739d92f37bf346834eaffd456a3eb9cfc18107a661e848468628c375b388" Dec 02 10:26:23 crc kubenswrapper[4781]: E1202 10:26:23.500751 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:26:30 crc kubenswrapper[4781]: I1202 10:26:30.969494 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-nm2f4_50d4d4c8-66e1-4d10-85cb-0fee6079d5fe/control-plane-machine-set-operator/0.log" Dec 02 10:26:31 crc kubenswrapper[4781]: I1202 10:26:31.093831 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-n9cmc_d9c52f13-f9c6-419e-8f69-ee91e29f4629/kube-rbac-proxy/0.log" Dec 02 10:26:31 crc kubenswrapper[4781]: I1202 10:26:31.126073 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-n9cmc_d9c52f13-f9c6-419e-8f69-ee91e29f4629/machine-api-operator/0.log" Dec 02 10:26:38 crc kubenswrapper[4781]: I1202 10:26:38.500001 4781 scope.go:117] "RemoveContainer" containerID="3108739d92f37bf346834eaffd456a3eb9cfc18107a661e848468628c375b388" Dec 02 10:26:38 crc kubenswrapper[4781]: E1202 10:26:38.500718 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:26:44 crc kubenswrapper[4781]: I1202 10:26:44.941437 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-nhlj8_e57350a9-3e11-4df0-a108-a9d8d446c219/cert-manager-controller/0.log" Dec 02 10:26:45 crc kubenswrapper[4781]: I1202 10:26:45.103578 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-5jxwz_bbeb0421-437b-4e00-a072-ebebf3354bea/cert-manager-cainjector/0.log" Dec 02 10:26:45 crc kubenswrapper[4781]: I1202 10:26:45.206507 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-zl2kl_753e9900-fde9-4486-9bc3-fce98f302367/cert-manager-webhook/0.log" Dec 02 10:26:51 crc kubenswrapper[4781]: I1202 10:26:51.500315 4781 
scope.go:117] "RemoveContainer" containerID="3108739d92f37bf346834eaffd456a3eb9cfc18107a661e848468628c375b388" Dec 02 10:26:51 crc kubenswrapper[4781]: E1202 10:26:51.501559 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:26:59 crc kubenswrapper[4781]: I1202 10:26:59.824497 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-xqj9x_9f057938-ae0d-4ce5-a920-b34905dcab9a/nmstate-console-plugin/0.log" Dec 02 10:27:00 crc kubenswrapper[4781]: I1202 10:27:00.070696 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-h2jq9_acd034d2-af9c-49ff-a584-f7f0ef482c10/nmstate-handler/0.log" Dec 02 10:27:00 crc kubenswrapper[4781]: I1202 10:27:00.116356 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-b4bs8_71f173d9-47d0-4576-991f-0eeffb003596/kube-rbac-proxy/0.log" Dec 02 10:27:00 crc kubenswrapper[4781]: I1202 10:27:00.192320 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-b4bs8_71f173d9-47d0-4576-991f-0eeffb003596/nmstate-metrics/0.log" Dec 02 10:27:00 crc kubenswrapper[4781]: I1202 10:27:00.344746 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-99vk9_11d94cf1-4ea7-43cb-b7e5-6fc0be34760f/nmstate-operator/0.log" Dec 02 10:27:00 crc kubenswrapper[4781]: I1202 10:27:00.433424 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-dq5sv_de15e7ea-d9ce-4713-bb63-87db8b3c5afd/nmstate-webhook/0.log" Dec 02 10:27:06 crc kubenswrapper[4781]: I1202 10:27:06.500089 4781 scope.go:117] "RemoveContainer" containerID="3108739d92f37bf346834eaffd456a3eb9cfc18107a661e848468628c375b388" Dec 02 10:27:06 crc kubenswrapper[4781]: E1202 10:27:06.500969 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:27:16 crc kubenswrapper[4781]: I1202 10:27:16.659724 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-8fm55_b70a26b7-43cb-4e26-95c0-f67ef15a0c34/kube-rbac-proxy/0.log" Dec 02 10:27:16 crc kubenswrapper[4781]: I1202 10:27:16.797207 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-8fm55_b70a26b7-43cb-4e26-95c0-f67ef15a0c34/controller/0.log" Dec 02 10:27:16 crc kubenswrapper[4781]: I1202 10:27:16.917068 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p2rfl_920e48e1-7f21-462b-82da-70c9a6e589ba/cp-frr-files/0.log" Dec 02 10:27:17 crc kubenswrapper[4781]: I1202 10:27:17.337987 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-p2rfl_920e48e1-7f21-462b-82da-70c9a6e589ba/cp-reloader/0.log" Dec 02 10:27:17 crc kubenswrapper[4781]: I1202 10:27:17.355231 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p2rfl_920e48e1-7f21-462b-82da-70c9a6e589ba/cp-metrics/0.log" Dec 02 10:27:17 crc kubenswrapper[4781]: I1202 10:27:17.371387 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p2rfl_920e48e1-7f21-462b-82da-70c9a6e589ba/cp-frr-files/0.log" Dec 02 10:27:17 crc kubenswrapper[4781]: I1202 10:27:17.536135 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p2rfl_920e48e1-7f21-462b-82da-70c9a6e589ba/cp-reloader/0.log" Dec 02 10:27:17 crc kubenswrapper[4781]: I1202 10:27:17.667369 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p2rfl_920e48e1-7f21-462b-82da-70c9a6e589ba/cp-frr-files/0.log" Dec 02 10:27:17 crc kubenswrapper[4781]: I1202 10:27:17.747567 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p2rfl_920e48e1-7f21-462b-82da-70c9a6e589ba/cp-metrics/0.log" Dec 02 10:27:17 crc kubenswrapper[4781]: I1202 10:27:17.765945 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p2rfl_920e48e1-7f21-462b-82da-70c9a6e589ba/cp-metrics/0.log" Dec 02 10:27:17 crc kubenswrapper[4781]: I1202 10:27:17.771691 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p2rfl_920e48e1-7f21-462b-82da-70c9a6e589ba/cp-reloader/0.log" Dec 02 10:27:17 crc kubenswrapper[4781]: I1202 10:27:17.942470 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p2rfl_920e48e1-7f21-462b-82da-70c9a6e589ba/controller/0.log" Dec 02 10:27:17 crc kubenswrapper[4781]: I1202 10:27:17.948856 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p2rfl_920e48e1-7f21-462b-82da-70c9a6e589ba/cp-reloader/0.log" Dec 02 10:27:17 crc kubenswrapper[4781]: I1202 10:27:17.966458 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p2rfl_920e48e1-7f21-462b-82da-70c9a6e589ba/cp-metrics/0.log" Dec 02 10:27:17 crc kubenswrapper[4781]: I1202 10:27:17.985581 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p2rfl_920e48e1-7f21-462b-82da-70c9a6e589ba/cp-frr-files/0.log" Dec 02 10:27:18 crc kubenswrapper[4781]: I1202 10:27:18.151498 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p2rfl_920e48e1-7f21-462b-82da-70c9a6e589ba/kube-rbac-proxy/0.log" Dec 02 10:27:18 crc kubenswrapper[4781]: I1202 10:27:18.182457 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p2rfl_920e48e1-7f21-462b-82da-70c9a6e589ba/frr-metrics/0.log" Dec 02 10:27:18 crc kubenswrapper[4781]: I1202 10:27:18.215056 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p2rfl_920e48e1-7f21-462b-82da-70c9a6e589ba/kube-rbac-proxy-frr/0.log" Dec 02 10:27:18 crc kubenswrapper[4781]: I1202 10:27:18.405464 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p2rfl_920e48e1-7f21-462b-82da-70c9a6e589ba/reloader/0.log" Dec 02 10:27:18 crc kubenswrapper[4781]: I1202 10:27:18.423914 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-mfpkp_e4edb853-0703-424a-9701-bd01ffa5631c/frr-k8s-webhook-server/0.log" Dec 02 
10:27:18 crc kubenswrapper[4781]: I1202 10:27:18.667533 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-75f8895565-ttkrd_60dee968-291d-4e9d-b2a5-40b67457b003/manager/0.log" Dec 02 10:27:18 crc kubenswrapper[4781]: I1202 10:27:18.879979 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-w7lhj_9033e241-ad62-4fcc-92f1-8499f42f6310/kube-rbac-proxy/0.log" Dec 02 10:27:18 crc kubenswrapper[4781]: I1202 10:27:18.919966 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6f95b97b7b-7p5dk_804d6dfe-f063-4edb-b276-9a386bae049a/webhook-server/0.log" Dec 02 10:27:19 crc kubenswrapper[4781]: I1202 10:27:19.499242 4781 scope.go:117] "RemoveContainer" containerID="3108739d92f37bf346834eaffd456a3eb9cfc18107a661e848468628c375b388" Dec 02 10:27:19 crc kubenswrapper[4781]: E1202 10:27:19.499752 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:27:19 crc kubenswrapper[4781]: I1202 10:27:19.541907 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-w7lhj_9033e241-ad62-4fcc-92f1-8499f42f6310/speaker/0.log" Dec 02 10:27:19 crc kubenswrapper[4781]: I1202 10:27:19.569008 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p2rfl_920e48e1-7f21-462b-82da-70c9a6e589ba/frr/0.log" Dec 02 10:27:31 crc kubenswrapper[4781]: I1202 10:27:31.962727 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fl4k6c_8add967d-359e-4b2f-8181-27a4e32cd3d1/util/0.log" Dec 02 10:27:32 crc kubenswrapper[4781]: I1202 10:27:32.160216 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fl4k6c_8add967d-359e-4b2f-8181-27a4e32cd3d1/util/0.log" Dec 02 10:27:32 crc kubenswrapper[4781]: I1202 10:27:32.332044 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fl4k6c_8add967d-359e-4b2f-8181-27a4e32cd3d1/pull/0.log" Dec 02 10:27:32 crc kubenswrapper[4781]: I1202 10:27:32.333134 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fl4k6c_8add967d-359e-4b2f-8181-27a4e32cd3d1/pull/0.log" Dec 02 10:27:32 crc kubenswrapper[4781]: I1202 10:27:32.358525 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fl4k6c_8add967d-359e-4b2f-8181-27a4e32cd3d1/pull/0.log" Dec 02 10:27:32 crc kubenswrapper[4781]: I1202 10:27:32.486652 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fl4k6c_8add967d-359e-4b2f-8181-27a4e32cd3d1/util/0.log" Dec 02 10:27:32 crc kubenswrapper[4781]: I1202 10:27:32.499992 4781 scope.go:117] "RemoveContainer" containerID="3108739d92f37bf346834eaffd456a3eb9cfc18107a661e848468628c375b388" Dec 02 
10:27:32 crc kubenswrapper[4781]: I1202 10:27:32.542144 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fl4k6c_8add967d-359e-4b2f-8181-27a4e32cd3d1/extract/0.log" Dec 02 10:27:32 crc kubenswrapper[4781]: I1202 10:27:32.575039 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836fnxt_a30d0113-e2bc-4a14-b7b0-49363876c6b2/util/0.log" Dec 02 10:27:32 crc kubenswrapper[4781]: I1202 10:27:32.756390 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836fnxt_a30d0113-e2bc-4a14-b7b0-49363876c6b2/util/0.log" Dec 02 10:27:32 crc kubenswrapper[4781]: I1202 10:27:32.762137 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836fnxt_a30d0113-e2bc-4a14-b7b0-49363876c6b2/pull/0.log" Dec 02 10:27:32 crc kubenswrapper[4781]: I1202 10:27:32.777994 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836fnxt_a30d0113-e2bc-4a14-b7b0-49363876c6b2/pull/0.log" Dec 02 10:27:32 crc kubenswrapper[4781]: I1202 10:27:32.947559 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" event={"ID":"e10258da-dad3-4df8-82c2-9d9438493a3d","Type":"ContainerStarted","Data":"11b83b3689380a7a0a856185b9c3f2221828cd55a83586474c6d42332e35e9f8"} Dec 02 10:27:33 crc kubenswrapper[4781]: I1202 10:27:33.021168 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836fnxt_a30d0113-e2bc-4a14-b7b0-49363876c6b2/pull/0.log" Dec 02 10:27:33 crc kubenswrapper[4781]: I1202 10:27:33.030776 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836fnxt_a30d0113-e2bc-4a14-b7b0-49363876c6b2/util/0.log" Dec 02 10:27:33 crc kubenswrapper[4781]: I1202 10:27:33.044239 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836fnxt_a30d0113-e2bc-4a14-b7b0-49363876c6b2/extract/0.log" Dec 02 10:27:33 crc kubenswrapper[4781]: I1202 10:27:33.188384 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dhncm_8267358b-15b0-44fd-bd0a-73438bcd7ded/extract-utilities/0.log" Dec 02 10:27:33 crc kubenswrapper[4781]: I1202 10:27:33.372022 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dhncm_8267358b-15b0-44fd-bd0a-73438bcd7ded/extract-content/0.log" Dec 02 10:27:33 crc kubenswrapper[4781]: I1202 10:27:33.407454 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dhncm_8267358b-15b0-44fd-bd0a-73438bcd7ded/extract-utilities/0.log" Dec 02 10:27:33 crc kubenswrapper[4781]: I1202 10:27:33.437397 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dhncm_8267358b-15b0-44fd-bd0a-73438bcd7ded/extract-content/0.log" Dec 02 10:27:33 crc kubenswrapper[4781]: I1202 10:27:33.619794 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-dhncm_8267358b-15b0-44fd-bd0a-73438bcd7ded/extract-utilities/0.log" Dec 02 10:27:33 crc kubenswrapper[4781]: I1202 10:27:33.728080 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dhncm_8267358b-15b0-44fd-bd0a-73438bcd7ded/extract-content/0.log" Dec 02 10:27:33 crc kubenswrapper[4781]: I1202 10:27:33.761664 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dhncm_8267358b-15b0-44fd-bd0a-73438bcd7ded/registry-server/0.log" Dec 02 10:27:33 crc kubenswrapper[4781]: I1202 10:27:33.821306 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rv6xb_10e7b109-cde2-4b4f-8f1c-2d7940a49e1c/extract-utilities/0.log" Dec 02 10:27:34 crc kubenswrapper[4781]: I1202 10:27:34.042916 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rv6xb_10e7b109-cde2-4b4f-8f1c-2d7940a49e1c/extract-utilities/0.log" Dec 02 10:27:34 crc kubenswrapper[4781]: I1202 10:27:34.049629 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rv6xb_10e7b109-cde2-4b4f-8f1c-2d7940a49e1c/extract-content/0.log" Dec 02 10:27:34 crc kubenswrapper[4781]: I1202 10:27:34.053597 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rv6xb_10e7b109-cde2-4b4f-8f1c-2d7940a49e1c/extract-content/0.log" Dec 02 10:27:34 crc kubenswrapper[4781]: I1202 10:27:34.188004 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rv6xb_10e7b109-cde2-4b4f-8f1c-2d7940a49e1c/extract-utilities/0.log" Dec 02 10:27:34 crc kubenswrapper[4781]: I1202 10:27:34.216257 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rv6xb_10e7b109-cde2-4b4f-8f1c-2d7940a49e1c/extract-content/0.log" Dec 02 10:27:34 crc kubenswrapper[4781]: I1202 10:27:34.426968 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-lx2rp_060db2b9-0086-4429-8ade-2156f94455f4/marketplace-operator/0.log" Dec 02 10:27:34 crc kubenswrapper[4781]: I1202 10:27:34.601942 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rv6xb_10e7b109-cde2-4b4f-8f1c-2d7940a49e1c/registry-server/0.log" Dec 02 10:27:35 crc kubenswrapper[4781]: I1202 10:27:35.145131 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6s4mn_1996ded9-7aca-4ca6-b787-1c688c678893/extract-utilities/0.log" Dec 02 10:27:35 crc kubenswrapper[4781]: I1202 10:27:35.307524 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6s4mn_1996ded9-7aca-4ca6-b787-1c688c678893/extract-utilities/0.log" Dec 02 10:27:35 crc kubenswrapper[4781]: I1202 10:27:35.310708 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6s4mn_1996ded9-7aca-4ca6-b787-1c688c678893/extract-content/0.log" Dec 02 10:27:35 crc kubenswrapper[4781]: I1202 10:27:35.340584 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6s4mn_1996ded9-7aca-4ca6-b787-1c688c678893/extract-content/0.log" Dec 02 10:27:35 crc kubenswrapper[4781]: I1202 10:27:35.486009 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-6s4mn_1996ded9-7aca-4ca6-b787-1c688c678893/extract-utilities/0.log" Dec 02 10:27:35 crc kubenswrapper[4781]: I1202 10:27:35.490622 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6s4mn_1996ded9-7aca-4ca6-b787-1c688c678893/extract-content/0.log" Dec 02 10:27:35 crc kubenswrapper[4781]: I1202 10:27:35.645064 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6s4mn_1996ded9-7aca-4ca6-b787-1c688c678893/registry-server/0.log" Dec 02 10:27:35 crc kubenswrapper[4781]: I1202 10:27:35.657885 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-w2wcs_5d8a5cfa-0f4a-4912-ad38-2ddfa06bbcf4/extract-utilities/0.log" Dec 02 10:27:35 crc kubenswrapper[4781]: I1202 10:27:35.885499 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-w2wcs_5d8a5cfa-0f4a-4912-ad38-2ddfa06bbcf4/extract-utilities/0.log" Dec 02 10:27:35 crc kubenswrapper[4781]: I1202 10:27:35.903436 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-w2wcs_5d8a5cfa-0f4a-4912-ad38-2ddfa06bbcf4/extract-content/0.log" Dec 02 10:27:35 crc kubenswrapper[4781]: I1202 10:27:35.942532 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-w2wcs_5d8a5cfa-0f4a-4912-ad38-2ddfa06bbcf4/extract-content/0.log" Dec 02 10:27:36 crc kubenswrapper[4781]: I1202 10:27:36.103830 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-w2wcs_5d8a5cfa-0f4a-4912-ad38-2ddfa06bbcf4/extract-utilities/0.log" Dec 02 10:27:36 crc kubenswrapper[4781]: I1202 10:27:36.142082 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-w2wcs_5d8a5cfa-0f4a-4912-ad38-2ddfa06bbcf4/extract-content/0.log" Dec 02 10:27:36 crc kubenswrapper[4781]: I1202 10:27:36.639753 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-w2wcs_5d8a5cfa-0f4a-4912-ad38-2ddfa06bbcf4/registry-server/0.log" Dec 02 10:29:16 crc kubenswrapper[4781]: I1202 10:29:16.988227 4781 generic.go:334] "Generic (PLEG): container finished" podID="f89edb07-26fd-490c-889b-8612f8d4ed68" containerID="29006a3908e026567d81359f3a6b4653f56a2b36a7ce76e7225ec1fda35bbcc8" exitCode=0 Dec 02 10:29:16 crc kubenswrapper[4781]: I1202 10:29:16.988419 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n5ldq/must-gather-m5gsv" event={"ID":"f89edb07-26fd-490c-889b-8612f8d4ed68","Type":"ContainerDied","Data":"29006a3908e026567d81359f3a6b4653f56a2b36a7ce76e7225ec1fda35bbcc8"} Dec 02 10:29:16 crc kubenswrapper[4781]: I1202 10:29:16.992038 4781 scope.go:117] "RemoveContainer" containerID="29006a3908e026567d81359f3a6b4653f56a2b36a7ce76e7225ec1fda35bbcc8" Dec 02 10:29:17 crc kubenswrapper[4781]: I1202 10:29:17.243228 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-n5ldq_must-gather-m5gsv_f89edb07-26fd-490c-889b-8612f8d4ed68/gather/0.log" Dec 02 10:29:25 crc kubenswrapper[4781]: I1202 10:29:25.533637 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-n5ldq/must-gather-m5gsv"] Dec 02 10:29:25 crc kubenswrapper[4781]: I1202 10:29:25.534435 4781 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-must-gather-n5ldq/must-gather-m5gsv" podUID="f89edb07-26fd-490c-889b-8612f8d4ed68" containerName="copy" containerID="cri-o://c1c90d5e0a938a9becaf86a86ed99d115a2d9867fe14ddcc6a6bbca5c2b98bc7" gracePeriod=2 Dec 02 10:29:25 crc kubenswrapper[4781]: I1202 10:29:25.540966 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-n5ldq/must-gather-m5gsv"] Dec 02 10:29:26 crc kubenswrapper[4781]: I1202 10:29:26.061055 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-n5ldq_must-gather-m5gsv_f89edb07-26fd-490c-889b-8612f8d4ed68/copy/0.log" Dec 02 10:29:26 crc kubenswrapper[4781]: I1202 10:29:26.061824 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n5ldq/must-gather-m5gsv" Dec 02 10:29:26 crc kubenswrapper[4781]: I1202 10:29:26.087648 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-n5ldq_must-gather-m5gsv_f89edb07-26fd-490c-889b-8612f8d4ed68/copy/0.log" Dec 02 10:29:26 crc kubenswrapper[4781]: I1202 10:29:26.088111 4781 generic.go:334] "Generic (PLEG): container finished" podID="f89edb07-26fd-490c-889b-8612f8d4ed68" containerID="c1c90d5e0a938a9becaf86a86ed99d115a2d9867fe14ddcc6a6bbca5c2b98bc7" exitCode=143 Dec 02 10:29:26 crc kubenswrapper[4781]: I1202 10:29:26.088160 4781 scope.go:117] "RemoveContainer" containerID="c1c90d5e0a938a9becaf86a86ed99d115a2d9867fe14ddcc6a6bbca5c2b98bc7" Dec 02 10:29:26 crc kubenswrapper[4781]: I1202 10:29:26.088183 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n5ldq/must-gather-m5gsv" Dec 02 10:29:26 crc kubenswrapper[4781]: I1202 10:29:26.114202 4781 scope.go:117] "RemoveContainer" containerID="29006a3908e026567d81359f3a6b4653f56a2b36a7ce76e7225ec1fda35bbcc8" Dec 02 10:29:26 crc kubenswrapper[4781]: I1202 10:29:26.180080 4781 scope.go:117] "RemoveContainer" containerID="c1c90d5e0a938a9becaf86a86ed99d115a2d9867fe14ddcc6a6bbca5c2b98bc7" Dec 02 10:29:26 crc kubenswrapper[4781]: E1202 10:29:26.180532 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1c90d5e0a938a9becaf86a86ed99d115a2d9867fe14ddcc6a6bbca5c2b98bc7\": container with ID starting with c1c90d5e0a938a9becaf86a86ed99d115a2d9867fe14ddcc6a6bbca5c2b98bc7 not found: ID does not exist" containerID="c1c90d5e0a938a9becaf86a86ed99d115a2d9867fe14ddcc6a6bbca5c2b98bc7" Dec 02 10:29:26 crc kubenswrapper[4781]: I1202 10:29:26.180560 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1c90d5e0a938a9becaf86a86ed99d115a2d9867fe14ddcc6a6bbca5c2b98bc7"} err="failed to get container status \"c1c90d5e0a938a9becaf86a86ed99d115a2d9867fe14ddcc6a6bbca5c2b98bc7\": rpc error: code = NotFound desc = could not find container \"c1c90d5e0a938a9becaf86a86ed99d115a2d9867fe14ddcc6a6bbca5c2b98bc7\": container with ID starting with c1c90d5e0a938a9becaf86a86ed99d115a2d9867fe14ddcc6a6bbca5c2b98bc7 not found: ID does not exist" Dec 02 10:29:26 crc kubenswrapper[4781]: I1202 10:29:26.180584 4781 scope.go:117] "RemoveContainer" containerID="29006a3908e026567d81359f3a6b4653f56a2b36a7ce76e7225ec1fda35bbcc8" Dec 02 10:29:26 crc kubenswrapper[4781]: E1202 10:29:26.180800 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29006a3908e026567d81359f3a6b4653f56a2b36a7ce76e7225ec1fda35bbcc8\": container with ID starting with 
Dec 02 10:29:26 crc kubenswrapper[4781]: E1202 10:29:26.180800 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29006a3908e026567d81359f3a6b4653f56a2b36a7ce76e7225ec1fda35bbcc8\": container with ID starting with 29006a3908e026567d81359f3a6b4653f56a2b36a7ce76e7225ec1fda35bbcc8 not found: ID does not exist" containerID="29006a3908e026567d81359f3a6b4653f56a2b36a7ce76e7225ec1fda35bbcc8"
Dec 02 10:29:26 crc kubenswrapper[4781]: I1202 10:29:26.180816 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29006a3908e026567d81359f3a6b4653f56a2b36a7ce76e7225ec1fda35bbcc8"} err="failed to get container status \"29006a3908e026567d81359f3a6b4653f56a2b36a7ce76e7225ec1fda35bbcc8\": rpc error: code = NotFound desc = could not find container \"29006a3908e026567d81359f3a6b4653f56a2b36a7ce76e7225ec1fda35bbcc8\": container with ID starting with 29006a3908e026567d81359f3a6b4653f56a2b36a7ce76e7225ec1fda35bbcc8 not found: ID does not exist"
Dec 02 10:29:26 crc kubenswrapper[4781]: I1202 10:29:26.246611 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f89edb07-26fd-490c-889b-8612f8d4ed68-must-gather-output\") pod \"f89edb07-26fd-490c-889b-8612f8d4ed68\" (UID: \"f89edb07-26fd-490c-889b-8612f8d4ed68\") "
Dec 02 10:29:26 crc kubenswrapper[4781]: I1202 10:29:26.246709 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bz5g9\" (UniqueName: \"kubernetes.io/projected/f89edb07-26fd-490c-889b-8612f8d4ed68-kube-api-access-bz5g9\") pod \"f89edb07-26fd-490c-889b-8612f8d4ed68\" (UID: \"f89edb07-26fd-490c-889b-8612f8d4ed68\") "
Dec 02 10:29:26 crc kubenswrapper[4781]: I1202 10:29:26.252824 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f89edb07-26fd-490c-889b-8612f8d4ed68-kube-api-access-bz5g9" (OuterVolumeSpecName: "kube-api-access-bz5g9") pod "f89edb07-26fd-490c-889b-8612f8d4ed68" (UID: "f89edb07-26fd-490c-889b-8612f8d4ed68"). InnerVolumeSpecName "kube-api-access-bz5g9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 10:29:26 crc kubenswrapper[4781]: I1202 10:29:26.349238 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bz5g9\" (UniqueName: \"kubernetes.io/projected/f89edb07-26fd-490c-889b-8612f8d4ed68-kube-api-access-bz5g9\") on node \"crc\" DevicePath \"\""
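The paired "ContainerStatus from runtime service failed" / "DeleteContainer returned error" records above are benign: the kubelet re-issues RemoveContainer for IDs the runtime has already deleted, and a gRPC NotFound simply means there is nothing left to do. A sketch of that idempotent handling (removeIfPresent and the statusFn callback are hypothetical stand-ins for the CRI client, not kubelet code):

package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// removeIfPresent asks the runtime for container status and treats a gRPC
// NotFound as "already gone", which is why the errors above are logged and
// then effectively ignored.
func removeIfPresent(statusFn func(id string) error, id string) error {
	if err := statusFn(id); err != nil {
		if s, ok := status.FromError(err); ok && s.Code() == codes.NotFound {
			return nil // container already removed; deletion is a no-op
		}
		return err // any other runtime error is real and propagates
	}
	// ...container still exists: issue the actual RemoveContainer call here...
	return nil
}

func main() {
	// Simulate the runtime answer seen in the log: the container is gone.
	statusFn := func(id string) error {
		return status.Errorf(codes.NotFound, "could not find container %q", id)
	}
	fmt.Println(removeIfPresent(statusFn, "c1c90d5e0a93")) // prints <nil>
}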
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 10:29:26 crc kubenswrapper[4781]: I1202 10:29:26.452072 4781 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f89edb07-26fd-490c-889b-8612f8d4ed68-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 02 10:29:27 crc kubenswrapper[4781]: I1202 10:29:27.517780 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f89edb07-26fd-490c-889b-8612f8d4ed68" path="/var/lib/kubelet/pods/f89edb07-26fd-490c-889b-8612f8d4ed68/volumes" Dec 02 10:29:36 crc kubenswrapper[4781]: I1202 10:29:36.738640 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ptckk"] Dec 02 10:29:36 crc kubenswrapper[4781]: E1202 10:29:36.739521 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbc75166-d9b3-4162-9511-b8df894bb552" containerName="container-00" Dec 02 10:29:36 crc kubenswrapper[4781]: I1202 10:29:36.739534 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbc75166-d9b3-4162-9511-b8df894bb552" containerName="container-00" Dec 02 10:29:36 crc kubenswrapper[4781]: E1202 10:29:36.739577 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f89edb07-26fd-490c-889b-8612f8d4ed68" containerName="gather" Dec 02 10:29:36 crc kubenswrapper[4781]: I1202 10:29:36.739583 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f89edb07-26fd-490c-889b-8612f8d4ed68" containerName="gather" Dec 02 10:29:36 crc kubenswrapper[4781]: E1202 10:29:36.739595 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f89edb07-26fd-490c-889b-8612f8d4ed68" containerName="copy" Dec 02 10:29:36 crc kubenswrapper[4781]: I1202 10:29:36.739601 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f89edb07-26fd-490c-889b-8612f8d4ed68" containerName="copy" Dec 02 10:29:36 crc kubenswrapper[4781]: I1202 10:29:36.739762 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f89edb07-26fd-490c-889b-8612f8d4ed68" containerName="copy" Dec 02 10:29:36 crc kubenswrapper[4781]: I1202 10:29:36.739780 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f89edb07-26fd-490c-889b-8612f8d4ed68" containerName="gather" Dec 02 10:29:36 crc kubenswrapper[4781]: I1202 10:29:36.739790 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbc75166-d9b3-4162-9511-b8df894bb552" containerName="container-00" Dec 02 10:29:36 crc kubenswrapper[4781]: I1202 10:29:36.741582 4781 util.go:30] "No sandbox for pod can be found. 
Dec 02 10:29:36 crc kubenswrapper[4781]: I1202 10:29:36.741582 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ptckk"
Dec 02 10:29:36 crc kubenswrapper[4781]: I1202 10:29:36.751231 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ptckk"]
Dec 02 10:29:36 crc kubenswrapper[4781]: I1202 10:29:36.875177 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fedf909-efba-48ff-af33-1c6e957ee3ca-utilities\") pod \"redhat-marketplace-ptckk\" (UID: \"9fedf909-efba-48ff-af33-1c6e957ee3ca\") " pod="openshift-marketplace/redhat-marketplace-ptckk"
Dec 02 10:29:36 crc kubenswrapper[4781]: I1202 10:29:36.875235 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fedf909-efba-48ff-af33-1c6e957ee3ca-catalog-content\") pod \"redhat-marketplace-ptckk\" (UID: \"9fedf909-efba-48ff-af33-1c6e957ee3ca\") " pod="openshift-marketplace/redhat-marketplace-ptckk"
Dec 02 10:29:36 crc kubenswrapper[4781]: I1202 10:29:36.875269 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4q2jz\" (UniqueName: \"kubernetes.io/projected/9fedf909-efba-48ff-af33-1c6e957ee3ca-kube-api-access-4q2jz\") pod \"redhat-marketplace-ptckk\" (UID: \"9fedf909-efba-48ff-af33-1c6e957ee3ca\") " pod="openshift-marketplace/redhat-marketplace-ptckk"
Dec 02 10:29:36 crc kubenswrapper[4781]: I1202 10:29:36.977264 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fedf909-efba-48ff-af33-1c6e957ee3ca-utilities\") pod \"redhat-marketplace-ptckk\" (UID: \"9fedf909-efba-48ff-af33-1c6e957ee3ca\") " pod="openshift-marketplace/redhat-marketplace-ptckk"
Dec 02 10:29:36 crc kubenswrapper[4781]: I1202 10:29:36.977317 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fedf909-efba-48ff-af33-1c6e957ee3ca-catalog-content\") pod \"redhat-marketplace-ptckk\" (UID: \"9fedf909-efba-48ff-af33-1c6e957ee3ca\") " pod="openshift-marketplace/redhat-marketplace-ptckk"
Dec 02 10:29:36 crc kubenswrapper[4781]: I1202 10:29:36.977348 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4q2jz\" (UniqueName: \"kubernetes.io/projected/9fedf909-efba-48ff-af33-1c6e957ee3ca-kube-api-access-4q2jz\") pod \"redhat-marketplace-ptckk\" (UID: \"9fedf909-efba-48ff-af33-1c6e957ee3ca\") " pod="openshift-marketplace/redhat-marketplace-ptckk"
Dec 02 10:29:36 crc kubenswrapper[4781]: I1202 10:29:36.977681 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fedf909-efba-48ff-af33-1c6e957ee3ca-utilities\") pod \"redhat-marketplace-ptckk\" (UID: \"9fedf909-efba-48ff-af33-1c6e957ee3ca\") " pod="openshift-marketplace/redhat-marketplace-ptckk"
Dec 02 10:29:36 crc kubenswrapper[4781]: I1202 10:29:36.978122 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fedf909-efba-48ff-af33-1c6e957ee3ca-catalog-content\") pod \"redhat-marketplace-ptckk\" (UID: \"9fedf909-efba-48ff-af33-1c6e957ee3ca\") " pod="openshift-marketplace/redhat-marketplace-ptckk"
Dec 02 10:29:37 crc kubenswrapper[4781]: I1202 10:29:36.999908 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4q2jz\" (UniqueName: \"kubernetes.io/projected/9fedf909-efba-48ff-af33-1c6e957ee3ca-kube-api-access-4q2jz\") pod \"redhat-marketplace-ptckk\" (UID: \"9fedf909-efba-48ff-af33-1c6e957ee3ca\") " pod="openshift-marketplace/redhat-marketplace-ptckk"
Dec 02 10:29:37 crc kubenswrapper[4781]: I1202 10:29:37.067060 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ptckk"
Dec 02 10:29:37 crc kubenswrapper[4781]: I1202 10:29:37.537220 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ptckk"]
Dec 02 10:29:38 crc kubenswrapper[4781]: I1202 10:29:38.200212 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ptckk" event={"ID":"9fedf909-efba-48ff-af33-1c6e957ee3ca","Type":"ContainerStarted","Data":"c47e09eb16efab3b06c92e76c2092429c5bdd8456a8a569bc4d4ba13303a7355"}
Dec 02 10:29:40 crc kubenswrapper[4781]: I1202 10:29:40.224052 4781 generic.go:334] "Generic (PLEG): container finished" podID="9fedf909-efba-48ff-af33-1c6e957ee3ca" containerID="eb4109d8ee84d53233be0ebcc451d6d38f9c6f59e1e1b1f4a2037b986555a5eb" exitCode=0
Dec 02 10:29:40 crc kubenswrapper[4781]: I1202 10:29:40.224175 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ptckk" event={"ID":"9fedf909-efba-48ff-af33-1c6e957ee3ca","Type":"ContainerDied","Data":"eb4109d8ee84d53233be0ebcc451d6d38f9c6f59e1e1b1f4a2037b986555a5eb"}
Dec 02 10:29:40 crc kubenswrapper[4781]: I1202 10:29:40.227244 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 02 10:29:42 crc kubenswrapper[4781]: I1202 10:29:42.248785 4781 generic.go:334] "Generic (PLEG): container finished" podID="9fedf909-efba-48ff-af33-1c6e957ee3ca" containerID="7f14e964a2667110ccdbc819d46816e31b844abf6f551cac6b8e96b17460ac0e" exitCode=0
Dec 02 10:29:42 crc kubenswrapper[4781]: I1202 10:29:42.249154 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ptckk" event={"ID":"9fedf909-efba-48ff-af33-1c6e957ee3ca","Type":"ContainerDied","Data":"7f14e964a2667110ccdbc819d46816e31b844abf6f551cac6b8e96b17460ac0e"}
Dec 02 10:29:43 crc kubenswrapper[4781]: I1202 10:29:43.275238 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ptckk" event={"ID":"9fedf909-efba-48ff-af33-1c6e957ee3ca","Type":"ContainerStarted","Data":"d0b7fdf0c20468825ca6ca35da31355ff7806e7a64873700a9ab54b80ce3f42e"}
Dec 02 10:29:43 crc kubenswrapper[4781]: I1202 10:29:43.303858 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ptckk" podStartSLOduration=4.843712268 podStartE2EDuration="7.303841432s" podCreationTimestamp="2025-12-02 10:29:36 +0000 UTC" firstStartedPulling="2025-12-02 10:29:40.226739305 +0000 UTC m=+4143.050613224" lastFinishedPulling="2025-12-02 10:29:42.686868489 +0000 UTC m=+4145.510742388" observedRunningTime="2025-12-02 10:29:43.297608625 +0000 UTC m=+4146.121482504" watchObservedRunningTime="2025-12-02 10:29:43.303841432 +0000 UTC m=+4146.127715311"
Dec 02 10:29:47 crc kubenswrapper[4781]: I1202 10:29:47.067987 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ptckk"
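The "Observed pod startup duration" record above encodes a small derivation worth unpacking: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration subtracts the image-pull window (lastFinishedPulling minus firstStartedPulling) from it. A sketch re-deriving both figures from the logged wall-clock timestamps (values copied from the record above; kubelet itself subtracts the monotonic m=+ readings, so the SLO figure differs in the last digits):

package main

import (
	"fmt"
	"time"
)

func main() {
	// Layout matching Go's default time.String() form used in the log.
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	parse := func(s string) time.Time {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		return t
	}
	created := parse("2025-12-02 10:29:36 +0000 UTC")
	firstPull := parse("2025-12-02 10:29:40.226739305 +0000 UTC")
	lastPull := parse("2025-12-02 10:29:42.686868489 +0000 UTC")
	watched := parse("2025-12-02 10:29:43.303841432 +0000 UTC")

	e2e := watched.Sub(created)          // podStartE2EDuration: 7.303841432s
	slo := e2e - lastPull.Sub(firstPull) // SLO duration excludes image pulling

	fmt.Println("podStartE2EDuration:", e2e)
	fmt.Println("podStartSLOduration:", slo) // ~4.8437s vs logged 4.843712268 (monotonic clock)
}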
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ptckk" Dec 02 10:29:47 crc kubenswrapper[4781]: I1202 10:29:47.155158 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ptckk" Dec 02 10:29:47 crc kubenswrapper[4781]: I1202 10:29:47.352790 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ptckk" Dec 02 10:29:47 crc kubenswrapper[4781]: I1202 10:29:47.409737 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ptckk"] Dec 02 10:29:49 crc kubenswrapper[4781]: I1202 10:29:49.326829 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ptckk" podUID="9fedf909-efba-48ff-af33-1c6e957ee3ca" containerName="registry-server" containerID="cri-o://d0b7fdf0c20468825ca6ca35da31355ff7806e7a64873700a9ab54b80ce3f42e" gracePeriod=2 Dec 02 10:29:50 crc kubenswrapper[4781]: I1202 10:29:50.341860 4781 generic.go:334] "Generic (PLEG): container finished" podID="9fedf909-efba-48ff-af33-1c6e957ee3ca" containerID="d0b7fdf0c20468825ca6ca35da31355ff7806e7a64873700a9ab54b80ce3f42e" exitCode=0 Dec 02 10:29:50 crc kubenswrapper[4781]: I1202 10:29:50.341996 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ptckk" event={"ID":"9fedf909-efba-48ff-af33-1c6e957ee3ca","Type":"ContainerDied","Data":"d0b7fdf0c20468825ca6ca35da31355ff7806e7a64873700a9ab54b80ce3f42e"} Dec 02 10:29:50 crc kubenswrapper[4781]: I1202 10:29:50.779721 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ptckk" Dec 02 10:29:50 crc kubenswrapper[4781]: I1202 10:29:50.810565 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fedf909-efba-48ff-af33-1c6e957ee3ca-utilities\") pod \"9fedf909-efba-48ff-af33-1c6e957ee3ca\" (UID: \"9fedf909-efba-48ff-af33-1c6e957ee3ca\") " Dec 02 10:29:50 crc kubenswrapper[4781]: I1202 10:29:50.810631 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fedf909-efba-48ff-af33-1c6e957ee3ca-catalog-content\") pod \"9fedf909-efba-48ff-af33-1c6e957ee3ca\" (UID: \"9fedf909-efba-48ff-af33-1c6e957ee3ca\") " Dec 02 10:29:50 crc kubenswrapper[4781]: I1202 10:29:50.810669 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4q2jz\" (UniqueName: \"kubernetes.io/projected/9fedf909-efba-48ff-af33-1c6e957ee3ca-kube-api-access-4q2jz\") pod \"9fedf909-efba-48ff-af33-1c6e957ee3ca\" (UID: \"9fedf909-efba-48ff-af33-1c6e957ee3ca\") " Dec 02 10:29:50 crc kubenswrapper[4781]: I1202 10:29:50.812738 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fedf909-efba-48ff-af33-1c6e957ee3ca-utilities" (OuterVolumeSpecName: "utilities") pod "9fedf909-efba-48ff-af33-1c6e957ee3ca" (UID: "9fedf909-efba-48ff-af33-1c6e957ee3ca"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 10:29:50 crc kubenswrapper[4781]: I1202 10:29:50.817694 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fedf909-efba-48ff-af33-1c6e957ee3ca-kube-api-access-4q2jz" (OuterVolumeSpecName: "kube-api-access-4q2jz") pod "9fedf909-efba-48ff-af33-1c6e957ee3ca" (UID: "9fedf909-efba-48ff-af33-1c6e957ee3ca"). InnerVolumeSpecName "kube-api-access-4q2jz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 10:29:50 crc kubenswrapper[4781]: I1202 10:29:50.832587 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fedf909-efba-48ff-af33-1c6e957ee3ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9fedf909-efba-48ff-af33-1c6e957ee3ca" (UID: "9fedf909-efba-48ff-af33-1c6e957ee3ca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 10:29:50 crc kubenswrapper[4781]: I1202 10:29:50.913267 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fedf909-efba-48ff-af33-1c6e957ee3ca-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 10:29:50 crc kubenswrapper[4781]: I1202 10:29:50.913298 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fedf909-efba-48ff-af33-1c6e957ee3ca-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 10:29:50 crc kubenswrapper[4781]: I1202 10:29:50.913308 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4q2jz\" (UniqueName: \"kubernetes.io/projected/9fedf909-efba-48ff-af33-1c6e957ee3ca-kube-api-access-4q2jz\") on node \"crc\" DevicePath \"\"" Dec 02 10:29:51 crc kubenswrapper[4781]: I1202 10:29:51.356496 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ptckk" event={"ID":"9fedf909-efba-48ff-af33-1c6e957ee3ca","Type":"ContainerDied","Data":"c47e09eb16efab3b06c92e76c2092429c5bdd8456a8a569bc4d4ba13303a7355"} Dec 02 10:29:51 crc kubenswrapper[4781]: I1202 10:29:51.356549 4781 util.go:48] "No ready sandbox for pod can be found. 
Dec 02 10:29:51 crc kubenswrapper[4781]: I1202 10:29:51.356549 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ptckk"
Dec 02 10:29:51 crc kubenswrapper[4781]: I1202 10:29:51.356799 4781 scope.go:117] "RemoveContainer" containerID="d0b7fdf0c20468825ca6ca35da31355ff7806e7a64873700a9ab54b80ce3f42e"
Dec 02 10:29:51 crc kubenswrapper[4781]: I1202 10:29:51.383744 4781 scope.go:117] "RemoveContainer" containerID="7f14e964a2667110ccdbc819d46816e31b844abf6f551cac6b8e96b17460ac0e"
Dec 02 10:29:51 crc kubenswrapper[4781]: I1202 10:29:51.418636 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ptckk"]
Dec 02 10:29:51 crc kubenswrapper[4781]: I1202 10:29:51.426553 4781 scope.go:117] "RemoveContainer" containerID="eb4109d8ee84d53233be0ebcc451d6d38f9c6f59e1e1b1f4a2037b986555a5eb"
Dec 02 10:29:51 crc kubenswrapper[4781]: I1202 10:29:51.435472 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ptckk"]
Dec 02 10:29:51 crc kubenswrapper[4781]: I1202 10:29:51.511630 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fedf909-efba-48ff-af33-1c6e957ee3ca" path="/var/lib/kubelet/pods/9fedf909-efba-48ff-af33-1c6e957ee3ca/volumes"
Dec 02 10:30:00 crc kubenswrapper[4781]: I1202 10:30:00.180704 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411190-s4l5w"]
Dec 02 10:30:00 crc kubenswrapper[4781]: E1202 10:30:00.181747 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fedf909-efba-48ff-af33-1c6e957ee3ca" containerName="extract-utilities"
Dec 02 10:30:00 crc kubenswrapper[4781]: I1202 10:30:00.181765 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fedf909-efba-48ff-af33-1c6e957ee3ca" containerName="extract-utilities"
Dec 02 10:30:00 crc kubenswrapper[4781]: E1202 10:30:00.181785 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fedf909-efba-48ff-af33-1c6e957ee3ca" containerName="extract-content"
Dec 02 10:30:00 crc kubenswrapper[4781]: I1202 10:30:00.181793 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fedf909-efba-48ff-af33-1c6e957ee3ca" containerName="extract-content"
Dec 02 10:30:00 crc kubenswrapper[4781]: E1202 10:30:00.181814 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fedf909-efba-48ff-af33-1c6e957ee3ca" containerName="registry-server"
Dec 02 10:30:00 crc kubenswrapper[4781]: I1202 10:30:00.181824 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fedf909-efba-48ff-af33-1c6e957ee3ca" containerName="registry-server"
Dec 02 10:30:00 crc kubenswrapper[4781]: I1202 10:30:00.182124 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fedf909-efba-48ff-af33-1c6e957ee3ca" containerName="registry-server"
Dec 02 10:30:00 crc kubenswrapper[4781]: I1202 10:30:00.182883 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411190-s4l5w"
Dec 02 10:30:00 crc kubenswrapper[4781]: I1202 10:30:00.184846 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 02 10:30:00 crc kubenswrapper[4781]: I1202 10:30:00.185103 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 02 10:30:00 crc kubenswrapper[4781]: I1202 10:30:00.191351 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411190-s4l5w"]
Dec 02 10:30:00 crc kubenswrapper[4781]: I1202 10:30:00.383886 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/085dee99-0b5a-46dc-90de-fee3575c5c5b-secret-volume\") pod \"collect-profiles-29411190-s4l5w\" (UID: \"085dee99-0b5a-46dc-90de-fee3575c5c5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411190-s4l5w"
Dec 02 10:30:00 crc kubenswrapper[4781]: I1202 10:30:00.384034 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/085dee99-0b5a-46dc-90de-fee3575c5c5b-config-volume\") pod \"collect-profiles-29411190-s4l5w\" (UID: \"085dee99-0b5a-46dc-90de-fee3575c5c5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411190-s4l5w"
Dec 02 10:30:00 crc kubenswrapper[4781]: I1202 10:30:00.384062 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-858f8\" (UniqueName: \"kubernetes.io/projected/085dee99-0b5a-46dc-90de-fee3575c5c5b-kube-api-access-858f8\") pod \"collect-profiles-29411190-s4l5w\" (UID: \"085dee99-0b5a-46dc-90de-fee3575c5c5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411190-s4l5w"
Dec 02 10:30:00 crc kubenswrapper[4781]: I1202 10:30:00.412388 4781 patch_prober.go:28] interesting pod/machine-config-daemon-pzntm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 10:30:00 crc kubenswrapper[4781]: I1202 10:30:00.412516 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 10:30:00 crc kubenswrapper[4781]: I1202 10:30:00.485347 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/085dee99-0b5a-46dc-90de-fee3575c5c5b-secret-volume\") pod \"collect-profiles-29411190-s4l5w\" (UID: \"085dee99-0b5a-46dc-90de-fee3575c5c5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411190-s4l5w"
pod="openshift-operator-lifecycle-manager/collect-profiles-29411190-s4l5w" Dec 02 10:30:00 crc kubenswrapper[4781]: I1202 10:30:00.485522 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/085dee99-0b5a-46dc-90de-fee3575c5c5b-config-volume\") pod \"collect-profiles-29411190-s4l5w\" (UID: \"085dee99-0b5a-46dc-90de-fee3575c5c5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411190-s4l5w" Dec 02 10:30:00 crc kubenswrapper[4781]: I1202 10:30:00.486567 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/085dee99-0b5a-46dc-90de-fee3575c5c5b-config-volume\") pod \"collect-profiles-29411190-s4l5w\" (UID: \"085dee99-0b5a-46dc-90de-fee3575c5c5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411190-s4l5w" Dec 02 10:30:00 crc kubenswrapper[4781]: I1202 10:30:00.493574 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/085dee99-0b5a-46dc-90de-fee3575c5c5b-secret-volume\") pod \"collect-profiles-29411190-s4l5w\" (UID: \"085dee99-0b5a-46dc-90de-fee3575c5c5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411190-s4l5w" Dec 02 10:30:00 crc kubenswrapper[4781]: I1202 10:30:00.504743 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-858f8\" (UniqueName: \"kubernetes.io/projected/085dee99-0b5a-46dc-90de-fee3575c5c5b-kube-api-access-858f8\") pod \"collect-profiles-29411190-s4l5w\" (UID: \"085dee99-0b5a-46dc-90de-fee3575c5c5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411190-s4l5w" Dec 02 10:30:00 crc kubenswrapper[4781]: I1202 10:30:00.805124 4781 util.go:30] "No sandbox for pod can be found. 
Dec 02 10:30:00 crc kubenswrapper[4781]: I1202 10:30:00.805124 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411190-s4l5w"
Dec 02 10:30:01 crc kubenswrapper[4781]: I1202 10:30:01.680334 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411190-s4l5w"]
Dec 02 10:30:02 crc kubenswrapper[4781]: I1202 10:30:02.479660 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411190-s4l5w" event={"ID":"085dee99-0b5a-46dc-90de-fee3575c5c5b","Type":"ContainerStarted","Data":"090c67bd9622c24ff3fc8d97be3bb5c92c4855bacc16c45481a2261958af960e"}
Dec 02 10:30:02 crc kubenswrapper[4781]: I1202 10:30:02.480033 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411190-s4l5w" event={"ID":"085dee99-0b5a-46dc-90de-fee3575c5c5b","Type":"ContainerStarted","Data":"24c459c4a9055be090dc6cc43dd8edfd07bee3ebc21bfe9d76b2d2f760193cb7"}
Dec 02 10:30:02 crc kubenswrapper[4781]: I1202 10:30:02.499911 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29411190-s4l5w" podStartSLOduration=2.499891833 podStartE2EDuration="2.499891833s" podCreationTimestamp="2025-12-02 10:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 10:30:02.493326596 +0000 UTC m=+4165.317200475" watchObservedRunningTime="2025-12-02 10:30:02.499891833 +0000 UTC m=+4165.323765712"
Dec 02 10:30:03 crc kubenswrapper[4781]: I1202 10:30:03.490293 4781 generic.go:334] "Generic (PLEG): container finished" podID="085dee99-0b5a-46dc-90de-fee3575c5c5b" containerID="090c67bd9622c24ff3fc8d97be3bb5c92c4855bacc16c45481a2261958af960e" exitCode=0
Dec 02 10:30:03 crc kubenswrapper[4781]: I1202 10:30:03.490370 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411190-s4l5w" event={"ID":"085dee99-0b5a-46dc-90de-fee3575c5c5b","Type":"ContainerDied","Data":"090c67bd9622c24ff3fc8d97be3bb5c92c4855bacc16c45481a2261958af960e"}
Dec 02 10:30:04 crc kubenswrapper[4781]: I1202 10:30:04.850100 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411190-s4l5w"
Dec 02 10:30:04 crc kubenswrapper[4781]: I1202 10:30:04.966853 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/085dee99-0b5a-46dc-90de-fee3575c5c5b-config-volume\") pod \"085dee99-0b5a-46dc-90de-fee3575c5c5b\" (UID: \"085dee99-0b5a-46dc-90de-fee3575c5c5b\") "
Dec 02 10:30:04 crc kubenswrapper[4781]: I1202 10:30:04.967064 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-858f8\" (UniqueName: \"kubernetes.io/projected/085dee99-0b5a-46dc-90de-fee3575c5c5b-kube-api-access-858f8\") pod \"085dee99-0b5a-46dc-90de-fee3575c5c5b\" (UID: \"085dee99-0b5a-46dc-90de-fee3575c5c5b\") "
Dec 02 10:30:04 crc kubenswrapper[4781]: I1202 10:30:04.967130 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/085dee99-0b5a-46dc-90de-fee3575c5c5b-secret-volume\") pod \"085dee99-0b5a-46dc-90de-fee3575c5c5b\" (UID: \"085dee99-0b5a-46dc-90de-fee3575c5c5b\") "
Dec 02 10:30:04 crc kubenswrapper[4781]: I1202 10:30:04.967850 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/085dee99-0b5a-46dc-90de-fee3575c5c5b-config-volume" (OuterVolumeSpecName: "config-volume") pod "085dee99-0b5a-46dc-90de-fee3575c5c5b" (UID: "085dee99-0b5a-46dc-90de-fee3575c5c5b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 10:30:04 crc kubenswrapper[4781]: I1202 10:30:04.973488 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/085dee99-0b5a-46dc-90de-fee3575c5c5b-kube-api-access-858f8" (OuterVolumeSpecName: "kube-api-access-858f8") pod "085dee99-0b5a-46dc-90de-fee3575c5c5b" (UID: "085dee99-0b5a-46dc-90de-fee3575c5c5b"). InnerVolumeSpecName "kube-api-access-858f8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 10:30:04 crc kubenswrapper[4781]: I1202 10:30:04.973822 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/085dee99-0b5a-46dc-90de-fee3575c5c5b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "085dee99-0b5a-46dc-90de-fee3575c5c5b" (UID: "085dee99-0b5a-46dc-90de-fee3575c5c5b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 10:30:05 crc kubenswrapper[4781]: I1202 10:30:05.069248 4781 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/085dee99-0b5a-46dc-90de-fee3575c5c5b-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 02 10:30:05 crc kubenswrapper[4781]: I1202 10:30:05.069293 4781 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/085dee99-0b5a-46dc-90de-fee3575c5c5b-config-volume\") on node \"crc\" DevicePath \"\""
Dec 02 10:30:05 crc kubenswrapper[4781]: I1202 10:30:05.069308 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-858f8\" (UniqueName: \"kubernetes.io/projected/085dee99-0b5a-46dc-90de-fee3575c5c5b-kube-api-access-858f8\") on node \"crc\" DevicePath \"\""
Dec 02 10:30:05 crc kubenswrapper[4781]: I1202 10:30:05.507232 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411190-s4l5w"
Dec 02 10:30:05 crc kubenswrapper[4781]: I1202 10:30:05.508339 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411190-s4l5w" event={"ID":"085dee99-0b5a-46dc-90de-fee3575c5c5b","Type":"ContainerDied","Data":"24c459c4a9055be090dc6cc43dd8edfd07bee3ebc21bfe9d76b2d2f760193cb7"}
Dec 02 10:30:05 crc kubenswrapper[4781]: I1202 10:30:05.508377 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24c459c4a9055be090dc6cc43dd8edfd07bee3ebc21bfe9d76b2d2f760193cb7"
Dec 02 10:30:05 crc kubenswrapper[4781]: I1202 10:30:05.918011 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411145-vkbqb"]
Dec 02 10:30:05 crc kubenswrapper[4781]: I1202 10:30:05.930866 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411145-vkbqb"]
Dec 02 10:30:07 crc kubenswrapper[4781]: I1202 10:30:07.510743 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfee3ebe-80f4-4ce5-9168-12d5fdaa65b3" path="/var/lib/kubelet/pods/bfee3ebe-80f4-4ce5-9168-12d5fdaa65b3/volumes"
Dec 02 10:30:30 crc kubenswrapper[4781]: I1202 10:30:30.411510 4781 patch_prober.go:28] interesting pod/machine-config-daemon-pzntm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 10:30:30 crc kubenswrapper[4781]: I1202 10:30:30.412127 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 10:30:45 crc kubenswrapper[4781]: I1202 10:30:45.494501 4781 scope.go:117] "RemoveContainer" containerID="48e33f1a84dc6ab5bd055c0a23da99040bff417919880b46851280f6d42b4395"
Dec 02 10:30:45 crc kubenswrapper[4781]: I1202 10:30:45.530261 4781 scope.go:117] "RemoveContainer" containerID="d3c616324f0e8c79eea62ce3082674854684052bf6b68557a0480855e240a9ed"
Dec 02 10:31:00 crc kubenswrapper[4781]: I1202 10:31:00.412562 4781 patch_prober.go:28] interesting pod/machine-config-daemon-pzntm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 10:31:00 crc kubenswrapper[4781]: I1202 10:31:00.413318 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 10:31:00 crc kubenswrapper[4781]: I1202 10:31:00.413386 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pzntm"
Dec 02 10:31:00 crc kubenswrapper[4781]: I1202 10:31:00.414555 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"11b83b3689380a7a0a856185b9c3f2221828cd55a83586474c6d42332e35e9f8"} pod="openshift-machine-config-operator/machine-config-daemon-pzntm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 02 10:31:00 crc kubenswrapper[4781]: I1202 10:31:00.414658 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" containerID="cri-o://11b83b3689380a7a0a856185b9c3f2221828cd55a83586474c6d42332e35e9f8" gracePeriod=600
Dec 02 10:31:01 crc kubenswrapper[4781]: I1202 10:31:01.048967 4781 generic.go:334] "Generic (PLEG): container finished" podID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerID="11b83b3689380a7a0a856185b9c3f2221828cd55a83586474c6d42332e35e9f8" exitCode=0
Dec 02 10:31:01 crc kubenswrapper[4781]: I1202 10:31:01.048965 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" event={"ID":"e10258da-dad3-4df8-82c2-9d9438493a3d","Type":"ContainerDied","Data":"11b83b3689380a7a0a856185b9c3f2221828cd55a83586474c6d42332e35e9f8"}
Dec 02 10:31:01 crc kubenswrapper[4781]: I1202 10:31:01.049419 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" event={"ID":"e10258da-dad3-4df8-82c2-9d9438493a3d","Type":"ContainerStarted","Data":"b2e57c7f5cab2071cc358e2287628c7d69746430f6e9ccee874285f870b0b84d"}
Dec 02 10:31:01 crc kubenswrapper[4781]: I1202 10:31:01.049461 4781 scope.go:117] "RemoveContainer" containerID="3108739d92f37bf346834eaffd456a3eb9cfc18107a661e848468628c375b388"
Dec 02 10:32:09 crc kubenswrapper[4781]: I1202 10:32:09.933890 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tk6vq"]
Dec 02 10:32:09 crc kubenswrapper[4781]: E1202 10:32:09.939347 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="085dee99-0b5a-46dc-90de-fee3575c5c5b" containerName="collect-profiles"
Dec 02 10:32:09 crc kubenswrapper[4781]: I1202 10:32:09.939487 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="085dee99-0b5a-46dc-90de-fee3575c5c5b" containerName="collect-profiles"
Dec 02 10:32:09 crc kubenswrapper[4781]: I1202 10:32:09.939893 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="085dee99-0b5a-46dc-90de-fee3575c5c5b" containerName="collect-profiles"
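The machine-config-daemon liveness probe fails three times, 30s apart (10:30:00, 10:30:30, 10:31:00), and only then is the container killed ("failed liveness probe, will be restarted") with the pod's 600s grace period. That pattern matches a probe with periodSeconds=30 and failureThreshold=3, an inference from the spacing rather than something the log states. A sketch of that consecutive-failure counting (the prober type is a hypothetical reduction of the kubelet probe worker):

package main

import "fmt"

// prober counts consecutive failures; a liveness probe only triggers a
// restart once failureThreshold consecutive failures have been observed.
type prober struct {
	failureThreshold int
	consecutiveFails int
}

// observe records one probe result and reports whether the container should
// be killed and restarted, as happens at 10:31:00 in the records above.
func (p *prober) observe(success bool) bool {
	if success {
		p.consecutiveFails = 0
		return false
	}
	p.consecutiveFails++
	return p.consecutiveFails >= p.failureThreshold
}

func main() {
	p := &prober{failureThreshold: 3}
	// "connection refused" at 10:30:00, 10:30:30 and 10:31:00:
	for i, ok := range []bool{false, false, false} {
		fmt.Printf("probe %d -> restart=%v\n", i+1, p.observe(ok))
	}
	// probe 3 -> restart=true; the kubelet then kills the container with the
	// pod's termination grace period (gracePeriod=600 in the log).
}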
Dec 02 10:32:09 crc kubenswrapper[4781]: I1202 10:32:09.941962 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tk6vq"
Dec 02 10:32:09 crc kubenswrapper[4781]: I1202 10:32:09.970329 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tk6vq"]
Dec 02 10:32:10 crc kubenswrapper[4781]: I1202 10:32:10.050489 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4ad9228-b725-47f5-85b8-1d0997517439-utilities\") pod \"community-operators-tk6vq\" (UID: \"f4ad9228-b725-47f5-85b8-1d0997517439\") " pod="openshift-marketplace/community-operators-tk6vq"
Dec 02 10:32:10 crc kubenswrapper[4781]: I1202 10:32:10.050580 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4ad9228-b725-47f5-85b8-1d0997517439-catalog-content\") pod \"community-operators-tk6vq\" (UID: \"f4ad9228-b725-47f5-85b8-1d0997517439\") " pod="openshift-marketplace/community-operators-tk6vq"
Dec 02 10:32:10 crc kubenswrapper[4781]: I1202 10:32:10.050613 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkqm9\" (UniqueName: \"kubernetes.io/projected/f4ad9228-b725-47f5-85b8-1d0997517439-kube-api-access-jkqm9\") pod \"community-operators-tk6vq\" (UID: \"f4ad9228-b725-47f5-85b8-1d0997517439\") " pod="openshift-marketplace/community-operators-tk6vq"
Dec 02 10:32:10 crc kubenswrapper[4781]: I1202 10:32:10.151878 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4ad9228-b725-47f5-85b8-1d0997517439-utilities\") pod \"community-operators-tk6vq\" (UID: \"f4ad9228-b725-47f5-85b8-1d0997517439\") " pod="openshift-marketplace/community-operators-tk6vq"
Dec 02 10:32:10 crc kubenswrapper[4781]: I1202 10:32:10.151969 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4ad9228-b725-47f5-85b8-1d0997517439-catalog-content\") pod \"community-operators-tk6vq\" (UID: \"f4ad9228-b725-47f5-85b8-1d0997517439\") " pod="openshift-marketplace/community-operators-tk6vq"
Dec 02 10:32:10 crc kubenswrapper[4781]: I1202 10:32:10.151999 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkqm9\" (UniqueName: \"kubernetes.io/projected/f4ad9228-b725-47f5-85b8-1d0997517439-kube-api-access-jkqm9\") pod \"community-operators-tk6vq\" (UID: \"f4ad9228-b725-47f5-85b8-1d0997517439\") " pod="openshift-marketplace/community-operators-tk6vq"
Dec 02 10:32:10 crc kubenswrapper[4781]: I1202 10:32:10.152767 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4ad9228-b725-47f5-85b8-1d0997517439-catalog-content\") pod \"community-operators-tk6vq\" (UID: \"f4ad9228-b725-47f5-85b8-1d0997517439\") " pod="openshift-marketplace/community-operators-tk6vq"
Dec 02 10:32:10 crc kubenswrapper[4781]: I1202 10:32:10.152962 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4ad9228-b725-47f5-85b8-1d0997517439-utilities\") pod \"community-operators-tk6vq\" (UID: \"f4ad9228-b725-47f5-85b8-1d0997517439\") " pod="openshift-marketplace/community-operators-tk6vq"
"MountVolume.SetUp succeeded for volume \"kube-api-access-jkqm9\" (UniqueName: \"kubernetes.io/projected/f4ad9228-b725-47f5-85b8-1d0997517439-kube-api-access-jkqm9\") pod \"community-operators-tk6vq\" (UID: \"f4ad9228-b725-47f5-85b8-1d0997517439\") " pod="openshift-marketplace/community-operators-tk6vq" Dec 02 10:32:10 crc kubenswrapper[4781]: I1202 10:32:10.280341 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tk6vq" Dec 02 10:32:10 crc kubenswrapper[4781]: I1202 10:32:10.700602 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tk6vq"] Dec 02 10:32:10 crc kubenswrapper[4781]: I1202 10:32:10.800288 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tk6vq" event={"ID":"f4ad9228-b725-47f5-85b8-1d0997517439","Type":"ContainerStarted","Data":"ed6a00dcd35bb0ac73f9cdd7605ad063538847226c005645aaa9e431c2de2d90"} Dec 02 10:32:11 crc kubenswrapper[4781]: I1202 10:32:11.809222 4781 generic.go:334] "Generic (PLEG): container finished" podID="f4ad9228-b725-47f5-85b8-1d0997517439" containerID="b16d9db7e800b974c53f47e83ea26d5fecbc6eb0842ef094e654c56636ff5a96" exitCode=0 Dec 02 10:32:11 crc kubenswrapper[4781]: I1202 10:32:11.809331 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tk6vq" event={"ID":"f4ad9228-b725-47f5-85b8-1d0997517439","Type":"ContainerDied","Data":"b16d9db7e800b974c53f47e83ea26d5fecbc6eb0842ef094e654c56636ff5a96"} Dec 02 10:32:13 crc kubenswrapper[4781]: I1202 10:32:13.827824 4781 generic.go:334] "Generic (PLEG): container finished" podID="f4ad9228-b725-47f5-85b8-1d0997517439" containerID="552d17bc3ff41b09ae83efcda97bac0c8d6c4be0b00ac75ddab331b1a11750ca" exitCode=0 Dec 02 10:32:13 crc kubenswrapper[4781]: I1202 10:32:13.827886 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tk6vq" event={"ID":"f4ad9228-b725-47f5-85b8-1d0997517439","Type":"ContainerDied","Data":"552d17bc3ff41b09ae83efcda97bac0c8d6c4be0b00ac75ddab331b1a11750ca"} Dec 02 10:32:16 crc kubenswrapper[4781]: I1202 10:32:16.874177 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tk6vq" event={"ID":"f4ad9228-b725-47f5-85b8-1d0997517439","Type":"ContainerStarted","Data":"beb8b000358ef7e2ebef84cf3a9040bec00c7557c5913dbb7464a81ec81f453d"} Dec 02 10:32:20 crc kubenswrapper[4781]: I1202 10:32:20.281463 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tk6vq" Dec 02 10:32:20 crc kubenswrapper[4781]: I1202 10:32:20.282127 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tk6vq" Dec 02 10:32:20 crc kubenswrapper[4781]: I1202 10:32:20.389195 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tk6vq" Dec 02 10:32:20 crc kubenswrapper[4781]: I1202 10:32:20.419635 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tk6vq" podStartSLOduration=6.861026118 podStartE2EDuration="11.419613192s" podCreationTimestamp="2025-12-02 10:32:09 +0000 UTC" firstStartedPulling="2025-12-02 10:32:11.810888025 +0000 UTC m=+4294.634761904" lastFinishedPulling="2025-12-02 10:32:16.369475079 +0000 UTC m=+4299.193348978" observedRunningTime="2025-12-02 
Dec 02 10:32:20 crc kubenswrapper[4781]: I1202 10:32:20.419635 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tk6vq" podStartSLOduration=6.861026118 podStartE2EDuration="11.419613192s" podCreationTimestamp="2025-12-02 10:32:09 +0000 UTC" firstStartedPulling="2025-12-02 10:32:11.810888025 +0000 UTC m=+4294.634761904" lastFinishedPulling="2025-12-02 10:32:16.369475079 +0000 UTC m=+4299.193348978" observedRunningTime="2025-12-02 10:32:16.903371287 +0000 UTC m=+4299.727245166" watchObservedRunningTime="2025-12-02 10:32:20.419613192 +0000 UTC m=+4303.243487071"
Dec 02 10:32:30 crc kubenswrapper[4781]: I1202 10:32:30.338396 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tk6vq"
Dec 02 10:32:30 crc kubenswrapper[4781]: I1202 10:32:30.380298 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tk6vq"]
Dec 02 10:32:31 crc kubenswrapper[4781]: I1202 10:32:31.016672 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tk6vq" podUID="f4ad9228-b725-47f5-85b8-1d0997517439" containerName="registry-server" containerID="cri-o://beb8b000358ef7e2ebef84cf3a9040bec00c7557c5913dbb7464a81ec81f453d" gracePeriod=2
Dec 02 10:32:32 crc kubenswrapper[4781]: I1202 10:32:32.009416 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tk6vq"
Dec 02 10:32:32 crc kubenswrapper[4781]: I1202 10:32:32.026256 4781 generic.go:334] "Generic (PLEG): container finished" podID="f4ad9228-b725-47f5-85b8-1d0997517439" containerID="beb8b000358ef7e2ebef84cf3a9040bec00c7557c5913dbb7464a81ec81f453d" exitCode=0
Dec 02 10:32:32 crc kubenswrapper[4781]: I1202 10:32:32.026291 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tk6vq" event={"ID":"f4ad9228-b725-47f5-85b8-1d0997517439","Type":"ContainerDied","Data":"beb8b000358ef7e2ebef84cf3a9040bec00c7557c5913dbb7464a81ec81f453d"}
Dec 02 10:32:32 crc kubenswrapper[4781]: I1202 10:32:32.026317 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tk6vq" event={"ID":"f4ad9228-b725-47f5-85b8-1d0997517439","Type":"ContainerDied","Data":"ed6a00dcd35bb0ac73f9cdd7605ad063538847226c005645aaa9e431c2de2d90"}
Dec 02 10:32:32 crc kubenswrapper[4781]: I1202 10:32:32.026335 4781 scope.go:117] "RemoveContainer" containerID="beb8b000358ef7e2ebef84cf3a9040bec00c7557c5913dbb7464a81ec81f453d"
Need to start a new one" pod="openshift-marketplace/community-operators-tk6vq" Dec 02 10:32:32 crc kubenswrapper[4781]: I1202 10:32:32.061522 4781 scope.go:117] "RemoveContainer" containerID="552d17bc3ff41b09ae83efcda97bac0c8d6c4be0b00ac75ddab331b1a11750ca" Dec 02 10:32:32 crc kubenswrapper[4781]: I1202 10:32:32.092529 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkqm9\" (UniqueName: \"kubernetes.io/projected/f4ad9228-b725-47f5-85b8-1d0997517439-kube-api-access-jkqm9\") pod \"f4ad9228-b725-47f5-85b8-1d0997517439\" (UID: \"f4ad9228-b725-47f5-85b8-1d0997517439\") " Dec 02 10:32:32 crc kubenswrapper[4781]: I1202 10:32:32.092597 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4ad9228-b725-47f5-85b8-1d0997517439-catalog-content\") pod \"f4ad9228-b725-47f5-85b8-1d0997517439\" (UID: \"f4ad9228-b725-47f5-85b8-1d0997517439\") " Dec 02 10:32:32 crc kubenswrapper[4781]: I1202 10:32:32.092647 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4ad9228-b725-47f5-85b8-1d0997517439-utilities\") pod \"f4ad9228-b725-47f5-85b8-1d0997517439\" (UID: \"f4ad9228-b725-47f5-85b8-1d0997517439\") " Dec 02 10:32:32 crc kubenswrapper[4781]: I1202 10:32:32.093604 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4ad9228-b725-47f5-85b8-1d0997517439-utilities" (OuterVolumeSpecName: "utilities") pod "f4ad9228-b725-47f5-85b8-1d0997517439" (UID: "f4ad9228-b725-47f5-85b8-1d0997517439"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 10:32:32 crc kubenswrapper[4781]: I1202 10:32:32.095893 4781 scope.go:117] "RemoveContainer" containerID="b16d9db7e800b974c53f47e83ea26d5fecbc6eb0842ef094e654c56636ff5a96" Dec 02 10:32:32 crc kubenswrapper[4781]: I1202 10:32:32.099149 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4ad9228-b725-47f5-85b8-1d0997517439-kube-api-access-jkqm9" (OuterVolumeSpecName: "kube-api-access-jkqm9") pod "f4ad9228-b725-47f5-85b8-1d0997517439" (UID: "f4ad9228-b725-47f5-85b8-1d0997517439"). InnerVolumeSpecName "kube-api-access-jkqm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 10:32:32 crc kubenswrapper[4781]: I1202 10:32:32.145882 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4ad9228-b725-47f5-85b8-1d0997517439-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f4ad9228-b725-47f5-85b8-1d0997517439" (UID: "f4ad9228-b725-47f5-85b8-1d0997517439"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 10:32:32 crc kubenswrapper[4781]: I1202 10:32:32.194438 4781 scope.go:117] "RemoveContainer" containerID="beb8b000358ef7e2ebef84cf3a9040bec00c7557c5913dbb7464a81ec81f453d" Dec 02 10:32:32 crc kubenswrapper[4781]: I1202 10:32:32.194482 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkqm9\" (UniqueName: \"kubernetes.io/projected/f4ad9228-b725-47f5-85b8-1d0997517439-kube-api-access-jkqm9\") on node \"crc\" DevicePath \"\"" Dec 02 10:32:32 crc kubenswrapper[4781]: I1202 10:32:32.194508 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4ad9228-b725-47f5-85b8-1d0997517439-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 10:32:32 crc kubenswrapper[4781]: I1202 10:32:32.194517 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4ad9228-b725-47f5-85b8-1d0997517439-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 10:32:32 crc kubenswrapper[4781]: E1202 10:32:32.195374 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"beb8b000358ef7e2ebef84cf3a9040bec00c7557c5913dbb7464a81ec81f453d\": container with ID starting with beb8b000358ef7e2ebef84cf3a9040bec00c7557c5913dbb7464a81ec81f453d not found: ID does not exist" containerID="beb8b000358ef7e2ebef84cf3a9040bec00c7557c5913dbb7464a81ec81f453d" Dec 02 10:32:32 crc kubenswrapper[4781]: I1202 10:32:32.195422 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"beb8b000358ef7e2ebef84cf3a9040bec00c7557c5913dbb7464a81ec81f453d"} err="failed to get container status \"beb8b000358ef7e2ebef84cf3a9040bec00c7557c5913dbb7464a81ec81f453d\": rpc error: code = NotFound desc = could not find container \"beb8b000358ef7e2ebef84cf3a9040bec00c7557c5913dbb7464a81ec81f453d\": container with ID starting with beb8b000358ef7e2ebef84cf3a9040bec00c7557c5913dbb7464a81ec81f453d not found: ID does not exist" Dec 02 10:32:32 crc kubenswrapper[4781]: I1202 10:32:32.195458 4781 scope.go:117] "RemoveContainer" containerID="552d17bc3ff41b09ae83efcda97bac0c8d6c4be0b00ac75ddab331b1a11750ca" Dec 02 10:32:32 crc kubenswrapper[4781]: E1202 10:32:32.195786 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"552d17bc3ff41b09ae83efcda97bac0c8d6c4be0b00ac75ddab331b1a11750ca\": container with ID starting with 552d17bc3ff41b09ae83efcda97bac0c8d6c4be0b00ac75ddab331b1a11750ca not found: ID does not exist" containerID="552d17bc3ff41b09ae83efcda97bac0c8d6c4be0b00ac75ddab331b1a11750ca" Dec 02 10:32:32 crc kubenswrapper[4781]: I1202 10:32:32.195813 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"552d17bc3ff41b09ae83efcda97bac0c8d6c4be0b00ac75ddab331b1a11750ca"} err="failed to get container status \"552d17bc3ff41b09ae83efcda97bac0c8d6c4be0b00ac75ddab331b1a11750ca\": rpc error: code = NotFound desc = could not find container \"552d17bc3ff41b09ae83efcda97bac0c8d6c4be0b00ac75ddab331b1a11750ca\": container with ID starting with 552d17bc3ff41b09ae83efcda97bac0c8d6c4be0b00ac75ddab331b1a11750ca not found: ID does not exist" Dec 02 10:32:32 crc kubenswrapper[4781]: I1202 10:32:32.195835 4781 scope.go:117] "RemoveContainer" containerID="b16d9db7e800b974c53f47e83ea26d5fecbc6eb0842ef094e654c56636ff5a96" Dec 02 10:32:32 crc 
kubenswrapper[4781]: E1202 10:32:32.196221 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b16d9db7e800b974c53f47e83ea26d5fecbc6eb0842ef094e654c56636ff5a96\": container with ID starting with b16d9db7e800b974c53f47e83ea26d5fecbc6eb0842ef094e654c56636ff5a96 not found: ID does not exist" containerID="b16d9db7e800b974c53f47e83ea26d5fecbc6eb0842ef094e654c56636ff5a96" Dec 02 10:32:32 crc kubenswrapper[4781]: I1202 10:32:32.196248 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b16d9db7e800b974c53f47e83ea26d5fecbc6eb0842ef094e654c56636ff5a96"} err="failed to get container status \"b16d9db7e800b974c53f47e83ea26d5fecbc6eb0842ef094e654c56636ff5a96\": rpc error: code = NotFound desc = could not find container \"b16d9db7e800b974c53f47e83ea26d5fecbc6eb0842ef094e654c56636ff5a96\": container with ID starting with b16d9db7e800b974c53f47e83ea26d5fecbc6eb0842ef094e654c56636ff5a96 not found: ID does not exist" Dec 02 10:32:32 crc kubenswrapper[4781]: I1202 10:32:32.359822 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tk6vq"] Dec 02 10:32:32 crc kubenswrapper[4781]: I1202 10:32:32.368561 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tk6vq"] Dec 02 10:32:33 crc kubenswrapper[4781]: I1202 10:32:33.529866 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4ad9228-b725-47f5-85b8-1d0997517439" path="/var/lib/kubelet/pods/f4ad9228-b725-47f5-85b8-1d0997517439/volumes" Dec 02 10:32:38 crc kubenswrapper[4781]: I1202 10:32:38.256091 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kzm46/must-gather-np56k"] Dec 02 10:32:38 crc kubenswrapper[4781]: E1202 10:32:38.257139 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4ad9228-b725-47f5-85b8-1d0997517439" containerName="extract-content" Dec 02 10:32:38 crc kubenswrapper[4781]: I1202 10:32:38.257157 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4ad9228-b725-47f5-85b8-1d0997517439" containerName="extract-content" Dec 02 10:32:38 crc kubenswrapper[4781]: E1202 10:32:38.257180 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4ad9228-b725-47f5-85b8-1d0997517439" containerName="extract-utilities" Dec 02 10:32:38 crc kubenswrapper[4781]: I1202 10:32:38.257189 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4ad9228-b725-47f5-85b8-1d0997517439" containerName="extract-utilities" Dec 02 10:32:38 crc kubenswrapper[4781]: E1202 10:32:38.257206 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4ad9228-b725-47f5-85b8-1d0997517439" containerName="registry-server" Dec 02 10:32:38 crc kubenswrapper[4781]: I1202 10:32:38.257215 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4ad9228-b725-47f5-85b8-1d0997517439" containerName="registry-server" Dec 02 10:32:38 crc kubenswrapper[4781]: I1202 10:32:38.257398 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4ad9228-b725-47f5-85b8-1d0997517439" containerName="registry-server" Dec 02 10:32:38 crc kubenswrapper[4781]: I1202 10:32:38.258513 4781 util.go:30] "No sandbox for pod can be found. 
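
[annotation] The three "ContainerStatus from runtime service failed ... NotFound" / "DeleteContainer returned error" pairs above are the benign tail of cleanup: by the time the kubelet re-checks each container it wants to remove, CRI-O has already deleted it, and NotFound is then treated as success. A sketch of that idempotent pattern in Go, against a trimmed-down stand-in for the CRI runtime client (the Runtime interface here is ours, not the real k8s.io/cri-api one):

    package sketch

    import (
        "context"

        "google.golang.org/grpc/codes"
        "google.golang.org/grpc/status"
    )

    // Runtime is a hypothetical two-method slice of a CRI runtime client.
    type Runtime interface {
        ContainerStatus(ctx context.Context, id string) error
        RemoveContainer(ctx context.Context, id string) error
    }

    // removeIfPresent deletes a container but treats gRPC NotFound as
    // success, which is why the errors above are logged and then dropped.
    func removeIfPresent(ctx context.Context, rt Runtime, id string) error {
        if err := rt.ContainerStatus(ctx, id); err != nil {
            if status.Code(err) == codes.NotFound {
                return nil // already gone; nothing left to remove
            }
            return err
        }
        return rt.RemoveContainer(ctx, id)
    }
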
Need to start a new one" pod="openshift-must-gather-kzm46/must-gather-np56k" Dec 02 10:32:38 crc kubenswrapper[4781]: I1202 10:32:38.264858 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-kzm46"/"default-dockercfg-qxtk6" Dec 02 10:32:38 crc kubenswrapper[4781]: I1202 10:32:38.265430 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-kzm46"/"openshift-service-ca.crt" Dec 02 10:32:38 crc kubenswrapper[4781]: I1202 10:32:38.266138 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-kzm46"/"kube-root-ca.crt" Dec 02 10:32:38 crc kubenswrapper[4781]: I1202 10:32:38.291782 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kzm46/must-gather-np56k"] Dec 02 10:32:38 crc kubenswrapper[4781]: I1202 10:32:38.411982 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtv4w\" (UniqueName: \"kubernetes.io/projected/07e367c2-2f0f-43fd-9ab4-85a99e1a8291-kube-api-access-qtv4w\") pod \"must-gather-np56k\" (UID: \"07e367c2-2f0f-43fd-9ab4-85a99e1a8291\") " pod="openshift-must-gather-kzm46/must-gather-np56k" Dec 02 10:32:38 crc kubenswrapper[4781]: I1202 10:32:38.412203 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/07e367c2-2f0f-43fd-9ab4-85a99e1a8291-must-gather-output\") pod \"must-gather-np56k\" (UID: \"07e367c2-2f0f-43fd-9ab4-85a99e1a8291\") " pod="openshift-must-gather-kzm46/must-gather-np56k" Dec 02 10:32:38 crc kubenswrapper[4781]: I1202 10:32:38.514155 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/07e367c2-2f0f-43fd-9ab4-85a99e1a8291-must-gather-output\") pod \"must-gather-np56k\" (UID: \"07e367c2-2f0f-43fd-9ab4-85a99e1a8291\") " pod="openshift-must-gather-kzm46/must-gather-np56k" Dec 02 10:32:38 crc kubenswrapper[4781]: I1202 10:32:38.514272 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtv4w\" (UniqueName: \"kubernetes.io/projected/07e367c2-2f0f-43fd-9ab4-85a99e1a8291-kube-api-access-qtv4w\") pod \"must-gather-np56k\" (UID: \"07e367c2-2f0f-43fd-9ab4-85a99e1a8291\") " pod="openshift-must-gather-kzm46/must-gather-np56k" Dec 02 10:32:38 crc kubenswrapper[4781]: I1202 10:32:38.515410 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/07e367c2-2f0f-43fd-9ab4-85a99e1a8291-must-gather-output\") pod \"must-gather-np56k\" (UID: \"07e367c2-2f0f-43fd-9ab4-85a99e1a8291\") " pod="openshift-must-gather-kzm46/must-gather-np56k" Dec 02 10:32:38 crc kubenswrapper[4781]: I1202 10:32:38.533128 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtv4w\" (UniqueName: \"kubernetes.io/projected/07e367c2-2f0f-43fd-9ab4-85a99e1a8291-kube-api-access-qtv4w\") pod \"must-gather-np56k\" (UID: \"07e367c2-2f0f-43fd-9ab4-85a99e1a8291\") " pod="openshift-must-gather-kzm46/must-gather-np56k" Dec 02 10:32:38 crc kubenswrapper[4781]: I1202 10:32:38.591963 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kzm46/must-gather-np56k" Dec 02 10:32:38 crc kubenswrapper[4781]: I1202 10:32:38.886558 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kzm46/must-gather-np56k"] Dec 02 10:32:39 crc kubenswrapper[4781]: I1202 10:32:39.112259 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kzm46/must-gather-np56k" event={"ID":"07e367c2-2f0f-43fd-9ab4-85a99e1a8291","Type":"ContainerStarted","Data":"6b233d1d98b7a915fb9638bdc09db6105591226f6a528bd3ded97bfe473de249"} Dec 02 10:32:40 crc kubenswrapper[4781]: I1202 10:32:40.121870 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kzm46/must-gather-np56k" event={"ID":"07e367c2-2f0f-43fd-9ab4-85a99e1a8291","Type":"ContainerStarted","Data":"3dd819b81667c1642a1945a786b9713fc3ac0751d43bc6583513971a8b1fd5ed"} Dec 02 10:32:40 crc kubenswrapper[4781]: I1202 10:32:40.122398 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kzm46/must-gather-np56k" event={"ID":"07e367c2-2f0f-43fd-9ab4-85a99e1a8291","Type":"ContainerStarted","Data":"adda2635d32a58b0257b7e70e101b06664d2436348ba286e7a10cad95f00fe48"} Dec 02 10:32:40 crc kubenswrapper[4781]: I1202 10:32:40.148659 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kzm46/must-gather-np56k" podStartSLOduration=2.148630185 podStartE2EDuration="2.148630185s" podCreationTimestamp="2025-12-02 10:32:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 10:32:40.140347512 +0000 UTC m=+4322.964221411" watchObservedRunningTime="2025-12-02 10:32:40.148630185 +0000 UTC m=+4322.972504064" Dec 02 10:32:42 crc kubenswrapper[4781]: I1202 10:32:42.711803 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kzm46/crc-debug-vnvlh"] Dec 02 10:32:42 crc kubenswrapper[4781]: I1202 10:32:42.713790 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kzm46/crc-debug-vnvlh" Dec 02 10:32:42 crc kubenswrapper[4781]: I1202 10:32:42.805377 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbvld\" (UniqueName: \"kubernetes.io/projected/24a4c313-cdee-481f-8646-412d2361d1b8-kube-api-access-tbvld\") pod \"crc-debug-vnvlh\" (UID: \"24a4c313-cdee-481f-8646-412d2361d1b8\") " pod="openshift-must-gather-kzm46/crc-debug-vnvlh" Dec 02 10:32:42 crc kubenswrapper[4781]: I1202 10:32:42.805576 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/24a4c313-cdee-481f-8646-412d2361d1b8-host\") pod \"crc-debug-vnvlh\" (UID: \"24a4c313-cdee-481f-8646-412d2361d1b8\") " pod="openshift-must-gather-kzm46/crc-debug-vnvlh" Dec 02 10:32:42 crc kubenswrapper[4781]: I1202 10:32:42.907172 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/24a4c313-cdee-481f-8646-412d2361d1b8-host\") pod \"crc-debug-vnvlh\" (UID: \"24a4c313-cdee-481f-8646-412d2361d1b8\") " pod="openshift-must-gather-kzm46/crc-debug-vnvlh" Dec 02 10:32:42 crc kubenswrapper[4781]: I1202 10:32:42.907269 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbvld\" (UniqueName: \"kubernetes.io/projected/24a4c313-cdee-481f-8646-412d2361d1b8-kube-api-access-tbvld\") pod \"crc-debug-vnvlh\" (UID: \"24a4c313-cdee-481f-8646-412d2361d1b8\") " pod="openshift-must-gather-kzm46/crc-debug-vnvlh" Dec 02 10:32:42 crc kubenswrapper[4781]: I1202 10:32:42.907318 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/24a4c313-cdee-481f-8646-412d2361d1b8-host\") pod \"crc-debug-vnvlh\" (UID: \"24a4c313-cdee-481f-8646-412d2361d1b8\") " pod="openshift-must-gather-kzm46/crc-debug-vnvlh" Dec 02 10:32:43 crc kubenswrapper[4781]: I1202 10:32:43.274448 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbvld\" (UniqueName: \"kubernetes.io/projected/24a4c313-cdee-481f-8646-412d2361d1b8-kube-api-access-tbvld\") pod \"crc-debug-vnvlh\" (UID: \"24a4c313-cdee-481f-8646-412d2361d1b8\") " pod="openshift-must-gather-kzm46/crc-debug-vnvlh" Dec 02 10:32:43 crc kubenswrapper[4781]: I1202 10:32:43.330264 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kzm46/crc-debug-vnvlh" Dec 02 10:32:43 crc kubenswrapper[4781]: W1202 10:32:43.400059 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24a4c313_cdee_481f_8646_412d2361d1b8.slice/crio-a3a522ced8479b766185d8d0fd16a17e1a0fc96166da930c852bb3264c391df9 WatchSource:0}: Error finding container a3a522ced8479b766185d8d0fd16a17e1a0fc96166da930c852bb3264c391df9: Status 404 returned error can't find the container with id a3a522ced8479b766185d8d0fd16a17e1a0fc96166da930c852bb3264c391df9 Dec 02 10:32:44 crc kubenswrapper[4781]: I1202 10:32:44.167562 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kzm46/crc-debug-vnvlh" event={"ID":"24a4c313-cdee-481f-8646-412d2361d1b8","Type":"ContainerStarted","Data":"dac398c50ac049691dc28dc23624df04aa7454b9dbcdec82a425b9c0d0d4c64c"} Dec 02 10:32:44 crc kubenswrapper[4781]: I1202 10:32:44.168182 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kzm46/crc-debug-vnvlh" event={"ID":"24a4c313-cdee-481f-8646-412d2361d1b8","Type":"ContainerStarted","Data":"a3a522ced8479b766185d8d0fd16a17e1a0fc96166da930c852bb3264c391df9"} Dec 02 10:32:44 crc kubenswrapper[4781]: I1202 10:32:44.187805 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kzm46/crc-debug-vnvlh" podStartSLOduration=2.18778214 podStartE2EDuration="2.18778214s" podCreationTimestamp="2025-12-02 10:32:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 10:32:44.18035168 +0000 UTC m=+4327.004225549" watchObservedRunningTime="2025-12-02 10:32:44.18778214 +0000 UTC m=+4327.011656029" Dec 02 10:32:51 crc kubenswrapper[4781]: I1202 10:32:51.035032 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tj5hr"] Dec 02 10:32:51 crc kubenswrapper[4781]: I1202 10:32:51.039981 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tj5hr" Dec 02 10:32:51 crc kubenswrapper[4781]: I1202 10:32:51.052051 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tj5hr"] Dec 02 10:32:51 crc kubenswrapper[4781]: I1202 10:32:51.166845 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23-catalog-content\") pod \"certified-operators-tj5hr\" (UID: \"bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23\") " pod="openshift-marketplace/certified-operators-tj5hr" Dec 02 10:32:51 crc kubenswrapper[4781]: I1202 10:32:51.167255 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23-utilities\") pod \"certified-operators-tj5hr\" (UID: \"bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23\") " pod="openshift-marketplace/certified-operators-tj5hr" Dec 02 10:32:51 crc kubenswrapper[4781]: I1202 10:32:51.167449 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8zbc\" (UniqueName: \"kubernetes.io/projected/bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23-kube-api-access-f8zbc\") pod \"certified-operators-tj5hr\" (UID: \"bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23\") " pod="openshift-marketplace/certified-operators-tj5hr" Dec 02 10:32:51 crc kubenswrapper[4781]: I1202 10:32:51.269295 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23-catalog-content\") pod \"certified-operators-tj5hr\" (UID: \"bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23\") " pod="openshift-marketplace/certified-operators-tj5hr" Dec 02 10:32:51 crc kubenswrapper[4781]: I1202 10:32:51.269361 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23-utilities\") pod \"certified-operators-tj5hr\" (UID: \"bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23\") " pod="openshift-marketplace/certified-operators-tj5hr" Dec 02 10:32:51 crc kubenswrapper[4781]: I1202 10:32:51.269455 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8zbc\" (UniqueName: \"kubernetes.io/projected/bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23-kube-api-access-f8zbc\") pod \"certified-operators-tj5hr\" (UID: \"bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23\") " pod="openshift-marketplace/certified-operators-tj5hr" Dec 02 10:32:51 crc kubenswrapper[4781]: I1202 10:32:51.270161 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23-catalog-content\") pod \"certified-operators-tj5hr\" (UID: \"bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23\") " pod="openshift-marketplace/certified-operators-tj5hr" Dec 02 10:32:51 crc kubenswrapper[4781]: I1202 10:32:51.270241 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23-utilities\") pod \"certified-operators-tj5hr\" (UID: \"bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23\") " pod="openshift-marketplace/certified-operators-tj5hr" Dec 02 10:32:51 crc kubenswrapper[4781]: I1202 10:32:51.297165 4781 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-f8zbc\" (UniqueName: \"kubernetes.io/projected/bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23-kube-api-access-f8zbc\") pod \"certified-operators-tj5hr\" (UID: \"bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23\") " pod="openshift-marketplace/certified-operators-tj5hr" Dec 02 10:32:51 crc kubenswrapper[4781]: I1202 10:32:51.365775 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tj5hr" Dec 02 10:32:51 crc kubenswrapper[4781]: I1202 10:32:51.904628 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tj5hr"] Dec 02 10:32:52 crc kubenswrapper[4781]: I1202 10:32:52.428159 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mm5c8"] Dec 02 10:32:52 crc kubenswrapper[4781]: I1202 10:32:52.430265 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mm5c8" Dec 02 10:32:52 crc kubenswrapper[4781]: I1202 10:32:52.439449 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mm5c8"] Dec 02 10:32:52 crc kubenswrapper[4781]: I1202 10:32:52.508023 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f3827f0-855c-4ca8-aa5e-6980428d0a0e-catalog-content\") pod \"redhat-operators-mm5c8\" (UID: \"3f3827f0-855c-4ca8-aa5e-6980428d0a0e\") " pod="openshift-marketplace/redhat-operators-mm5c8" Dec 02 10:32:52 crc kubenswrapper[4781]: I1202 10:32:52.508279 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwltq\" (UniqueName: \"kubernetes.io/projected/3f3827f0-855c-4ca8-aa5e-6980428d0a0e-kube-api-access-pwltq\") pod \"redhat-operators-mm5c8\" (UID: \"3f3827f0-855c-4ca8-aa5e-6980428d0a0e\") " pod="openshift-marketplace/redhat-operators-mm5c8" Dec 02 10:32:52 crc kubenswrapper[4781]: I1202 10:32:52.508342 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f3827f0-855c-4ca8-aa5e-6980428d0a0e-utilities\") pod \"redhat-operators-mm5c8\" (UID: \"3f3827f0-855c-4ca8-aa5e-6980428d0a0e\") " pod="openshift-marketplace/redhat-operators-mm5c8" Dec 02 10:32:52 crc kubenswrapper[4781]: I1202 10:32:52.610811 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f3827f0-855c-4ca8-aa5e-6980428d0a0e-catalog-content\") pod \"redhat-operators-mm5c8\" (UID: \"3f3827f0-855c-4ca8-aa5e-6980428d0a0e\") " pod="openshift-marketplace/redhat-operators-mm5c8" Dec 02 10:32:52 crc kubenswrapper[4781]: I1202 10:32:52.610846 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwltq\" (UniqueName: \"kubernetes.io/projected/3f3827f0-855c-4ca8-aa5e-6980428d0a0e-kube-api-access-pwltq\") pod \"redhat-operators-mm5c8\" (UID: \"3f3827f0-855c-4ca8-aa5e-6980428d0a0e\") " pod="openshift-marketplace/redhat-operators-mm5c8" Dec 02 10:32:52 crc kubenswrapper[4781]: I1202 10:32:52.610962 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f3827f0-855c-4ca8-aa5e-6980428d0a0e-utilities\") pod \"redhat-operators-mm5c8\" (UID: \"3f3827f0-855c-4ca8-aa5e-6980428d0a0e\") " 
pod="openshift-marketplace/redhat-operators-mm5c8" Dec 02 10:32:52 crc kubenswrapper[4781]: I1202 10:32:52.611490 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f3827f0-855c-4ca8-aa5e-6980428d0a0e-catalog-content\") pod \"redhat-operators-mm5c8\" (UID: \"3f3827f0-855c-4ca8-aa5e-6980428d0a0e\") " pod="openshift-marketplace/redhat-operators-mm5c8" Dec 02 10:32:52 crc kubenswrapper[4781]: I1202 10:32:52.613090 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f3827f0-855c-4ca8-aa5e-6980428d0a0e-utilities\") pod \"redhat-operators-mm5c8\" (UID: \"3f3827f0-855c-4ca8-aa5e-6980428d0a0e\") " pod="openshift-marketplace/redhat-operators-mm5c8" Dec 02 10:32:52 crc kubenswrapper[4781]: I1202 10:32:52.637648 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwltq\" (UniqueName: \"kubernetes.io/projected/3f3827f0-855c-4ca8-aa5e-6980428d0a0e-kube-api-access-pwltq\") pod \"redhat-operators-mm5c8\" (UID: \"3f3827f0-855c-4ca8-aa5e-6980428d0a0e\") " pod="openshift-marketplace/redhat-operators-mm5c8" Dec 02 10:32:52 crc kubenswrapper[4781]: I1202 10:32:52.787681 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mm5c8" Dec 02 10:32:53 crc kubenswrapper[4781]: I1202 10:32:53.248600 4781 generic.go:334] "Generic (PLEG): container finished" podID="bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23" containerID="dcdce0f4fba44e43d3313c9840bf84b00dd18ba41b0e1a0a9d334d311dae5291" exitCode=0 Dec 02 10:32:53 crc kubenswrapper[4781]: I1202 10:32:53.248840 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tj5hr" event={"ID":"bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23","Type":"ContainerDied","Data":"dcdce0f4fba44e43d3313c9840bf84b00dd18ba41b0e1a0a9d334d311dae5291"} Dec 02 10:32:53 crc kubenswrapper[4781]: I1202 10:32:53.248906 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tj5hr" event={"ID":"bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23","Type":"ContainerStarted","Data":"36137a75d681f9ce7ae51afac45066f2737a6723b56cec0fb2585415184b0b0c"} Dec 02 10:32:53 crc kubenswrapper[4781]: I1202 10:32:53.339303 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mm5c8"] Dec 02 10:32:54 crc kubenswrapper[4781]: I1202 10:32:54.258414 4781 generic.go:334] "Generic (PLEG): container finished" podID="3f3827f0-855c-4ca8-aa5e-6980428d0a0e" containerID="9c5ec460cf71114de6e448e54f6e58a5d382894624d3420431bbb547cfdf35b0" exitCode=0 Dec 02 10:32:54 crc kubenswrapper[4781]: I1202 10:32:54.258579 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mm5c8" event={"ID":"3f3827f0-855c-4ca8-aa5e-6980428d0a0e","Type":"ContainerDied","Data":"9c5ec460cf71114de6e448e54f6e58a5d382894624d3420431bbb547cfdf35b0"} Dec 02 10:32:54 crc kubenswrapper[4781]: I1202 10:32:54.259000 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mm5c8" event={"ID":"3f3827f0-855c-4ca8-aa5e-6980428d0a0e","Type":"ContainerStarted","Data":"ee94ab4bccd933fb3a0930ada49cfac9ccb78889ea04d02b3c6a06bd86cbbc5b"} Dec 02 10:32:56 crc kubenswrapper[4781]: I1202 10:32:56.286413 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mm5c8" 
event={"ID":"3f3827f0-855c-4ca8-aa5e-6980428d0a0e","Type":"ContainerStarted","Data":"064a7ada73bf48c89698097e89613297327238c1e2321fb8829fafc36fd92979"} Dec 02 10:32:56 crc kubenswrapper[4781]: I1202 10:32:56.289866 4781 generic.go:334] "Generic (PLEG): container finished" podID="bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23" containerID="f862231c94b7a9498bb12337ef2cb660a5248c5cdce2580313ae63e6e2fa192d" exitCode=0 Dec 02 10:32:56 crc kubenswrapper[4781]: I1202 10:32:56.289900 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tj5hr" event={"ID":"bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23","Type":"ContainerDied","Data":"f862231c94b7a9498bb12337ef2cb660a5248c5cdce2580313ae63e6e2fa192d"} Dec 02 10:32:57 crc kubenswrapper[4781]: I1202 10:32:57.300729 4781 generic.go:334] "Generic (PLEG): container finished" podID="3f3827f0-855c-4ca8-aa5e-6980428d0a0e" containerID="064a7ada73bf48c89698097e89613297327238c1e2321fb8829fafc36fd92979" exitCode=0 Dec 02 10:32:57 crc kubenswrapper[4781]: I1202 10:32:57.300828 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mm5c8" event={"ID":"3f3827f0-855c-4ca8-aa5e-6980428d0a0e","Type":"ContainerDied","Data":"064a7ada73bf48c89698097e89613297327238c1e2321fb8829fafc36fd92979"} Dec 02 10:33:00 crc kubenswrapper[4781]: I1202 10:33:00.412680 4781 patch_prober.go:28] interesting pod/machine-config-daemon-pzntm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 10:33:00 crc kubenswrapper[4781]: I1202 10:33:00.413262 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 10:33:03 crc kubenswrapper[4781]: I1202 10:33:03.358081 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tj5hr" event={"ID":"bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23","Type":"ContainerStarted","Data":"35df8a95e8ef0b603d62421bbff2dea188cc292bad04348d5bc4f24bb19d09b8"} Dec 02 10:33:03 crc kubenswrapper[4781]: I1202 10:33:03.360813 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mm5c8" event={"ID":"3f3827f0-855c-4ca8-aa5e-6980428d0a0e","Type":"ContainerStarted","Data":"eec307586ca594010007c08abb0d52dcdaab60f241a9d1721a211da7a372c8e3"} Dec 02 10:33:03 crc kubenswrapper[4781]: I1202 10:33:03.395142 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tj5hr" podStartSLOduration=7.272681636 podStartE2EDuration="12.395122685s" podCreationTimestamp="2025-12-02 10:32:51 +0000 UTC" firstStartedPulling="2025-12-02 10:32:53.277387578 +0000 UTC m=+4336.101261457" lastFinishedPulling="2025-12-02 10:32:58.399828627 +0000 UTC m=+4341.223702506" observedRunningTime="2025-12-02 10:33:03.387396036 +0000 UTC m=+4346.211269925" watchObservedRunningTime="2025-12-02 10:33:03.395122685 +0000 UTC m=+4346.218996564" Dec 02 10:33:03 crc kubenswrapper[4781]: I1202 10:33:03.411576 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mm5c8" podStartSLOduration=3.005154613 
podStartE2EDuration="11.411549416s" podCreationTimestamp="2025-12-02 10:32:52 +0000 UTC" firstStartedPulling="2025-12-02 10:32:54.260454694 +0000 UTC m=+4337.084328573" lastFinishedPulling="2025-12-02 10:33:02.666849497 +0000 UTC m=+4345.490723376" observedRunningTime="2025-12-02 10:33:03.406384228 +0000 UTC m=+4346.230258127" watchObservedRunningTime="2025-12-02 10:33:03.411549416 +0000 UTC m=+4346.235423295" Dec 02 10:33:11 crc kubenswrapper[4781]: I1202 10:33:11.366321 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tj5hr" Dec 02 10:33:11 crc kubenswrapper[4781]: I1202 10:33:11.366869 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tj5hr" Dec 02 10:33:11 crc kubenswrapper[4781]: I1202 10:33:11.420617 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tj5hr" Dec 02 10:33:11 crc kubenswrapper[4781]: I1202 10:33:11.489981 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tj5hr" Dec 02 10:33:11 crc kubenswrapper[4781]: I1202 10:33:11.656394 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tj5hr"] Dec 02 10:33:12 crc kubenswrapper[4781]: I1202 10:33:12.788325 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mm5c8" Dec 02 10:33:12 crc kubenswrapper[4781]: I1202 10:33:12.788371 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mm5c8" Dec 02 10:33:12 crc kubenswrapper[4781]: I1202 10:33:12.847039 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mm5c8" Dec 02 10:33:13 crc kubenswrapper[4781]: I1202 10:33:13.443815 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tj5hr" podUID="bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23" containerName="registry-server" containerID="cri-o://35df8a95e8ef0b603d62421bbff2dea188cc292bad04348d5bc4f24bb19d09b8" gracePeriod=2 Dec 02 10:33:13 crc kubenswrapper[4781]: I1202 10:33:13.490639 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mm5c8" Dec 02 10:33:14 crc kubenswrapper[4781]: I1202 10:33:14.459030 4781 generic.go:334] "Generic (PLEG): container finished" podID="bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23" containerID="35df8a95e8ef0b603d62421bbff2dea188cc292bad04348d5bc4f24bb19d09b8" exitCode=0 Dec 02 10:33:14 crc kubenswrapper[4781]: I1202 10:33:14.459097 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tj5hr" event={"ID":"bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23","Type":"ContainerDied","Data":"35df8a95e8ef0b603d62421bbff2dea188cc292bad04348d5bc4f24bb19d09b8"} Dec 02 10:33:14 crc kubenswrapper[4781]: I1202 10:33:14.856897 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mm5c8"] Dec 02 10:33:15 crc kubenswrapper[4781]: I1202 10:33:15.082971 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tj5hr" Dec 02 10:33:15 crc kubenswrapper[4781]: I1202 10:33:15.175192 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8zbc\" (UniqueName: \"kubernetes.io/projected/bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23-kube-api-access-f8zbc\") pod \"bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23\" (UID: \"bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23\") " Dec 02 10:33:15 crc kubenswrapper[4781]: I1202 10:33:15.175457 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23-catalog-content\") pod \"bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23\" (UID: \"bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23\") " Dec 02 10:33:15 crc kubenswrapper[4781]: I1202 10:33:15.175560 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23-utilities\") pod \"bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23\" (UID: \"bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23\") " Dec 02 10:33:15 crc kubenswrapper[4781]: I1202 10:33:15.176379 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23-utilities" (OuterVolumeSpecName: "utilities") pod "bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23" (UID: "bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 10:33:15 crc kubenswrapper[4781]: I1202 10:33:15.209178 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23-kube-api-access-f8zbc" (OuterVolumeSpecName: "kube-api-access-f8zbc") pod "bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23" (UID: "bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23"). InnerVolumeSpecName "kube-api-access-f8zbc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 10:33:15 crc kubenswrapper[4781]: I1202 10:33:15.246309 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23" (UID: "bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 10:33:15 crc kubenswrapper[4781]: I1202 10:33:15.277944 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 10:33:15 crc kubenswrapper[4781]: I1202 10:33:15.277974 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 10:33:15 crc kubenswrapper[4781]: I1202 10:33:15.277984 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8zbc\" (UniqueName: \"kubernetes.io/projected/bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23-kube-api-access-f8zbc\") on node \"crc\" DevicePath \"\"" Dec 02 10:33:15 crc kubenswrapper[4781]: I1202 10:33:15.480777 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tj5hr" Dec 02 10:33:15 crc kubenswrapper[4781]: I1202 10:33:15.480838 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tj5hr" event={"ID":"bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23","Type":"ContainerDied","Data":"36137a75d681f9ce7ae51afac45066f2737a6723b56cec0fb2585415184b0b0c"} Dec 02 10:33:15 crc kubenswrapper[4781]: I1202 10:33:15.480876 4781 scope.go:117] "RemoveContainer" containerID="35df8a95e8ef0b603d62421bbff2dea188cc292bad04348d5bc4f24bb19d09b8" Dec 02 10:33:15 crc kubenswrapper[4781]: I1202 10:33:15.480904 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mm5c8" podUID="3f3827f0-855c-4ca8-aa5e-6980428d0a0e" containerName="registry-server" containerID="cri-o://eec307586ca594010007c08abb0d52dcdaab60f241a9d1721a211da7a372c8e3" gracePeriod=2 Dec 02 10:33:15 crc kubenswrapper[4781]: I1202 10:33:15.513001 4781 scope.go:117] "RemoveContainer" containerID="f862231c94b7a9498bb12337ef2cb660a5248c5cdce2580313ae63e6e2fa192d" Dec 02 10:33:15 crc kubenswrapper[4781]: I1202 10:33:15.528881 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tj5hr"] Dec 02 10:33:15 crc kubenswrapper[4781]: I1202 10:33:15.534892 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tj5hr"] Dec 02 10:33:15 crc kubenswrapper[4781]: I1202 10:33:15.551180 4781 scope.go:117] "RemoveContainer" containerID="dcdce0f4fba44e43d3313c9840bf84b00dd18ba41b0e1a0a9d334d311dae5291" Dec 02 10:33:16 crc kubenswrapper[4781]: I1202 10:33:16.501109 4781 generic.go:334] "Generic (PLEG): container finished" podID="3f3827f0-855c-4ca8-aa5e-6980428d0a0e" containerID="eec307586ca594010007c08abb0d52dcdaab60f241a9d1721a211da7a372c8e3" exitCode=0 Dec 02 10:33:16 crc kubenswrapper[4781]: I1202 10:33:16.501187 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mm5c8" event={"ID":"3f3827f0-855c-4ca8-aa5e-6980428d0a0e","Type":"ContainerDied","Data":"eec307586ca594010007c08abb0d52dcdaab60f241a9d1721a211da7a372c8e3"} Dec 02 10:33:16 crc kubenswrapper[4781]: I1202 10:33:16.991384 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mm5c8" Dec 02 10:33:17 crc kubenswrapper[4781]: I1202 10:33:17.109456 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f3827f0-855c-4ca8-aa5e-6980428d0a0e-utilities\") pod \"3f3827f0-855c-4ca8-aa5e-6980428d0a0e\" (UID: \"3f3827f0-855c-4ca8-aa5e-6980428d0a0e\") " Dec 02 10:33:17 crc kubenswrapper[4781]: I1202 10:33:17.109521 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f3827f0-855c-4ca8-aa5e-6980428d0a0e-catalog-content\") pod \"3f3827f0-855c-4ca8-aa5e-6980428d0a0e\" (UID: \"3f3827f0-855c-4ca8-aa5e-6980428d0a0e\") " Dec 02 10:33:17 crc kubenswrapper[4781]: I1202 10:33:17.109571 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwltq\" (UniqueName: \"kubernetes.io/projected/3f3827f0-855c-4ca8-aa5e-6980428d0a0e-kube-api-access-pwltq\") pod \"3f3827f0-855c-4ca8-aa5e-6980428d0a0e\" (UID: \"3f3827f0-855c-4ca8-aa5e-6980428d0a0e\") " Dec 02 10:33:17 crc kubenswrapper[4781]: I1202 10:33:17.110818 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f3827f0-855c-4ca8-aa5e-6980428d0a0e-utilities" (OuterVolumeSpecName: "utilities") pod "3f3827f0-855c-4ca8-aa5e-6980428d0a0e" (UID: "3f3827f0-855c-4ca8-aa5e-6980428d0a0e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 10:33:17 crc kubenswrapper[4781]: I1202 10:33:17.117160 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f3827f0-855c-4ca8-aa5e-6980428d0a0e-kube-api-access-pwltq" (OuterVolumeSpecName: "kube-api-access-pwltq") pod "3f3827f0-855c-4ca8-aa5e-6980428d0a0e" (UID: "3f3827f0-855c-4ca8-aa5e-6980428d0a0e"). InnerVolumeSpecName "kube-api-access-pwltq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 10:33:17 crc kubenswrapper[4781]: I1202 10:33:17.212480 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f3827f0-855c-4ca8-aa5e-6980428d0a0e-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 10:33:17 crc kubenswrapper[4781]: I1202 10:33:17.212519 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwltq\" (UniqueName: \"kubernetes.io/projected/3f3827f0-855c-4ca8-aa5e-6980428d0a0e-kube-api-access-pwltq\") on node \"crc\" DevicePath \"\"" Dec 02 10:33:17 crc kubenswrapper[4781]: I1202 10:33:17.217220 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f3827f0-855c-4ca8-aa5e-6980428d0a0e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3f3827f0-855c-4ca8-aa5e-6980428d0a0e" (UID: "3f3827f0-855c-4ca8-aa5e-6980428d0a0e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 10:33:17 crc kubenswrapper[4781]: I1202 10:33:17.315873 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f3827f0-855c-4ca8-aa5e-6980428d0a0e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 10:33:17 crc kubenswrapper[4781]: I1202 10:33:17.517115 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23" path="/var/lib/kubelet/pods/bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23/volumes" Dec 02 10:33:17 crc kubenswrapper[4781]: I1202 10:33:17.519703 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mm5c8" event={"ID":"3f3827f0-855c-4ca8-aa5e-6980428d0a0e","Type":"ContainerDied","Data":"ee94ab4bccd933fb3a0930ada49cfac9ccb78889ea04d02b3c6a06bd86cbbc5b"} Dec 02 10:33:17 crc kubenswrapper[4781]: I1202 10:33:17.519758 4781 scope.go:117] "RemoveContainer" containerID="eec307586ca594010007c08abb0d52dcdaab60f241a9d1721a211da7a372c8e3" Dec 02 10:33:17 crc kubenswrapper[4781]: I1202 10:33:17.519878 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mm5c8" Dec 02 10:33:17 crc kubenswrapper[4781]: I1202 10:33:17.563090 4781 scope.go:117] "RemoveContainer" containerID="064a7ada73bf48c89698097e89613297327238c1e2321fb8829fafc36fd92979" Dec 02 10:33:17 crc kubenswrapper[4781]: I1202 10:33:17.580237 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mm5c8"] Dec 02 10:33:17 crc kubenswrapper[4781]: I1202 10:33:17.586735 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mm5c8"] Dec 02 10:33:17 crc kubenswrapper[4781]: I1202 10:33:17.594200 4781 scope.go:117] "RemoveContainer" containerID="9c5ec460cf71114de6e448e54f6e58a5d382894624d3420431bbb547cfdf35b0" Dec 02 10:33:19 crc kubenswrapper[4781]: I1202 10:33:19.512129 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f3827f0-855c-4ca8-aa5e-6980428d0a0e" path="/var/lib/kubelet/pods/3f3827f0-855c-4ca8-aa5e-6980428d0a0e/volumes" Dec 02 10:33:26 crc kubenswrapper[4781]: I1202 10:33:26.605667 4781 generic.go:334] "Generic (PLEG): container finished" podID="24a4c313-cdee-481f-8646-412d2361d1b8" containerID="dac398c50ac049691dc28dc23624df04aa7454b9dbcdec82a425b9c0d0d4c64c" exitCode=0 Dec 02 10:33:26 crc kubenswrapper[4781]: I1202 10:33:26.606209 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kzm46/crc-debug-vnvlh" event={"ID":"24a4c313-cdee-481f-8646-412d2361d1b8","Type":"ContainerDied","Data":"dac398c50ac049691dc28dc23624df04aa7454b9dbcdec82a425b9c0d0d4c64c"} Dec 02 10:33:27 crc kubenswrapper[4781]: I1202 10:33:27.751063 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kzm46/crc-debug-vnvlh" Dec 02 10:33:27 crc kubenswrapper[4781]: I1202 10:33:27.794228 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kzm46/crc-debug-vnvlh"] Dec 02 10:33:27 crc kubenswrapper[4781]: I1202 10:33:27.802252 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kzm46/crc-debug-vnvlh"] Dec 02 10:33:27 crc kubenswrapper[4781]: I1202 10:33:27.810020 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbvld\" (UniqueName: \"kubernetes.io/projected/24a4c313-cdee-481f-8646-412d2361d1b8-kube-api-access-tbvld\") pod \"24a4c313-cdee-481f-8646-412d2361d1b8\" (UID: \"24a4c313-cdee-481f-8646-412d2361d1b8\") " Dec 02 10:33:27 crc kubenswrapper[4781]: I1202 10:33:27.810186 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/24a4c313-cdee-481f-8646-412d2361d1b8-host\") pod \"24a4c313-cdee-481f-8646-412d2361d1b8\" (UID: \"24a4c313-cdee-481f-8646-412d2361d1b8\") " Dec 02 10:33:27 crc kubenswrapper[4781]: I1202 10:33:27.810350 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24a4c313-cdee-481f-8646-412d2361d1b8-host" (OuterVolumeSpecName: "host") pod "24a4c313-cdee-481f-8646-412d2361d1b8" (UID: "24a4c313-cdee-481f-8646-412d2361d1b8"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 10:33:27 crc kubenswrapper[4781]: I1202 10:33:27.810749 4781 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/24a4c313-cdee-481f-8646-412d2361d1b8-host\") on node \"crc\" DevicePath \"\"" Dec 02 10:33:27 crc kubenswrapper[4781]: I1202 10:33:27.817133 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24a4c313-cdee-481f-8646-412d2361d1b8-kube-api-access-tbvld" (OuterVolumeSpecName: "kube-api-access-tbvld") pod "24a4c313-cdee-481f-8646-412d2361d1b8" (UID: "24a4c313-cdee-481f-8646-412d2361d1b8"). InnerVolumeSpecName "kube-api-access-tbvld". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 10:33:27 crc kubenswrapper[4781]: I1202 10:33:27.912633 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbvld\" (UniqueName: \"kubernetes.io/projected/24a4c313-cdee-481f-8646-412d2361d1b8-kube-api-access-tbvld\") on node \"crc\" DevicePath \"\"" Dec 02 10:33:28 crc kubenswrapper[4781]: I1202 10:33:28.624052 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3a522ced8479b766185d8d0fd16a17e1a0fc96166da930c852bb3264c391df9" Dec 02 10:33:28 crc kubenswrapper[4781]: I1202 10:33:28.624139 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kzm46/crc-debug-vnvlh" Dec 02 10:33:28 crc kubenswrapper[4781]: I1202 10:33:28.940751 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kzm46/crc-debug-cj77v"] Dec 02 10:33:28 crc kubenswrapper[4781]: E1202 10:33:28.941110 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24a4c313-cdee-481f-8646-412d2361d1b8" containerName="container-00" Dec 02 10:33:28 crc kubenswrapper[4781]: I1202 10:33:28.941122 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="24a4c313-cdee-481f-8646-412d2361d1b8" containerName="container-00" Dec 02 10:33:28 crc kubenswrapper[4781]: E1202 10:33:28.941132 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f3827f0-855c-4ca8-aa5e-6980428d0a0e" containerName="extract-content" Dec 02 10:33:28 crc kubenswrapper[4781]: I1202 10:33:28.941138 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f3827f0-855c-4ca8-aa5e-6980428d0a0e" containerName="extract-content" Dec 02 10:33:28 crc kubenswrapper[4781]: E1202 10:33:28.941160 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23" containerName="registry-server" Dec 02 10:33:28 crc kubenswrapper[4781]: I1202 10:33:28.941166 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23" containerName="registry-server" Dec 02 10:33:28 crc kubenswrapper[4781]: E1202 10:33:28.941187 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23" containerName="extract-utilities" Dec 02 10:33:28 crc kubenswrapper[4781]: I1202 10:33:28.941194 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23" containerName="extract-utilities" Dec 02 10:33:28 crc kubenswrapper[4781]: E1202 10:33:28.941207 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f3827f0-855c-4ca8-aa5e-6980428d0a0e" containerName="registry-server" Dec 02 10:33:28 crc kubenswrapper[4781]: I1202 10:33:28.941212 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f3827f0-855c-4ca8-aa5e-6980428d0a0e" containerName="registry-server" Dec 02 10:33:28 crc kubenswrapper[4781]: E1202 10:33:28.941220 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23" containerName="extract-content" Dec 02 10:33:28 crc kubenswrapper[4781]: I1202 10:33:28.941225 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23" containerName="extract-content" Dec 02 10:33:28 crc kubenswrapper[4781]: E1202 10:33:28.941244 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f3827f0-855c-4ca8-aa5e-6980428d0a0e" containerName="extract-utilities" Dec 02 10:33:28 crc kubenswrapper[4781]: I1202 10:33:28.941250 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f3827f0-855c-4ca8-aa5e-6980428d0a0e" containerName="extract-utilities" Dec 02 10:33:28 crc kubenswrapper[4781]: I1202 10:33:28.941446 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="24a4c313-cdee-481f-8646-412d2361d1b8" containerName="container-00" Dec 02 10:33:28 crc kubenswrapper[4781]: I1202 10:33:28.941459 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfe1f97d-9ffd-4ea9-95b5-7a0cbfcadf23" containerName="registry-server" Dec 02 10:33:28 crc kubenswrapper[4781]: I1202 10:33:28.941477 4781 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3f3827f0-855c-4ca8-aa5e-6980428d0a0e" containerName="registry-server" Dec 02 10:33:28 crc kubenswrapper[4781]: I1202 10:33:28.942094 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kzm46/crc-debug-cj77v" Dec 02 10:33:29 crc kubenswrapper[4781]: I1202 10:33:29.051050 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6b2677e5-4887-4d5d-ab7f-82f8a1a0e638-host\") pod \"crc-debug-cj77v\" (UID: \"6b2677e5-4887-4d5d-ab7f-82f8a1a0e638\") " pod="openshift-must-gather-kzm46/crc-debug-cj77v" Dec 02 10:33:29 crc kubenswrapper[4781]: I1202 10:33:29.051104 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz7hh\" (UniqueName: \"kubernetes.io/projected/6b2677e5-4887-4d5d-ab7f-82f8a1a0e638-kube-api-access-vz7hh\") pod \"crc-debug-cj77v\" (UID: \"6b2677e5-4887-4d5d-ab7f-82f8a1a0e638\") " pod="openshift-must-gather-kzm46/crc-debug-cj77v" Dec 02 10:33:29 crc kubenswrapper[4781]: I1202 10:33:29.152800 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6b2677e5-4887-4d5d-ab7f-82f8a1a0e638-host\") pod \"crc-debug-cj77v\" (UID: \"6b2677e5-4887-4d5d-ab7f-82f8a1a0e638\") " pod="openshift-must-gather-kzm46/crc-debug-cj77v" Dec 02 10:33:29 crc kubenswrapper[4781]: I1202 10:33:29.152892 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vz7hh\" (UniqueName: \"kubernetes.io/projected/6b2677e5-4887-4d5d-ab7f-82f8a1a0e638-kube-api-access-vz7hh\") pod \"crc-debug-cj77v\" (UID: \"6b2677e5-4887-4d5d-ab7f-82f8a1a0e638\") " pod="openshift-must-gather-kzm46/crc-debug-cj77v" Dec 02 10:33:29 crc kubenswrapper[4781]: I1202 10:33:29.152996 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6b2677e5-4887-4d5d-ab7f-82f8a1a0e638-host\") pod \"crc-debug-cj77v\" (UID: \"6b2677e5-4887-4d5d-ab7f-82f8a1a0e638\") " pod="openshift-must-gather-kzm46/crc-debug-cj77v" Dec 02 10:33:29 crc kubenswrapper[4781]: I1202 10:33:29.171892 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz7hh\" (UniqueName: \"kubernetes.io/projected/6b2677e5-4887-4d5d-ab7f-82f8a1a0e638-kube-api-access-vz7hh\") pod \"crc-debug-cj77v\" (UID: \"6b2677e5-4887-4d5d-ab7f-82f8a1a0e638\") " pod="openshift-must-gather-kzm46/crc-debug-cj77v" Dec 02 10:33:29 crc kubenswrapper[4781]: I1202 10:33:29.265350 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kzm46/crc-debug-cj77v" Dec 02 10:33:29 crc kubenswrapper[4781]: I1202 10:33:29.512477 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24a4c313-cdee-481f-8646-412d2361d1b8" path="/var/lib/kubelet/pods/24a4c313-cdee-481f-8646-412d2361d1b8/volumes" Dec 02 10:33:29 crc kubenswrapper[4781]: I1202 10:33:29.633176 4781 generic.go:334] "Generic (PLEG): container finished" podID="6b2677e5-4887-4d5d-ab7f-82f8a1a0e638" containerID="993eee06b8d87835b48c7f84b5e19680ba6927b62de860bcee0dd80f61ab1526" exitCode=0 Dec 02 10:33:29 crc kubenswrapper[4781]: I1202 10:33:29.633362 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kzm46/crc-debug-cj77v" event={"ID":"6b2677e5-4887-4d5d-ab7f-82f8a1a0e638","Type":"ContainerDied","Data":"993eee06b8d87835b48c7f84b5e19680ba6927b62de860bcee0dd80f61ab1526"} Dec 02 10:33:29 crc kubenswrapper[4781]: I1202 10:33:29.633486 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kzm46/crc-debug-cj77v" event={"ID":"6b2677e5-4887-4d5d-ab7f-82f8a1a0e638","Type":"ContainerStarted","Data":"7aea6484658beaa9207ee371ca01eec092f96a5afb4a7ab8ac7901f9b61d8516"} Dec 02 10:33:30 crc kubenswrapper[4781]: I1202 10:33:30.095228 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kzm46/crc-debug-cj77v"] Dec 02 10:33:30 crc kubenswrapper[4781]: I1202 10:33:30.105066 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kzm46/crc-debug-cj77v"] Dec 02 10:33:30 crc kubenswrapper[4781]: I1202 10:33:30.411691 4781 patch_prober.go:28] interesting pod/machine-config-daemon-pzntm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 10:33:30 crc kubenswrapper[4781]: I1202 10:33:30.411752 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 10:33:30 crc kubenswrapper[4781]: I1202 10:33:30.735340 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kzm46/crc-debug-cj77v" Dec 02 10:33:30 crc kubenswrapper[4781]: I1202 10:33:30.782194 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vz7hh\" (UniqueName: \"kubernetes.io/projected/6b2677e5-4887-4d5d-ab7f-82f8a1a0e638-kube-api-access-vz7hh\") pod \"6b2677e5-4887-4d5d-ab7f-82f8a1a0e638\" (UID: \"6b2677e5-4887-4d5d-ab7f-82f8a1a0e638\") " Dec 02 10:33:30 crc kubenswrapper[4781]: I1202 10:33:30.782290 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6b2677e5-4887-4d5d-ab7f-82f8a1a0e638-host\") pod \"6b2677e5-4887-4d5d-ab7f-82f8a1a0e638\" (UID: \"6b2677e5-4887-4d5d-ab7f-82f8a1a0e638\") " Dec 02 10:33:30 crc kubenswrapper[4781]: I1202 10:33:30.782494 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6b2677e5-4887-4d5d-ab7f-82f8a1a0e638-host" (OuterVolumeSpecName: "host") pod "6b2677e5-4887-4d5d-ab7f-82f8a1a0e638" (UID: "6b2677e5-4887-4d5d-ab7f-82f8a1a0e638"). 
InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 10:33:30 crc kubenswrapper[4781]: I1202 10:33:30.782977 4781 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6b2677e5-4887-4d5d-ab7f-82f8a1a0e638-host\") on node \"crc\" DevicePath \"\"" Dec 02 10:33:30 crc kubenswrapper[4781]: I1202 10:33:30.803920 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b2677e5-4887-4d5d-ab7f-82f8a1a0e638-kube-api-access-vz7hh" (OuterVolumeSpecName: "kube-api-access-vz7hh") pod "6b2677e5-4887-4d5d-ab7f-82f8a1a0e638" (UID: "6b2677e5-4887-4d5d-ab7f-82f8a1a0e638"). InnerVolumeSpecName "kube-api-access-vz7hh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 10:33:30 crc kubenswrapper[4781]: I1202 10:33:30.884765 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vz7hh\" (UniqueName: \"kubernetes.io/projected/6b2677e5-4887-4d5d-ab7f-82f8a1a0e638-kube-api-access-vz7hh\") on node \"crc\" DevicePath \"\"" Dec 02 10:33:31 crc kubenswrapper[4781]: I1202 10:33:31.343598 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kzm46/crc-debug-f8rxv"] Dec 02 10:33:31 crc kubenswrapper[4781]: E1202 10:33:31.343946 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b2677e5-4887-4d5d-ab7f-82f8a1a0e638" containerName="container-00" Dec 02 10:33:31 crc kubenswrapper[4781]: I1202 10:33:31.343957 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b2677e5-4887-4d5d-ab7f-82f8a1a0e638" containerName="container-00" Dec 02 10:33:31 crc kubenswrapper[4781]: I1202 10:33:31.344170 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b2677e5-4887-4d5d-ab7f-82f8a1a0e638" containerName="container-00" Dec 02 10:33:31 crc kubenswrapper[4781]: I1202 10:33:31.344705 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kzm46/crc-debug-f8rxv" Dec 02 10:33:31 crc kubenswrapper[4781]: I1202 10:33:31.392393 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pblzf\" (UniqueName: \"kubernetes.io/projected/c0134d7a-ea89-4e89-9765-4cf24591ef1a-kube-api-access-pblzf\") pod \"crc-debug-f8rxv\" (UID: \"c0134d7a-ea89-4e89-9765-4cf24591ef1a\") " pod="openshift-must-gather-kzm46/crc-debug-f8rxv" Dec 02 10:33:31 crc kubenswrapper[4781]: I1202 10:33:31.392533 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c0134d7a-ea89-4e89-9765-4cf24591ef1a-host\") pod \"crc-debug-f8rxv\" (UID: \"c0134d7a-ea89-4e89-9765-4cf24591ef1a\") " pod="openshift-must-gather-kzm46/crc-debug-f8rxv" Dec 02 10:33:31 crc kubenswrapper[4781]: I1202 10:33:31.494159 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pblzf\" (UniqueName: \"kubernetes.io/projected/c0134d7a-ea89-4e89-9765-4cf24591ef1a-kube-api-access-pblzf\") pod \"crc-debug-f8rxv\" (UID: \"c0134d7a-ea89-4e89-9765-4cf24591ef1a\") " pod="openshift-must-gather-kzm46/crc-debug-f8rxv" Dec 02 10:33:31 crc kubenswrapper[4781]: I1202 10:33:31.494358 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c0134d7a-ea89-4e89-9765-4cf24591ef1a-host\") pod \"crc-debug-f8rxv\" (UID: \"c0134d7a-ea89-4e89-9765-4cf24591ef1a\") " pod="openshift-must-gather-kzm46/crc-debug-f8rxv" Dec 02 10:33:31 crc kubenswrapper[4781]: I1202 10:33:31.494474 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c0134d7a-ea89-4e89-9765-4cf24591ef1a-host\") pod \"crc-debug-f8rxv\" (UID: \"c0134d7a-ea89-4e89-9765-4cf24591ef1a\") " pod="openshift-must-gather-kzm46/crc-debug-f8rxv" Dec 02 10:33:31 crc kubenswrapper[4781]: I1202 10:33:31.513657 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b2677e5-4887-4d5d-ab7f-82f8a1a0e638" path="/var/lib/kubelet/pods/6b2677e5-4887-4d5d-ab7f-82f8a1a0e638/volumes" Dec 02 10:33:31 crc kubenswrapper[4781]: I1202 10:33:31.515126 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pblzf\" (UniqueName: \"kubernetes.io/projected/c0134d7a-ea89-4e89-9765-4cf24591ef1a-kube-api-access-pblzf\") pod \"crc-debug-f8rxv\" (UID: \"c0134d7a-ea89-4e89-9765-4cf24591ef1a\") " pod="openshift-must-gather-kzm46/crc-debug-f8rxv" Dec 02 10:33:31 crc kubenswrapper[4781]: I1202 10:33:31.650057 4781 scope.go:117] "RemoveContainer" containerID="993eee06b8d87835b48c7f84b5e19680ba6927b62de860bcee0dd80f61ab1526" Dec 02 10:33:31 crc kubenswrapper[4781]: I1202 10:33:31.650183 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kzm46/crc-debug-cj77v" Dec 02 10:33:31 crc kubenswrapper[4781]: I1202 10:33:31.660901 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kzm46/crc-debug-f8rxv" Dec 02 10:33:32 crc kubenswrapper[4781]: I1202 10:33:32.672337 4781 generic.go:334] "Generic (PLEG): container finished" podID="c0134d7a-ea89-4e89-9765-4cf24591ef1a" containerID="e33023d3e542947612200c96c201a120f0ad5ec7c45395a898e52b864aa91b44" exitCode=0 Dec 02 10:33:32 crc kubenswrapper[4781]: I1202 10:33:32.672409 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kzm46/crc-debug-f8rxv" event={"ID":"c0134d7a-ea89-4e89-9765-4cf24591ef1a","Type":"ContainerDied","Data":"e33023d3e542947612200c96c201a120f0ad5ec7c45395a898e52b864aa91b44"} Dec 02 10:33:32 crc kubenswrapper[4781]: I1202 10:33:32.672861 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kzm46/crc-debug-f8rxv" event={"ID":"c0134d7a-ea89-4e89-9765-4cf24591ef1a","Type":"ContainerStarted","Data":"966456d26c8345fd19d06b040f61a40275d502d5be8726e92984e1b3eade7fc3"} Dec 02 10:33:33 crc kubenswrapper[4781]: I1202 10:33:33.061332 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kzm46/crc-debug-f8rxv"] Dec 02 10:33:33 crc kubenswrapper[4781]: I1202 10:33:33.069811 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kzm46/crc-debug-f8rxv"] Dec 02 10:33:33 crc kubenswrapper[4781]: I1202 10:33:33.783300 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kzm46/crc-debug-f8rxv" Dec 02 10:33:33 crc kubenswrapper[4781]: I1202 10:33:33.837504 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c0134d7a-ea89-4e89-9765-4cf24591ef1a-host\") pod \"c0134d7a-ea89-4e89-9765-4cf24591ef1a\" (UID: \"c0134d7a-ea89-4e89-9765-4cf24591ef1a\") " Dec 02 10:33:33 crc kubenswrapper[4781]: I1202 10:33:33.837609 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c0134d7a-ea89-4e89-9765-4cf24591ef1a-host" (OuterVolumeSpecName: "host") pod "c0134d7a-ea89-4e89-9765-4cf24591ef1a" (UID: "c0134d7a-ea89-4e89-9765-4cf24591ef1a"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 10:33:33 crc kubenswrapper[4781]: I1202 10:33:33.837677 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pblzf\" (UniqueName: \"kubernetes.io/projected/c0134d7a-ea89-4e89-9765-4cf24591ef1a-kube-api-access-pblzf\") pod \"c0134d7a-ea89-4e89-9765-4cf24591ef1a\" (UID: \"c0134d7a-ea89-4e89-9765-4cf24591ef1a\") " Dec 02 10:33:33 crc kubenswrapper[4781]: I1202 10:33:33.838234 4781 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c0134d7a-ea89-4e89-9765-4cf24591ef1a-host\") on node \"crc\" DevicePath \"\"" Dec 02 10:33:33 crc kubenswrapper[4781]: I1202 10:33:33.844737 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0134d7a-ea89-4e89-9765-4cf24591ef1a-kube-api-access-pblzf" (OuterVolumeSpecName: "kube-api-access-pblzf") pod "c0134d7a-ea89-4e89-9765-4cf24591ef1a" (UID: "c0134d7a-ea89-4e89-9765-4cf24591ef1a"). InnerVolumeSpecName "kube-api-access-pblzf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 10:33:33 crc kubenswrapper[4781]: I1202 10:33:33.940405 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pblzf\" (UniqueName: \"kubernetes.io/projected/c0134d7a-ea89-4e89-9765-4cf24591ef1a-kube-api-access-pblzf\") on node \"crc\" DevicePath \"\"" Dec 02 10:33:34 crc kubenswrapper[4781]: I1202 10:33:34.707233 4781 scope.go:117] "RemoveContainer" containerID="e33023d3e542947612200c96c201a120f0ad5ec7c45395a898e52b864aa91b44" Dec 02 10:33:34 crc kubenswrapper[4781]: I1202 10:33:34.707307 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kzm46/crc-debug-f8rxv" Dec 02 10:33:35 crc kubenswrapper[4781]: I1202 10:33:35.510141 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0134d7a-ea89-4e89-9765-4cf24591ef1a" path="/var/lib/kubelet/pods/c0134d7a-ea89-4e89-9765-4cf24591ef1a/volumes" Dec 02 10:33:55 crc kubenswrapper[4781]: I1202 10:33:55.933353 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-f677cfc78-nzcxt_17137e34-c042-4c3b-b11b-3e743e2a00b5/barbican-api/0.log" Dec 02 10:33:56 crc kubenswrapper[4781]: I1202 10:33:56.130035 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-f677cfc78-nzcxt_17137e34-c042-4c3b-b11b-3e743e2a00b5/barbican-api-log/0.log" Dec 02 10:33:56 crc kubenswrapper[4781]: I1202 10:33:56.185376 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5d776c757d-qgkmw_f86e03c1-1ef5-4d0a-9875-7bd123ebe7a1/barbican-keystone-listener/0.log" Dec 02 10:33:56 crc kubenswrapper[4781]: I1202 10:33:56.211093 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5d776c757d-qgkmw_f86e03c1-1ef5-4d0a-9875-7bd123ebe7a1/barbican-keystone-listener-log/0.log" Dec 02 10:33:56 crc kubenswrapper[4781]: I1202 10:33:56.380946 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-65bbfbf867-4wxpf_bac97a41-a2f4-46ee-b48c-216aeee03abc/barbican-worker/0.log" Dec 02 10:33:56 crc kubenswrapper[4781]: I1202 10:33:56.384301 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-65bbfbf867-4wxpf_bac97a41-a2f4-46ee-b48c-216aeee03abc/barbican-worker-log/0.log" Dec 02 10:33:56 crc kubenswrapper[4781]: I1202 10:33:56.589243 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-k8ch4_ceabe2f6-ee27-456b-9031-9ebc39e032eb/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 10:33:56 crc kubenswrapper[4781]: I1202 10:33:56.630053 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_bbf6f862-1ee5-4bc3-83e5-71c1f72d526c/ceilometer-central-agent/0.log" Dec 02 10:33:56 crc kubenswrapper[4781]: I1202 10:33:56.692904 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_bbf6f862-1ee5-4bc3-83e5-71c1f72d526c/ceilometer-notification-agent/0.log" Dec 02 10:33:56 crc kubenswrapper[4781]: I1202 10:33:56.824571 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_bbf6f862-1ee5-4bc3-83e5-71c1f72d526c/proxy-httpd/0.log" Dec 02 10:33:56 crc kubenswrapper[4781]: I1202 10:33:56.848426 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_bbf6f862-1ee5-4bc3-83e5-71c1f72d526c/sg-core/0.log" Dec 02 10:33:56 crc kubenswrapper[4781]: 
I1202 10:33:56.922808 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_4e2d3890-4772-476e-9850-fdb32111b87a/cinder-api/0.log" Dec 02 10:33:56 crc kubenswrapper[4781]: I1202 10:33:56.978039 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_4e2d3890-4772-476e-9850-fdb32111b87a/cinder-api-log/0.log" Dec 02 10:33:57 crc kubenswrapper[4781]: I1202 10:33:57.107534 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_8457fc90-04ad-45ed-b898-ddf4d7b645b4/cinder-scheduler/0.log" Dec 02 10:33:57 crc kubenswrapper[4781]: I1202 10:33:57.167948 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_8457fc90-04ad-45ed-b898-ddf4d7b645b4/probe/0.log" Dec 02 10:33:57 crc kubenswrapper[4781]: I1202 10:33:57.279573 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-95445_321f18fc-759a-4eb7-bbb0-f230b7002932/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 10:33:57 crc kubenswrapper[4781]: I1202 10:33:57.378705 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-p6f2z_0fcef48b-9cfd-4f26-9964-3a083b035119/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 10:33:57 crc kubenswrapper[4781]: I1202 10:33:57.470675 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-x5657_6d69f627-f361-4060-91c8-dc5763a00f16/init/0.log" Dec 02 10:33:57 crc kubenswrapper[4781]: I1202 10:33:57.671035 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-x5657_6d69f627-f361-4060-91c8-dc5763a00f16/init/0.log" Dec 02 10:33:57 crc kubenswrapper[4781]: I1202 10:33:57.688891 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-x5657_6d69f627-f361-4060-91c8-dc5763a00f16/dnsmasq-dns/0.log" Dec 02 10:33:57 crc kubenswrapper[4781]: I1202 10:33:57.697264 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-wbgtf_8804a3f3-f23a-4a85-8e45-9f92f90c5e9b/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 10:33:57 crc kubenswrapper[4781]: I1202 10:33:57.872702 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34/glance-log/0.log" Dec 02 10:33:57 crc kubenswrapper[4781]: I1202 10:33:57.929961 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_2af07abf-a3b3-4b3a-b1d3-627ed1c2ef34/glance-httpd/0.log" Dec 02 10:33:58 crc kubenswrapper[4781]: I1202 10:33:58.079298 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_5e057772-e9bc-4fce-90c9-be91978362fe/glance-log/0.log" Dec 02 10:33:58 crc kubenswrapper[4781]: I1202 10:33:58.084517 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_5e057772-e9bc-4fce-90c9-be91978362fe/glance-httpd/0.log" Dec 02 10:33:58 crc kubenswrapper[4781]: I1202 10:33:58.279739 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6856678494-4cprv_226317c7-a6f4-43c5-a3df-c9cb18b3afa5/horizon/0.log" Dec 02 10:33:58 crc kubenswrapper[4781]: I1202 10:33:58.401981 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-ptd7j_82e18c11-6a85-45d3-8794-c7d7d02aaa2d/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 10:33:58 crc kubenswrapper[4781]: I1202 10:33:58.553181 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-2p8f8_ea0d0dc8-72c3-42de-92b5-a98ad0417f6d/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 10:33:58 crc kubenswrapper[4781]: I1202 10:33:58.611125 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6856678494-4cprv_226317c7-a6f4-43c5-a3df-c9cb18b3afa5/horizon-log/0.log" Dec 02 10:33:58 crc kubenswrapper[4781]: I1202 10:33:58.793521 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cbfc4ddfb-kljg5_e2f1c0db-2cf8-4e49-b1cf-8cb27f997927/keystone-api/0.log" Dec 02 10:33:58 crc kubenswrapper[4781]: I1202 10:33:58.813883 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29411161-pl6b8_da3b7144-110c-45af-a358-804809a89670/keystone-cron/0.log" Dec 02 10:33:58 crc kubenswrapper[4781]: I1202 10:33:58.910892 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_2546f353-d520-44ea-8040-c41223665f1f/kube-state-metrics/0.log" Dec 02 10:33:59 crc kubenswrapper[4781]: I1202 10:33:59.023105 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-s8tx9_8db18e94-bcea-4e3a-8759-65fb8084cd43/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 10:33:59 crc kubenswrapper[4781]: I1202 10:33:59.285964 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-9cbdc4d89-pkh64_f074040a-1272-492e-b149-3a0a6cc89efd/neutron-httpd/0.log" Dec 02 10:33:59 crc kubenswrapper[4781]: I1202 10:33:59.346850 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-9cbdc4d89-pkh64_f074040a-1272-492e-b149-3a0a6cc89efd/neutron-api/0.log" Dec 02 10:33:59 crc kubenswrapper[4781]: I1202 10:33:59.394384 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-5ktkd_c2171979-7791-4850-a4cf-99ac7e62d054/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 10:33:59 crc kubenswrapper[4781]: I1202 10:33:59.917035 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_18fa078a-6d45-40cf-a39e-139d84f86f76/nova-api-log/0.log" Dec 02 10:34:00 crc kubenswrapper[4781]: I1202 10:34:00.029857 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_56b3a068-08e6-4567-b09b-4050ad8f1a65/nova-cell0-conductor-conductor/0.log" Dec 02 10:34:00 crc kubenswrapper[4781]: I1202 10:34:00.356995 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_cde74fe6-799b-4da8-974d-3fefd2af69aa/nova-cell1-conductor-conductor/0.log" Dec 02 10:34:00 crc kubenswrapper[4781]: I1202 10:34:00.375316 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_18fa078a-6d45-40cf-a39e-139d84f86f76/nova-api-api/0.log" Dec 02 10:34:00 crc kubenswrapper[4781]: I1202 10:34:00.411615 4781 patch_prober.go:28] interesting pod/machine-config-daemon-pzntm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Dec 02 10:34:00 crc kubenswrapper[4781]: I1202 10:34:00.411688 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 10:34:00 crc kubenswrapper[4781]: I1202 10:34:00.411737 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" Dec 02 10:34:00 crc kubenswrapper[4781]: I1202 10:34:00.412517 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b2e57c7f5cab2071cc358e2287628c7d69746430f6e9ccee874285f870b0b84d"} pod="openshift-machine-config-operator/machine-config-daemon-pzntm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 10:34:00 crc kubenswrapper[4781]: I1202 10:34:00.412589 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerName="machine-config-daemon" containerID="cri-o://b2e57c7f5cab2071cc358e2287628c7d69746430f6e9ccee874285f870b0b84d" gracePeriod=600 Dec 02 10:34:00 crc kubenswrapper[4781]: I1202 10:34:00.434491 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_e9f84c1d-0c33-4c5a-8e53-a5cf635e4a68/nova-cell1-novncproxy-novncproxy/0.log" Dec 02 10:34:00 crc kubenswrapper[4781]: E1202 10:34:00.612998 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:34:00 crc kubenswrapper[4781]: I1202 10:34:00.655812 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_b701c328-f693-4c11-96a7-5ff5b9bff2c1/nova-metadata-log/0.log" Dec 02 10:34:00 crc kubenswrapper[4781]: I1202 10:34:00.667436 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-g88rg_dacd1d88-ff6e-4719-a24f-4feb0559f463/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 10:34:00 crc kubenswrapper[4781]: I1202 10:34:00.948800 4781 generic.go:334] "Generic (PLEG): container finished" podID="e10258da-dad3-4df8-82c2-9d9438493a3d" containerID="b2e57c7f5cab2071cc358e2287628c7d69746430f6e9ccee874285f870b0b84d" exitCode=0 Dec 02 10:34:00 crc kubenswrapper[4781]: I1202 10:34:00.948852 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" event={"ID":"e10258da-dad3-4df8-82c2-9d9438493a3d","Type":"ContainerDied","Data":"b2e57c7f5cab2071cc358e2287628c7d69746430f6e9ccee874285f870b0b84d"} Dec 02 10:34:00 crc kubenswrapper[4781]: I1202 10:34:00.948885 4781 scope.go:117] "RemoveContainer" containerID="11b83b3689380a7a0a856185b9c3f2221828cd55a83586474c6d42332e35e9f8" Dec 02 10:34:00 crc kubenswrapper[4781]: I1202 10:34:00.949496 4781 scope.go:117] "RemoveContainer" 
containerID="b2e57c7f5cab2071cc358e2287628c7d69746430f6e9ccee874285f870b0b84d" Dec 02 10:34:00 crc kubenswrapper[4781]: E1202 10:34:00.949741 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:34:01 crc kubenswrapper[4781]: I1202 10:34:01.101287 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_05cec06e-b8e4-487d-b4f8-0691aaf1f997/mysql-bootstrap/0.log" Dec 02 10:34:01 crc kubenswrapper[4781]: I1202 10:34:01.133763 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_a4460f62-792a-4297-b92b-3fe1081bc006/nova-scheduler-scheduler/0.log" Dec 02 10:34:01 crc kubenswrapper[4781]: I1202 10:34:01.290609 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_05cec06e-b8e4-487d-b4f8-0691aaf1f997/mysql-bootstrap/0.log" Dec 02 10:34:01 crc kubenswrapper[4781]: I1202 10:34:01.318132 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_05cec06e-b8e4-487d-b4f8-0691aaf1f997/galera/0.log" Dec 02 10:34:01 crc kubenswrapper[4781]: I1202 10:34:01.963892 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_39428d69-6b86-41ea-8c4f-5532a5283a91/mysql-bootstrap/0.log" Dec 02 10:34:02 crc kubenswrapper[4781]: I1202 10:34:02.074853 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_b701c328-f693-4c11-96a7-5ff5b9bff2c1/nova-metadata-metadata/0.log" Dec 02 10:34:02 crc kubenswrapper[4781]: I1202 10:34:02.177231 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_39428d69-6b86-41ea-8c4f-5532a5283a91/mysql-bootstrap/0.log" Dec 02 10:34:02 crc kubenswrapper[4781]: I1202 10:34:02.199830 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_39428d69-6b86-41ea-8c4f-5532a5283a91/galera/0.log" Dec 02 10:34:02 crc kubenswrapper[4781]: I1202 10:34:02.328831 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_7a424309-12e4-42f9-ba35-d61f1f6c7b44/openstackclient/0.log" Dec 02 10:34:02 crc kubenswrapper[4781]: I1202 10:34:02.444969 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-7c2h7_7bc2040b-e9d5-4a1a-9d46-6b50dbc71061/openstack-network-exporter/0.log" Dec 02 10:34:02 crc kubenswrapper[4781]: I1202 10:34:02.529611 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-mmhrv_7cf924dd-8243-4263-85a2-68ac01fd5346/ovn-controller/0.log" Dec 02 10:34:02 crc kubenswrapper[4781]: I1202 10:34:02.642072 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rh5wv_77d1da39-d5b0-4ec8-8196-f4f1025291f8/ovsdb-server-init/0.log" Dec 02 10:34:02 crc kubenswrapper[4781]: I1202 10:34:02.860505 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rh5wv_77d1da39-d5b0-4ec8-8196-f4f1025291f8/ovsdb-server-init/0.log" Dec 02 10:34:02 crc kubenswrapper[4781]: I1202 10:34:02.869801 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-rh5wv_77d1da39-d5b0-4ec8-8196-f4f1025291f8/ovs-vswitchd/0.log" Dec 02 10:34:02 crc kubenswrapper[4781]: I1202 10:34:02.910719 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rh5wv_77d1da39-d5b0-4ec8-8196-f4f1025291f8/ovsdb-server/0.log" Dec 02 10:34:03 crc kubenswrapper[4781]: I1202 10:34:03.625708 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-nf4nh_809035c6-50b2-4492-898e-1f2917e62a5c/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 10:34:03 crc kubenswrapper[4781]: I1202 10:34:03.631431 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_1d0908af-30ab-4017-8911-b10c3742336e/openstack-network-exporter/0.log" Dec 02 10:34:03 crc kubenswrapper[4781]: I1202 10:34:03.673506 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_1d0908af-30ab-4017-8911-b10c3742336e/ovn-northd/0.log" Dec 02 10:34:03 crc kubenswrapper[4781]: I1202 10:34:03.876583 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_6dea0cb6-7707-46ba-bd47-89ce579fdad9/openstack-network-exporter/0.log" Dec 02 10:34:03 crc kubenswrapper[4781]: I1202 10:34:03.896111 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_6dea0cb6-7707-46ba-bd47-89ce579fdad9/ovsdbserver-nb/0.log" Dec 02 10:34:04 crc kubenswrapper[4781]: I1202 10:34:04.068784 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_8c135676-b0d9-469f-82b2-59483c9712f1/openstack-network-exporter/0.log" Dec 02 10:34:04 crc kubenswrapper[4781]: I1202 10:34:04.077140 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_8c135676-b0d9-469f-82b2-59483c9712f1/ovsdbserver-sb/0.log" Dec 02 10:34:04 crc kubenswrapper[4781]: I1202 10:34:04.268307 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-69459794d8-ph7dr_abd38c16-6eab-4f3e-9c4d-294b240fa154/placement-api/0.log" Dec 02 10:34:04 crc kubenswrapper[4781]: I1202 10:34:04.370186 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-69459794d8-ph7dr_abd38c16-6eab-4f3e-9c4d-294b240fa154/placement-log/0.log" Dec 02 10:34:04 crc kubenswrapper[4781]: I1202 10:34:04.401045 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6f7f1b54-7b32-4063-af85-f97785416d26/setup-container/0.log" Dec 02 10:34:04 crc kubenswrapper[4781]: I1202 10:34:04.693640 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6f7f1b54-7b32-4063-af85-f97785416d26/setup-container/0.log" Dec 02 10:34:04 crc kubenswrapper[4781]: I1202 10:34:04.732756 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6f7f1b54-7b32-4063-af85-f97785416d26/rabbitmq/0.log" Dec 02 10:34:04 crc kubenswrapper[4781]: I1202 10:34:04.750032 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4c72516d-29b4-4932-8a47-8838d686b176/setup-container/0.log" Dec 02 10:34:04 crc kubenswrapper[4781]: I1202 10:34:04.916790 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4c72516d-29b4-4932-8a47-8838d686b176/setup-container/0.log" Dec 02 10:34:04 crc kubenswrapper[4781]: I1202 10:34:04.953113 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-server-0_4c72516d-29b4-4932-8a47-8838d686b176/rabbitmq/0.log" Dec 02 10:34:05 crc kubenswrapper[4781]: I1202 10:34:05.009615 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-pkdhd_e76bbee2-a10a-45f8-9767-4018dfa3836e/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 10:34:05 crc kubenswrapper[4781]: I1202 10:34:05.256028 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-5lb5h_b84858d2-0cb7-4c83-ab7c-9ffc5aef41a6/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 10:34:05 crc kubenswrapper[4781]: I1202 10:34:05.330318 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-wbhzk_03ba130c-2463-4317-a907-e13c23657ae9/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 10:34:05 crc kubenswrapper[4781]: I1202 10:34:05.447145 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-gbx4b_7687840f-e133-4b18-b37c-74664863276b/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 10:34:05 crc kubenswrapper[4781]: I1202 10:34:05.576426 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-cxdx6_6fe7b21a-17d2-432b-9045-e64643581770/ssh-known-hosts-edpm-deployment/0.log" Dec 02 10:34:05 crc kubenswrapper[4781]: I1202 10:34:05.787251 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7c5b6578c5-9cvpk_8666ba67-095e-4634-8975-e54bd7a0f0cb/proxy-server/0.log" Dec 02 10:34:05 crc kubenswrapper[4781]: I1202 10:34:05.841238 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7c5b6578c5-9cvpk_8666ba67-095e-4634-8975-e54bd7a0f0cb/proxy-httpd/0.log" Dec 02 10:34:05 crc kubenswrapper[4781]: I1202 10:34:05.918149 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-p9svs_d15a32bb-043f-4019-8ddb-4bcc54b243a0/swift-ring-rebalance/0.log" Dec 02 10:34:06 crc kubenswrapper[4781]: I1202 10:34:06.020801 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d8cf953d-c4c9-457e-956c-d2942b56499b/account-auditor/0.log" Dec 02 10:34:06 crc kubenswrapper[4781]: I1202 10:34:06.031724 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d8cf953d-c4c9-457e-956c-d2942b56499b/account-reaper/0.log" Dec 02 10:34:06 crc kubenswrapper[4781]: I1202 10:34:06.150062 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d8cf953d-c4c9-457e-956c-d2942b56499b/account-replicator/0.log" Dec 02 10:34:06 crc kubenswrapper[4781]: I1202 10:34:06.206868 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d8cf953d-c4c9-457e-956c-d2942b56499b/account-server/0.log" Dec 02 10:34:06 crc kubenswrapper[4781]: I1202 10:34:06.247704 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d8cf953d-c4c9-457e-956c-d2942b56499b/container-auditor/0.log" Dec 02 10:34:06 crc kubenswrapper[4781]: I1202 10:34:06.357321 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d8cf953d-c4c9-457e-956c-d2942b56499b/container-replicator/0.log" Dec 02 10:34:06 crc kubenswrapper[4781]: I1202 10:34:06.373064 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_d8cf953d-c4c9-457e-956c-d2942b56499b/container-server/0.log" Dec 02 10:34:06 crc kubenswrapper[4781]: I1202 10:34:06.433803 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d8cf953d-c4c9-457e-956c-d2942b56499b/container-updater/0.log" Dec 02 10:34:06 crc kubenswrapper[4781]: I1202 10:34:06.530294 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d8cf953d-c4c9-457e-956c-d2942b56499b/object-auditor/0.log" Dec 02 10:34:06 crc kubenswrapper[4781]: I1202 10:34:06.571973 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d8cf953d-c4c9-457e-956c-d2942b56499b/object-expirer/0.log" Dec 02 10:34:06 crc kubenswrapper[4781]: I1202 10:34:06.578488 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d8cf953d-c4c9-457e-956c-d2942b56499b/object-replicator/0.log" Dec 02 10:34:06 crc kubenswrapper[4781]: I1202 10:34:06.647329 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d8cf953d-c4c9-457e-956c-d2942b56499b/object-server/0.log" Dec 02 10:34:06 crc kubenswrapper[4781]: I1202 10:34:06.733213 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d8cf953d-c4c9-457e-956c-d2942b56499b/object-updater/0.log" Dec 02 10:34:06 crc kubenswrapper[4781]: I1202 10:34:06.771692 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d8cf953d-c4c9-457e-956c-d2942b56499b/swift-recon-cron/0.log" Dec 02 10:34:06 crc kubenswrapper[4781]: I1202 10:34:06.804607 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d8cf953d-c4c9-457e-956c-d2942b56499b/rsync/0.log" Dec 02 10:34:07 crc kubenswrapper[4781]: I1202 10:34:07.011081 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-c5c5n_82d7f2c1-dd68-42f7-be56-30bc507b2bf5/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 10:34:07 crc kubenswrapper[4781]: I1202 10:34:07.031940 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_ac589c1e-71f7-423f-b99b-3ebf175b40f3/tempest-tests-tempest-tests-runner/0.log" Dec 02 10:34:07 crc kubenswrapper[4781]: I1202 10:34:07.270340 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_45f4fae7-b5a7-4c3c-845e-4f2daed0a787/test-operator-logs-container/0.log" Dec 02 10:34:07 crc kubenswrapper[4781]: I1202 10:34:07.333132 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-gwpv6_1a663620-4120-46ee-9676-3eaac5534b99/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 10:34:14 crc kubenswrapper[4781]: I1202 10:34:14.499157 4781 scope.go:117] "RemoveContainer" containerID="b2e57c7f5cab2071cc358e2287628c7d69746430f6e9ccee874285f870b0b84d" Dec 02 10:34:14 crc kubenswrapper[4781]: E1202 10:34:14.499911 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 
10:34:20 crc kubenswrapper[4781]: I1202 10:34:20.437381 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_2fd68438-412a-4745-9e59-f4c9374f2444/memcached/0.log" Dec 02 10:34:27 crc kubenswrapper[4781]: I1202 10:34:27.508897 4781 scope.go:117] "RemoveContainer" containerID="b2e57c7f5cab2071cc358e2287628c7d69746430f6e9ccee874285f870b0b84d" Dec 02 10:34:27 crc kubenswrapper[4781]: E1202 10:34:27.510196 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d" Dec 02 10:34:34 crc kubenswrapper[4781]: I1202 10:34:34.688064 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-8tkh8_f72fc870-291d-4800-a316-22de56b2ebbd/kube-rbac-proxy/0.log" Dec 02 10:34:34 crc kubenswrapper[4781]: I1202 10:34:34.771410 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-8tkh8_f72fc870-291d-4800-a316-22de56b2ebbd/manager/0.log" Dec 02 10:34:34 crc kubenswrapper[4781]: I1202 10:34:34.913312 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-jc6cp_812edfc0-b0b7-40c7-913d-b176bd6817f3/manager/0.log" Dec 02 10:34:34 crc kubenswrapper[4781]: I1202 10:34:34.923632 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-jc6cp_812edfc0-b0b7-40c7-913d-b176bd6817f3/kube-rbac-proxy/0.log" Dec 02 10:34:35 crc kubenswrapper[4781]: I1202 10:34:35.108337 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-7q6gr_eb3d207d-118c-42b8-9e9a-103a041a44b3/manager/0.log" Dec 02 10:34:35 crc kubenswrapper[4781]: I1202 10:34:35.125636 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-7q6gr_eb3d207d-118c-42b8-9e9a-103a041a44b3/kube-rbac-proxy/0.log" Dec 02 10:34:35 crc kubenswrapper[4781]: I1202 10:34:35.158783 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fdef1de09061a943cf0eb5556b972c2f56b8ba493ca7b87b57c4549318f7mcl_5da922a2-736d-4e49-b3f3-68adcbfc8d0b/util/0.log" Dec 02 10:34:35 crc kubenswrapper[4781]: I1202 10:34:35.340682 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fdef1de09061a943cf0eb5556b972c2f56b8ba493ca7b87b57c4549318f7mcl_5da922a2-736d-4e49-b3f3-68adcbfc8d0b/util/0.log" Dec 02 10:34:35 crc kubenswrapper[4781]: I1202 10:34:35.353333 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fdef1de09061a943cf0eb5556b972c2f56b8ba493ca7b87b57c4549318f7mcl_5da922a2-736d-4e49-b3f3-68adcbfc8d0b/pull/0.log" Dec 02 10:34:35 crc kubenswrapper[4781]: I1202 10:34:35.412755 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fdef1de09061a943cf0eb5556b972c2f56b8ba493ca7b87b57c4549318f7mcl_5da922a2-736d-4e49-b3f3-68adcbfc8d0b/pull/0.log" Dec 02 10:34:35 crc kubenswrapper[4781]: I1202 10:34:35.550885 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_fdef1de09061a943cf0eb5556b972c2f56b8ba493ca7b87b57c4549318f7mcl_5da922a2-736d-4e49-b3f3-68adcbfc8d0b/util/0.log" Dec 02 10:34:35 crc kubenswrapper[4781]: I1202 10:34:35.551401 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fdef1de09061a943cf0eb5556b972c2f56b8ba493ca7b87b57c4549318f7mcl_5da922a2-736d-4e49-b3f3-68adcbfc8d0b/pull/0.log" Dec 02 10:34:35 crc kubenswrapper[4781]: I1202 10:34:35.554337 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fdef1de09061a943cf0eb5556b972c2f56b8ba493ca7b87b57c4549318f7mcl_5da922a2-736d-4e49-b3f3-68adcbfc8d0b/extract/0.log" Dec 02 10:34:35 crc kubenswrapper[4781]: I1202 10:34:35.713448 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-dz7tw_de57f174-daf9-483d-bac6-e735d25f9d64/kube-rbac-proxy/0.log" Dec 02 10:34:35 crc kubenswrapper[4781]: I1202 10:34:35.797338 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-dz7tw_de57f174-daf9-483d-bac6-e735d25f9d64/manager/0.log" Dec 02 10:34:35 crc kubenswrapper[4781]: I1202 10:34:35.812742 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-wcvq5_0a36fb64-e101-44af-a6f9-91fb68fc1e7a/kube-rbac-proxy/0.log" Dec 02 10:34:35 crc kubenswrapper[4781]: I1202 10:34:35.941260 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-wcvq5_0a36fb64-e101-44af-a6f9-91fb68fc1e7a/manager/0.log" Dec 02 10:34:36 crc kubenswrapper[4781]: I1202 10:34:36.003881 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-gplp6_15c8fb6a-82b4-4e87-ae9c-6efc2afc5c89/kube-rbac-proxy/0.log" Dec 02 10:34:36 crc kubenswrapper[4781]: I1202 10:34:36.006670 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-gplp6_15c8fb6a-82b4-4e87-ae9c-6efc2afc5c89/manager/0.log" Dec 02 10:34:36 crc kubenswrapper[4781]: I1202 10:34:36.169033 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-lwgs7_cf3e832e-6140-4880-9efd-017837fc9990/kube-rbac-proxy/0.log" Dec 02 10:34:36 crc kubenswrapper[4781]: I1202 10:34:36.352673 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-lwgs7_cf3e832e-6140-4880-9efd-017837fc9990/manager/0.log" Dec 02 10:34:36 crc kubenswrapper[4781]: I1202 10:34:36.380644 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-qpdqt_428e69aa-23f5-4d45-8c18-65ac62c6756c/kube-rbac-proxy/0.log" Dec 02 10:34:36 crc kubenswrapper[4781]: I1202 10:34:36.465149 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-qpdqt_428e69aa-23f5-4d45-8c18-65ac62c6756c/manager/0.log" Dec 02 10:34:36 crc kubenswrapper[4781]: I1202 10:34:36.594773 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-8mm6f_9d08e6b1-b9b9-4a7e-a859-e98f904e2588/kube-rbac-proxy/0.log" Dec 02 10:34:36 crc kubenswrapper[4781]: I1202 10:34:36.718795 4781 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-5lhcx_f1be99ff-4068-4454-b75d-770951e9fedd/kube-rbac-proxy/0.log" Dec 02 10:34:36 crc kubenswrapper[4781]: I1202 10:34:36.736379 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-8mm6f_9d08e6b1-b9b9-4a7e-a859-e98f904e2588/manager/0.log" Dec 02 10:34:36 crc kubenswrapper[4781]: I1202 10:34:36.820403 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-5lhcx_f1be99ff-4068-4454-b75d-770951e9fedd/manager/0.log" Dec 02 10:34:36 crc kubenswrapper[4781]: I1202 10:34:36.964341 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-8xclz_c917a8ec-2bd5-4f7b-8948-a4bed859e01f/kube-rbac-proxy/0.log" Dec 02 10:34:36 crc kubenswrapper[4781]: I1202 10:34:36.987616 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-8xclz_c917a8ec-2bd5-4f7b-8948-a4bed859e01f/manager/0.log" Dec 02 10:34:37 crc kubenswrapper[4781]: I1202 10:34:37.132267 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-rf4bw_7b8c261d-133c-4a73-9424-3233e6701fff/kube-rbac-proxy/0.log" Dec 02 10:34:37 crc kubenswrapper[4781]: I1202 10:34:37.197850 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-rf4bw_7b8c261d-133c-4a73-9424-3233e6701fff/manager/0.log" Dec 02 10:34:37 crc kubenswrapper[4781]: I1202 10:34:37.294712 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-chbnm_71cfd08b-278c-4f9c-b0fd-198c662ef00d/kube-rbac-proxy/0.log" Dec 02 10:34:37 crc kubenswrapper[4781]: I1202 10:34:37.422171 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-chbnm_71cfd08b-278c-4f9c-b0fd-198c662ef00d/manager/0.log" Dec 02 10:34:37 crc kubenswrapper[4781]: I1202 10:34:37.464323 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-pkkst_bbcf3ac8-e087-4a1c-b9f8-2263e15a73ff/kube-rbac-proxy/0.log" Dec 02 10:34:37 crc kubenswrapper[4781]: I1202 10:34:37.500202 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-pkkst_bbcf3ac8-e087-4a1c-b9f8-2263e15a73ff/manager/0.log" Dec 02 10:34:37 crc kubenswrapper[4781]: I1202 10:34:37.654171 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4rgl7t_cda1cc86-51ab-4070-96e9-98adba5d51c3/kube-rbac-proxy/0.log" Dec 02 10:34:37 crc kubenswrapper[4781]: I1202 10:34:37.660166 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4rgl7t_cda1cc86-51ab-4070-96e9-98adba5d51c3/manager/0.log" Dec 02 10:34:37 crc kubenswrapper[4781]: I1202 10:34:37.993706 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-65bb796c9b-zrllk_09f22b82-ec27-4398-b843-8be7661ed03a/operator/0.log" Dec 02 10:34:38 crc 
kubenswrapper[4781]: I1202 10:34:38.061689 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-xg78j_0ac791f1-2459-4266-a082-498b66e549b4/registry-server/0.log"
Dec 02 10:34:38 crc kubenswrapper[4781]: I1202 10:34:38.252809 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-fq4g5_15aa56a1-c9d3-4e48-a0fe-19e593320728/kube-rbac-proxy/0.log"
Dec 02 10:34:38 crc kubenswrapper[4781]: I1202 10:34:38.365957 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-fq4g5_15aa56a1-c9d3-4e48-a0fe-19e593320728/manager/0.log"
Dec 02 10:34:38 crc kubenswrapper[4781]: I1202 10:34:38.398101 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-8rtwl_ca4903e5-bed6-47c2-82a5-4376b162ec96/kube-rbac-proxy/0.log"
Dec 02 10:34:38 crc kubenswrapper[4781]: I1202 10:34:38.502802 4781 scope.go:117] "RemoveContainer" containerID="b2e57c7f5cab2071cc358e2287628c7d69746430f6e9ccee874285f870b0b84d"
Dec 02 10:34:38 crc kubenswrapper[4781]: E1202 10:34:38.503053 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d"
Dec 02 10:34:38 crc kubenswrapper[4781]: I1202 10:34:38.658178 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-8rtwl_ca4903e5-bed6-47c2-82a5-4376b162ec96/manager/0.log"
Dec 02 10:34:38 crc kubenswrapper[4781]: I1202 10:34:38.762377 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-7gsfd_8c93d2b8-28d1-4ea7-86aa-fe3f8ac35e61/operator/0.log"
Dec 02 10:34:38 crc kubenswrapper[4781]: I1202 10:34:38.819205 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-68mk9_22dc2882-6dfd-4f1c-90ee-06ac4c9e0aa5/kube-rbac-proxy/0.log"
Dec 02 10:34:38 crc kubenswrapper[4781]: I1202 10:34:38.860730 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-686764c46-54r7r_090bc2d1-e1c5-4721-80ab-e20d4f3942c6/manager/0.log"
Dec 02 10:34:38 crc kubenswrapper[4781]: I1202 10:34:38.951105 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-68mk9_22dc2882-6dfd-4f1c-90ee-06ac4c9e0aa5/manager/0.log"
Dec 02 10:34:39 crc kubenswrapper[4781]: I1202 10:34:39.043007 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-xtks6_cfd47a1e-a773-4479-9656-abb353f87fe9/kube-rbac-proxy/0.log"
Dec 02 10:34:39 crc kubenswrapper[4781]: I1202 10:34:39.069828 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-xtks6_cfd47a1e-a773-4479-9656-abb353f87fe9/manager/0.log"
Dec 02 10:34:39 crc kubenswrapper[4781]: I1202 10:34:39.142212 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-5d9tj_1c5a954b-d61b-4d33-a043-407f8de059a6/kube-rbac-proxy/0.log"
Dec 02 10:34:39 crc kubenswrapper[4781]: I1202 10:34:39.182221 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-5d9tj_1c5a954b-d61b-4d33-a043-407f8de059a6/manager/0.log"
Dec 02 10:34:39 crc kubenswrapper[4781]: I1202 10:34:39.240407 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-4dtmh_7f59a11b-4502-4c7b-94e7-3fbb6bac2222/kube-rbac-proxy/0.log"
Dec 02 10:34:39 crc kubenswrapper[4781]: I1202 10:34:39.264873 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-4dtmh_7f59a11b-4502-4c7b-94e7-3fbb6bac2222/manager/0.log"
Dec 02 10:34:53 crc kubenswrapper[4781]: I1202 10:34:53.501153 4781 scope.go:117] "RemoveContainer" containerID="b2e57c7f5cab2071cc358e2287628c7d69746430f6e9ccee874285f870b0b84d"
Dec 02 10:34:53 crc kubenswrapper[4781]: E1202 10:34:53.502266 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d"
Dec 02 10:34:58 crc kubenswrapper[4781]: I1202 10:34:58.873020 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-nm2f4_50d4d4c8-66e1-4d10-85cb-0fee6079d5fe/control-plane-machine-set-operator/0.log"
Dec 02 10:34:59 crc kubenswrapper[4781]: I1202 10:34:59.083198 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-n9cmc_d9c52f13-f9c6-419e-8f69-ee91e29f4629/machine-api-operator/0.log"
Dec 02 10:34:59 crc kubenswrapper[4781]: I1202 10:34:59.086307 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-n9cmc_d9c52f13-f9c6-419e-8f69-ee91e29f4629/kube-rbac-proxy/0.log"
Dec 02 10:35:05 crc kubenswrapper[4781]: I1202 10:35:05.500033 4781 scope.go:117] "RemoveContainer" containerID="b2e57c7f5cab2071cc358e2287628c7d69746430f6e9ccee874285f870b0b84d"
Dec 02 10:35:05 crc kubenswrapper[4781]: E1202 10:35:05.501075 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d"
Dec 02 10:35:12 crc kubenswrapper[4781]: I1202 10:35:12.235432 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-nhlj8_e57350a9-3e11-4df0-a108-a9d8d446c219/cert-manager-controller/0.log"
Dec 02 10:35:12 crc kubenswrapper[4781]: I1202 10:35:12.410355 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-5jxwz_bbeb0421-437b-4e00-a072-ebebf3354bea/cert-manager-cainjector/0.log"
Dec 02 10:35:13 crc kubenswrapper[4781]: I1202 10:35:13.002393 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-zl2kl_753e9900-fde9-4486-9bc3-fce98f302367/cert-manager-webhook/0.log"
Dec 02 10:35:20 crc kubenswrapper[4781]: I1202 10:35:20.500466 4781 scope.go:117] "RemoveContainer" containerID="b2e57c7f5cab2071cc358e2287628c7d69746430f6e9ccee874285f870b0b84d"
Dec 02 10:35:20 crc kubenswrapper[4781]: E1202 10:35:20.501517 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d"
Dec 02 10:35:25 crc kubenswrapper[4781]: I1202 10:35:25.482363 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-xqj9x_9f057938-ae0d-4ce5-a920-b34905dcab9a/nmstate-console-plugin/0.log"
Dec 02 10:35:25 crc kubenswrapper[4781]: I1202 10:35:25.848453 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-b4bs8_71f173d9-47d0-4576-991f-0eeffb003596/kube-rbac-proxy/0.log"
Dec 02 10:35:25 crc kubenswrapper[4781]: I1202 10:35:25.859133 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-b4bs8_71f173d9-47d0-4576-991f-0eeffb003596/nmstate-metrics/0.log"
Dec 02 10:35:25 crc kubenswrapper[4781]: I1202 10:35:25.877159 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-h2jq9_acd034d2-af9c-49ff-a584-f7f0ef482c10/nmstate-handler/0.log"
Dec 02 10:35:26 crc kubenswrapper[4781]: I1202 10:35:26.032103 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-dq5sv_de15e7ea-d9ce-4713-bb63-87db8b3c5afd/nmstate-webhook/0.log"
Dec 02 10:35:26 crc kubenswrapper[4781]: I1202 10:35:26.034296 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-99vk9_11d94cf1-4ea7-43cb-b7e5-6fc0be34760f/nmstate-operator/0.log"
Dec 02 10:35:33 crc kubenswrapper[4781]: I1202 10:35:33.504639 4781 scope.go:117] "RemoveContainer" containerID="b2e57c7f5cab2071cc358e2287628c7d69746430f6e9ccee874285f870b0b84d"
Dec 02 10:35:33 crc kubenswrapper[4781]: E1202 10:35:33.505419 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d"
Dec 02 10:35:41 crc kubenswrapper[4781]: I1202 10:35:41.562653 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-8fm55_b70a26b7-43cb-4e26-95c0-f67ef15a0c34/kube-rbac-proxy/0.log"
Dec 02 10:35:41 crc kubenswrapper[4781]: I1202 10:35:41.638491 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-8fm55_b70a26b7-43cb-4e26-95c0-f67ef15a0c34/controller/0.log"
Dec 02 10:35:41 crc kubenswrapper[4781]: I1202 10:35:41.751467 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p2rfl_920e48e1-7f21-462b-82da-70c9a6e589ba/cp-frr-files/0.log"
Dec 02 10:35:41 crc kubenswrapper[4781]: I1202 10:35:41.924260 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p2rfl_920e48e1-7f21-462b-82da-70c9a6e589ba/cp-frr-files/0.log"
Dec 02 10:35:41 crc kubenswrapper[4781]: I1202 10:35:41.938780 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p2rfl_920e48e1-7f21-462b-82da-70c9a6e589ba/cp-reloader/0.log"
Dec 02 10:35:41 crc kubenswrapper[4781]: I1202 10:35:41.941421 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p2rfl_920e48e1-7f21-462b-82da-70c9a6e589ba/cp-metrics/0.log"
Dec 02 10:35:42 crc kubenswrapper[4781]: I1202 10:35:42.003986 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p2rfl_920e48e1-7f21-462b-82da-70c9a6e589ba/cp-reloader/0.log"
Dec 02 10:35:42 crc kubenswrapper[4781]: I1202 10:35:42.122154 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p2rfl_920e48e1-7f21-462b-82da-70c9a6e589ba/cp-frr-files/0.log"
Dec 02 10:35:42 crc kubenswrapper[4781]: I1202 10:35:42.127168 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p2rfl_920e48e1-7f21-462b-82da-70c9a6e589ba/cp-reloader/0.log"
Dec 02 10:35:42 crc kubenswrapper[4781]: I1202 10:35:42.177045 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p2rfl_920e48e1-7f21-462b-82da-70c9a6e589ba/cp-metrics/0.log"
Dec 02 10:35:42 crc kubenswrapper[4781]: I1202 10:35:42.196998 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p2rfl_920e48e1-7f21-462b-82da-70c9a6e589ba/cp-metrics/0.log"
Dec 02 10:35:42 crc kubenswrapper[4781]: I1202 10:35:42.710047 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p2rfl_920e48e1-7f21-462b-82da-70c9a6e589ba/cp-frr-files/0.log"
Dec 02 10:35:42 crc kubenswrapper[4781]: I1202 10:35:42.921685 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p2rfl_920e48e1-7f21-462b-82da-70c9a6e589ba/cp-reloader/0.log"
Dec 02 10:35:42 crc kubenswrapper[4781]: I1202 10:35:42.952614 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p2rfl_920e48e1-7f21-462b-82da-70c9a6e589ba/cp-metrics/0.log"
Dec 02 10:35:42 crc kubenswrapper[4781]: I1202 10:35:42.964204 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p2rfl_920e48e1-7f21-462b-82da-70c9a6e589ba/controller/0.log"
Dec 02 10:35:43 crc kubenswrapper[4781]: I1202 10:35:43.120605 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p2rfl_920e48e1-7f21-462b-82da-70c9a6e589ba/kube-rbac-proxy/0.log"
Dec 02 10:35:43 crc kubenswrapper[4781]: I1202 10:35:43.167976 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p2rfl_920e48e1-7f21-462b-82da-70c9a6e589ba/frr-metrics/0.log"
Dec 02 10:35:43 crc kubenswrapper[4781]: I1202 10:35:43.202331 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p2rfl_920e48e1-7f21-462b-82da-70c9a6e589ba/kube-rbac-proxy-frr/0.log"
Dec 02 10:35:43 crc kubenswrapper[4781]: I1202 10:35:43.354155 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p2rfl_920e48e1-7f21-462b-82da-70c9a6e589ba/reloader/0.log"
Dec 02 10:35:43 crc kubenswrapper[4781]: I1202 10:35:43.383614 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-mfpkp_e4edb853-0703-424a-9701-bd01ffa5631c/frr-k8s-webhook-server/0.log"
Dec 02 10:35:43 crc kubenswrapper[4781]: I1202 10:35:43.702688 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-75f8895565-ttkrd_60dee968-291d-4e9d-b2a5-40b67457b003/manager/0.log"
Dec 02 10:35:43 crc kubenswrapper[4781]: I1202 10:35:43.894468 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-w7lhj_9033e241-ad62-4fcc-92f1-8499f42f6310/kube-rbac-proxy/0.log"
Dec 02 10:35:43 crc kubenswrapper[4781]: I1202 10:35:43.909503 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6f95b97b7b-7p5dk_804d6dfe-f063-4edb-b276-9a386bae049a/webhook-server/0.log"
Dec 02 10:35:44 crc kubenswrapper[4781]: I1202 10:35:44.444360 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p2rfl_920e48e1-7f21-462b-82da-70c9a6e589ba/frr/0.log"
Dec 02 10:35:44 crc kubenswrapper[4781]: I1202 10:35:44.514336 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-w7lhj_9033e241-ad62-4fcc-92f1-8499f42f6310/speaker/0.log"
Dec 02 10:35:46 crc kubenswrapper[4781]: I1202 10:35:46.499576 4781 scope.go:117] "RemoveContainer" containerID="b2e57c7f5cab2071cc358e2287628c7d69746430f6e9ccee874285f870b0b84d"
Dec 02 10:35:46 crc kubenswrapper[4781]: E1202 10:35:46.500140 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d"
Dec 02 10:35:56 crc kubenswrapper[4781]: I1202 10:35:56.615853 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fl4k6c_8add967d-359e-4b2f-8181-27a4e32cd3d1/util/0.log"
Dec 02 10:35:56 crc kubenswrapper[4781]: I1202 10:35:56.838035 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fl4k6c_8add967d-359e-4b2f-8181-27a4e32cd3d1/pull/0.log"
Dec 02 10:35:56 crc kubenswrapper[4781]: I1202 10:35:56.843915 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fl4k6c_8add967d-359e-4b2f-8181-27a4e32cd3d1/util/0.log"
Dec 02 10:35:56 crc kubenswrapper[4781]: I1202 10:35:56.846289 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fl4k6c_8add967d-359e-4b2f-8181-27a4e32cd3d1/pull/0.log"
Dec 02 10:35:57 crc kubenswrapper[4781]: I1202 10:35:57.007312 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fl4k6c_8add967d-359e-4b2f-8181-27a4e32cd3d1/util/0.log"
Dec 02 10:35:57 crc kubenswrapper[4781]: I1202 10:35:57.012176 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fl4k6c_8add967d-359e-4b2f-8181-27a4e32cd3d1/pull/0.log"
Dec 02 10:35:57 crc kubenswrapper[4781]: I1202 10:35:57.025684 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fl4k6c_8add967d-359e-4b2f-8181-27a4e32cd3d1/extract/0.log"
Dec 02 10:35:57 crc kubenswrapper[4781]: I1202 10:35:57.165701 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836fnxt_a30d0113-e2bc-4a14-b7b0-49363876c6b2/util/0.log"
Dec 02 10:35:57 crc kubenswrapper[4781]: I1202 10:35:57.345881 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836fnxt_a30d0113-e2bc-4a14-b7b0-49363876c6b2/pull/0.log"
Dec 02 10:35:57 crc kubenswrapper[4781]: I1202 10:35:57.359350 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836fnxt_a30d0113-e2bc-4a14-b7b0-49363876c6b2/util/0.log"
Dec 02 10:35:57 crc kubenswrapper[4781]: I1202 10:35:57.381039 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836fnxt_a30d0113-e2bc-4a14-b7b0-49363876c6b2/pull/0.log"
Dec 02 10:35:57 crc kubenswrapper[4781]: I1202 10:35:57.510162 4781 scope.go:117] "RemoveContainer" containerID="b2e57c7f5cab2071cc358e2287628c7d69746430f6e9ccee874285f870b0b84d"
Dec 02 10:35:57 crc kubenswrapper[4781]: E1202 10:35:57.510431 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d"
Dec 02 10:35:57 crc kubenswrapper[4781]: I1202 10:35:57.547438 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836fnxt_a30d0113-e2bc-4a14-b7b0-49363876c6b2/pull/0.log"
Dec 02 10:35:57 crc kubenswrapper[4781]: I1202 10:35:57.550631 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836fnxt_a30d0113-e2bc-4a14-b7b0-49363876c6b2/util/0.log"
Dec 02 10:35:57 crc kubenswrapper[4781]: I1202 10:35:57.570152 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836fnxt_a30d0113-e2bc-4a14-b7b0-49363876c6b2/extract/0.log"
Dec 02 10:35:57 crc kubenswrapper[4781]: I1202 10:35:57.704866 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dhncm_8267358b-15b0-44fd-bd0a-73438bcd7ded/extract-utilities/0.log"
Dec 02 10:35:57 crc kubenswrapper[4781]: I1202 10:35:57.893579 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dhncm_8267358b-15b0-44fd-bd0a-73438bcd7ded/extract-content/0.log"
Dec 02 10:35:57 crc kubenswrapper[4781]: I1202 10:35:57.904376 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dhncm_8267358b-15b0-44fd-bd0a-73438bcd7ded/extract-content/0.log"
Dec 02 10:35:57 crc kubenswrapper[4781]: I1202 10:35:57.908514 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dhncm_8267358b-15b0-44fd-bd0a-73438bcd7ded/extract-utilities/0.log"
Dec 02 10:35:58 crc kubenswrapper[4781]: I1202 10:35:58.095024 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dhncm_8267358b-15b0-44fd-bd0a-73438bcd7ded/extract-utilities/0.log"
Dec 02 10:35:58 crc kubenswrapper[4781]: I1202 10:35:58.099448 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dhncm_8267358b-15b0-44fd-bd0a-73438bcd7ded/extract-content/0.log"
Dec 02 10:35:58 crc kubenswrapper[4781]: I1202 10:35:58.297228 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dhncm_8267358b-15b0-44fd-bd0a-73438bcd7ded/registry-server/0.log"
Dec 02 10:35:58 crc kubenswrapper[4781]: I1202 10:35:58.298586 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rv6xb_10e7b109-cde2-4b4f-8f1c-2d7940a49e1c/extract-utilities/0.log"
Dec 02 10:35:58 crc kubenswrapper[4781]: I1202 10:35:58.457009 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rv6xb_10e7b109-cde2-4b4f-8f1c-2d7940a49e1c/extract-utilities/0.log"
Dec 02 10:35:58 crc kubenswrapper[4781]: I1202 10:35:58.463888 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rv6xb_10e7b109-cde2-4b4f-8f1c-2d7940a49e1c/extract-content/0.log"
Dec 02 10:35:58 crc kubenswrapper[4781]: I1202 10:35:58.492127 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rv6xb_10e7b109-cde2-4b4f-8f1c-2d7940a49e1c/extract-content/0.log"
Dec 02 10:35:58 crc kubenswrapper[4781]: I1202 10:35:58.626800 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rv6xb_10e7b109-cde2-4b4f-8f1c-2d7940a49e1c/extract-content/0.log"
Dec 02 10:35:58 crc kubenswrapper[4781]: I1202 10:35:58.693239 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rv6xb_10e7b109-cde2-4b4f-8f1c-2d7940a49e1c/extract-utilities/0.log"
Dec 02 10:35:58 crc kubenswrapper[4781]: I1202 10:35:58.968597 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-lx2rp_060db2b9-0086-4429-8ade-2156f94455f4/marketplace-operator/0.log"
Dec 02 10:35:59 crc kubenswrapper[4781]: I1202 10:35:59.089173 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6s4mn_1996ded9-7aca-4ca6-b787-1c688c678893/extract-utilities/0.log"
Dec 02 10:35:59 crc kubenswrapper[4781]: I1202 10:35:59.295368 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rv6xb_10e7b109-cde2-4b4f-8f1c-2d7940a49e1c/registry-server/0.log"
Dec 02 10:35:59 crc kubenswrapper[4781]: I1202 10:35:59.310731 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6s4mn_1996ded9-7aca-4ca6-b787-1c688c678893/extract-utilities/0.log"
Dec 02 10:35:59 crc kubenswrapper[4781]: I1202 10:35:59.316882 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6s4mn_1996ded9-7aca-4ca6-b787-1c688c678893/extract-content/0.log"
Dec 02 10:35:59 crc kubenswrapper[4781]: I1202 10:35:59.332351 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6s4mn_1996ded9-7aca-4ca6-b787-1c688c678893/extract-content/0.log"
Dec 02 10:35:59 crc kubenswrapper[4781]: I1202 10:35:59.503968 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6s4mn_1996ded9-7aca-4ca6-b787-1c688c678893/extract-content/0.log"
Dec 02 10:35:59 crc kubenswrapper[4781]: I1202 10:35:59.506366 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6s4mn_1996ded9-7aca-4ca6-b787-1c688c678893/extract-utilities/0.log"
Dec 02 10:35:59 crc kubenswrapper[4781]: I1202 10:35:59.670676 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6s4mn_1996ded9-7aca-4ca6-b787-1c688c678893/registry-server/0.log"
Dec 02 10:35:59 crc kubenswrapper[4781]: I1202 10:35:59.907122 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-w2wcs_5d8a5cfa-0f4a-4912-ad38-2ddfa06bbcf4/extract-utilities/0.log"
Dec 02 10:36:00 crc kubenswrapper[4781]: I1202 10:36:00.106858 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-w2wcs_5d8a5cfa-0f4a-4912-ad38-2ddfa06bbcf4/extract-content/0.log"
Dec 02 10:36:00 crc kubenswrapper[4781]: I1202 10:36:00.137848 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-w2wcs_5d8a5cfa-0f4a-4912-ad38-2ddfa06bbcf4/extract-content/0.log"
Dec 02 10:36:00 crc kubenswrapper[4781]: I1202 10:36:00.146063 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-w2wcs_5d8a5cfa-0f4a-4912-ad38-2ddfa06bbcf4/extract-utilities/0.log"
Dec 02 10:36:00 crc kubenswrapper[4781]: I1202 10:36:00.289559 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-w2wcs_5d8a5cfa-0f4a-4912-ad38-2ddfa06bbcf4/extract-utilities/0.log"
Dec 02 10:36:00 crc kubenswrapper[4781]: I1202 10:36:00.336644 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-w2wcs_5d8a5cfa-0f4a-4912-ad38-2ddfa06bbcf4/extract-content/0.log"
Dec 02 10:36:00 crc kubenswrapper[4781]: I1202 10:36:00.867206 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-w2wcs_5d8a5cfa-0f4a-4912-ad38-2ddfa06bbcf4/registry-server/0.log"
Dec 02 10:36:09 crc kubenswrapper[4781]: I1202 10:36:09.500157 4781 scope.go:117] "RemoveContainer" containerID="b2e57c7f5cab2071cc358e2287628c7d69746430f6e9ccee874285f870b0b84d"
Dec 02 10:36:09 crc kubenswrapper[4781]: E1202 10:36:09.500868 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d"
Dec 02 10:36:22 crc kubenswrapper[4781]: I1202 10:36:22.499991 4781 scope.go:117] "RemoveContainer" containerID="b2e57c7f5cab2071cc358e2287628c7d69746430f6e9ccee874285f870b0b84d"
Dec 02 10:36:22 crc kubenswrapper[4781]: E1202 10:36:22.500755 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d"
Dec 02 10:36:37 crc kubenswrapper[4781]: I1202 10:36:37.540251 4781 scope.go:117] "RemoveContainer" containerID="b2e57c7f5cab2071cc358e2287628c7d69746430f6e9ccee874285f870b0b84d"
Dec 02 10:36:37 crc kubenswrapper[4781]: E1202 10:36:37.541328 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d"
Dec 02 10:36:52 crc kubenswrapper[4781]: I1202 10:36:52.499649 4781 scope.go:117] "RemoveContainer" containerID="b2e57c7f5cab2071cc358e2287628c7d69746430f6e9ccee874285f870b0b84d"
Dec 02 10:36:52 crc kubenswrapper[4781]: E1202 10:36:52.500396 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d"
Dec 02 10:37:04 crc kubenswrapper[4781]: I1202 10:37:04.500429 4781 scope.go:117] "RemoveContainer" containerID="b2e57c7f5cab2071cc358e2287628c7d69746430f6e9ccee874285f870b0b84d"
Dec 02 10:37:04 crc kubenswrapper[4781]: E1202 10:37:04.501538 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d"
Dec 02 10:37:15 crc kubenswrapper[4781]: I1202 10:37:15.500501 4781 scope.go:117] "RemoveContainer" containerID="b2e57c7f5cab2071cc358e2287628c7d69746430f6e9ccee874285f870b0b84d"
Dec 02 10:37:15 crc kubenswrapper[4781]: E1202 10:37:15.502059 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d"
Dec 02 10:37:29 crc kubenswrapper[4781]: I1202 10:37:29.499954 4781 scope.go:117] "RemoveContainer" containerID="b2e57c7f5cab2071cc358e2287628c7d69746430f6e9ccee874285f870b0b84d"
Dec 02 10:37:29 crc kubenswrapper[4781]: E1202 10:37:29.501112 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d"
Dec 02 10:37:36 crc kubenswrapper[4781]: I1202 10:37:36.959866 4781 generic.go:334] "Generic (PLEG): container finished" podID="07e367c2-2f0f-43fd-9ab4-85a99e1a8291" containerID="3dd819b81667c1642a1945a786b9713fc3ac0751d43bc6583513971a8b1fd5ed" exitCode=0
Dec 02 10:37:36 crc kubenswrapper[4781]: I1202 10:37:36.960047 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kzm46/must-gather-np56k" event={"ID":"07e367c2-2f0f-43fd-9ab4-85a99e1a8291","Type":"ContainerDied","Data":"3dd819b81667c1642a1945a786b9713fc3ac0751d43bc6583513971a8b1fd5ed"}
Dec 02 10:37:36 crc kubenswrapper[4781]: I1202 10:37:36.961019 4781 scope.go:117] "RemoveContainer" containerID="3dd819b81667c1642a1945a786b9713fc3ac0751d43bc6583513971a8b1fd5ed"
Dec 02 10:37:37 crc kubenswrapper[4781]: I1202 10:37:37.814151 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kzm46_must-gather-np56k_07e367c2-2f0f-43fd-9ab4-85a99e1a8291/gather/0.log"
Dec 02 10:37:40 crc kubenswrapper[4781]: I1202 10:37:40.500260 4781 scope.go:117] "RemoveContainer" containerID="b2e57c7f5cab2071cc358e2287628c7d69746430f6e9ccee874285f870b0b84d"
Dec 02 10:37:40 crc kubenswrapper[4781]: E1202 10:37:40.500893 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d"
Dec 02 10:37:48 crc kubenswrapper[4781]: I1202 10:37:48.051735 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kzm46/must-gather-np56k"]
Dec 02 10:37:48 crc kubenswrapper[4781]: I1202 10:37:48.052444 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-kzm46/must-gather-np56k" podUID="07e367c2-2f0f-43fd-9ab4-85a99e1a8291" containerName="copy" containerID="cri-o://adda2635d32a58b0257b7e70e101b06664d2436348ba286e7a10cad95f00fe48" gracePeriod=2
Dec 02 10:37:48 crc kubenswrapper[4781]: I1202 10:37:48.071193 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kzm46/must-gather-np56k"]
Dec 02 10:37:48 crc kubenswrapper[4781]: I1202 10:37:48.511980 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kzm46_must-gather-np56k_07e367c2-2f0f-43fd-9ab4-85a99e1a8291/copy/0.log"
Dec 02 10:37:48 crc kubenswrapper[4781]: I1202 10:37:48.512440 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kzm46/must-gather-np56k"
Dec 02 10:37:48 crc kubenswrapper[4781]: I1202 10:37:48.611540 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/07e367c2-2f0f-43fd-9ab4-85a99e1a8291-must-gather-output\") pod \"07e367c2-2f0f-43fd-9ab4-85a99e1a8291\" (UID: \"07e367c2-2f0f-43fd-9ab4-85a99e1a8291\") "
Dec 02 10:37:48 crc kubenswrapper[4781]: I1202 10:37:48.611983 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtv4w\" (UniqueName: \"kubernetes.io/projected/07e367c2-2f0f-43fd-9ab4-85a99e1a8291-kube-api-access-qtv4w\") pod \"07e367c2-2f0f-43fd-9ab4-85a99e1a8291\" (UID: \"07e367c2-2f0f-43fd-9ab4-85a99e1a8291\") "
Dec 02 10:37:48 crc kubenswrapper[4781]: I1202 10:37:48.618851 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07e367c2-2f0f-43fd-9ab4-85a99e1a8291-kube-api-access-qtv4w" (OuterVolumeSpecName: "kube-api-access-qtv4w") pod "07e367c2-2f0f-43fd-9ab4-85a99e1a8291" (UID: "07e367c2-2f0f-43fd-9ab4-85a99e1a8291"). InnerVolumeSpecName "kube-api-access-qtv4w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 10:37:48 crc kubenswrapper[4781]: I1202 10:37:48.713729 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtv4w\" (UniqueName: \"kubernetes.io/projected/07e367c2-2f0f-43fd-9ab4-85a99e1a8291-kube-api-access-qtv4w\") on node \"crc\" DevicePath \"\""
Dec 02 10:37:48 crc kubenswrapper[4781]: I1202 10:37:48.751236 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07e367c2-2f0f-43fd-9ab4-85a99e1a8291-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "07e367c2-2f0f-43fd-9ab4-85a99e1a8291" (UID: "07e367c2-2f0f-43fd-9ab4-85a99e1a8291"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 10:37:48 crc kubenswrapper[4781]: I1202 10:37:48.815175 4781 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/07e367c2-2f0f-43fd-9ab4-85a99e1a8291-must-gather-output\") on node \"crc\" DevicePath \"\""
Dec 02 10:37:49 crc kubenswrapper[4781]: I1202 10:37:49.097253 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kzm46_must-gather-np56k_07e367c2-2f0f-43fd-9ab4-85a99e1a8291/copy/0.log"
Dec 02 10:37:49 crc kubenswrapper[4781]: I1202 10:37:49.097751 4781 generic.go:334] "Generic (PLEG): container finished" podID="07e367c2-2f0f-43fd-9ab4-85a99e1a8291" containerID="adda2635d32a58b0257b7e70e101b06664d2436348ba286e7a10cad95f00fe48" exitCode=143
Dec 02 10:37:49 crc kubenswrapper[4781]: I1202 10:37:49.097802 4781 scope.go:117] "RemoveContainer" containerID="adda2635d32a58b0257b7e70e101b06664d2436348ba286e7a10cad95f00fe48"
Dec 02 10:37:49 crc kubenswrapper[4781]: I1202 10:37:49.097817 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kzm46/must-gather-np56k"
Dec 02 10:37:49 crc kubenswrapper[4781]: I1202 10:37:49.117612 4781 scope.go:117] "RemoveContainer" containerID="3dd819b81667c1642a1945a786b9713fc3ac0751d43bc6583513971a8b1fd5ed"
Dec 02 10:37:49 crc kubenswrapper[4781]: I1202 10:37:49.191651 4781 scope.go:117] "RemoveContainer" containerID="adda2635d32a58b0257b7e70e101b06664d2436348ba286e7a10cad95f00fe48"
Dec 02 10:37:49 crc kubenswrapper[4781]: E1202 10:37:49.192105 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adda2635d32a58b0257b7e70e101b06664d2436348ba286e7a10cad95f00fe48\": container with ID starting with adda2635d32a58b0257b7e70e101b06664d2436348ba286e7a10cad95f00fe48 not found: ID does not exist" containerID="adda2635d32a58b0257b7e70e101b06664d2436348ba286e7a10cad95f00fe48"
Dec 02 10:37:49 crc kubenswrapper[4781]: I1202 10:37:49.192145 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adda2635d32a58b0257b7e70e101b06664d2436348ba286e7a10cad95f00fe48"} err="failed to get container status \"adda2635d32a58b0257b7e70e101b06664d2436348ba286e7a10cad95f00fe48\": rpc error: code = NotFound desc = could not find container \"adda2635d32a58b0257b7e70e101b06664d2436348ba286e7a10cad95f00fe48\": container with ID starting with adda2635d32a58b0257b7e70e101b06664d2436348ba286e7a10cad95f00fe48 not found: ID does not exist"
Dec 02 10:37:49 crc kubenswrapper[4781]: I1202 10:37:49.192171 4781 scope.go:117] "RemoveContainer" containerID="3dd819b81667c1642a1945a786b9713fc3ac0751d43bc6583513971a8b1fd5ed"
Dec 02 10:37:49 crc kubenswrapper[4781]: E1202 10:37:49.192454 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dd819b81667c1642a1945a786b9713fc3ac0751d43bc6583513971a8b1fd5ed\": container with ID starting with 3dd819b81667c1642a1945a786b9713fc3ac0751d43bc6583513971a8b1fd5ed not found: ID does not exist" containerID="3dd819b81667c1642a1945a786b9713fc3ac0751d43bc6583513971a8b1fd5ed"
Dec 02 10:37:49 crc kubenswrapper[4781]: I1202 10:37:49.192480 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dd819b81667c1642a1945a786b9713fc3ac0751d43bc6583513971a8b1fd5ed"} err="failed to get container status \"3dd819b81667c1642a1945a786b9713fc3ac0751d43bc6583513971a8b1fd5ed\": rpc error: code = NotFound desc = could not find container \"3dd819b81667c1642a1945a786b9713fc3ac0751d43bc6583513971a8b1fd5ed\": container with ID starting with 3dd819b81667c1642a1945a786b9713fc3ac0751d43bc6583513971a8b1fd5ed not found: ID does not exist"
Dec 02 10:37:49 crc kubenswrapper[4781]: I1202 10:37:49.509029 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07e367c2-2f0f-43fd-9ab4-85a99e1a8291" path="/var/lib/kubelet/pods/07e367c2-2f0f-43fd-9ab4-85a99e1a8291/volumes"
Dec 02 10:37:55 crc kubenswrapper[4781]: I1202 10:37:55.499901 4781 scope.go:117] "RemoveContainer" containerID="b2e57c7f5cab2071cc358e2287628c7d69746430f6e9ccee874285f870b0b84d"
Dec 02 10:37:55 crc kubenswrapper[4781]: E1202 10:37:55.501271 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d"
Dec 02 10:38:10 crc kubenswrapper[4781]: I1202 10:38:10.500427 4781 scope.go:117] "RemoveContainer" containerID="b2e57c7f5cab2071cc358e2287628c7d69746430f6e9ccee874285f870b0b84d"
Dec 02 10:38:10 crc kubenswrapper[4781]: E1202 10:38:10.501026 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d"
Dec 02 10:38:25 crc kubenswrapper[4781]: I1202 10:38:25.499573 4781 scope.go:117] "RemoveContainer" containerID="b2e57c7f5cab2071cc358e2287628c7d69746430f6e9ccee874285f870b0b84d"
Dec 02 10:38:25 crc kubenswrapper[4781]: E1202 10:38:25.500532 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d"
Dec 02 10:38:36 crc kubenswrapper[4781]: I1202 10:38:36.500780 4781 scope.go:117] "RemoveContainer" containerID="b2e57c7f5cab2071cc358e2287628c7d69746430f6e9ccee874285f870b0b84d"
Dec 02 10:38:36 crc kubenswrapper[4781]: E1202 10:38:36.501902 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d"
Dec 02 10:38:45 crc kubenswrapper[4781]: I1202 10:38:45.844957 4781 scope.go:117] "RemoveContainer" containerID="dac398c50ac049691dc28dc23624df04aa7454b9dbcdec82a425b9c0d0d4c64c"
Dec 02 10:38:50 crc kubenswrapper[4781]: I1202 10:38:50.499819 4781 scope.go:117] "RemoveContainer" containerID="b2e57c7f5cab2071cc358e2287628c7d69746430f6e9ccee874285f870b0b84d"
Dec 02 10:38:50 crc kubenswrapper[4781]: E1202 10:38:50.500620 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pzntm_openshift-machine-config-operator(e10258da-dad3-4df8-82c2-9d9438493a3d)\"" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" podUID="e10258da-dad3-4df8-82c2-9d9438493a3d"
Dec 02 10:39:03 crc kubenswrapper[4781]: I1202 10:39:03.500445 4781 scope.go:117] "RemoveContainer" containerID="b2e57c7f5cab2071cc358e2287628c7d69746430f6e9ccee874285f870b0b84d"
Dec 02 10:39:03 crc kubenswrapper[4781]: I1202 10:39:03.856994 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pzntm" event={"ID":"e10258da-dad3-4df8-82c2-9d9438493a3d","Type":"ContainerStarted","Data":"780b2685f195157959c43881fc1cf08279e0d6987f84af1a00f0f90410d06f01"}
Dec 02 10:40:38 crc kubenswrapper[4781]: I1202 10:40:38.373482 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zsq9b"]
Dec 02 10:40:38 crc kubenswrapper[4781]: E1202 10:40:38.374447 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07e367c2-2f0f-43fd-9ab4-85a99e1a8291" containerName="gather"
Dec 02 10:40:38 crc kubenswrapper[4781]: I1202 10:40:38.374463 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="07e367c2-2f0f-43fd-9ab4-85a99e1a8291" containerName="gather"
Dec 02 10:40:38 crc kubenswrapper[4781]: E1202 10:40:38.374503 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0134d7a-ea89-4e89-9765-4cf24591ef1a" containerName="container-00"
Dec 02 10:40:38 crc kubenswrapper[4781]: I1202 10:40:38.374512 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0134d7a-ea89-4e89-9765-4cf24591ef1a" containerName="container-00"
Dec 02 10:40:38 crc kubenswrapper[4781]: E1202 10:40:38.374536 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07e367c2-2f0f-43fd-9ab4-85a99e1a8291" containerName="copy"
Dec 02 10:40:38 crc kubenswrapper[4781]: I1202 10:40:38.374544 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="07e367c2-2f0f-43fd-9ab4-85a99e1a8291" containerName="copy"
Dec 02 10:40:38 crc kubenswrapper[4781]: I1202 10:40:38.374822 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="07e367c2-2f0f-43fd-9ab4-85a99e1a8291" containerName="copy"
Dec 02 10:40:38 crc kubenswrapper[4781]: I1202 10:40:38.374848 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0134d7a-ea89-4e89-9765-4cf24591ef1a" containerName="container-00"
Dec 02 10:40:38 crc kubenswrapper[4781]: I1202 10:40:38.374869 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="07e367c2-2f0f-43fd-9ab4-85a99e1a8291" containerName="gather"
Dec 02 10:40:38 crc kubenswrapper[4781]: I1202 10:40:38.376648 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zsq9b"
Dec 02 10:40:38 crc kubenswrapper[4781]: I1202 10:40:38.386520 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zsq9b"]
Dec 02 10:40:38 crc kubenswrapper[4781]: I1202 10:40:38.402408 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6431ec9c-c651-4273-9c24-77c7767cf7fd-utilities\") pod \"redhat-marketplace-zsq9b\" (UID: \"6431ec9c-c651-4273-9c24-77c7767cf7fd\") " pod="openshift-marketplace/redhat-marketplace-zsq9b"
Dec 02 10:40:38 crc kubenswrapper[4781]: I1202 10:40:38.402492 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6431ec9c-c651-4273-9c24-77c7767cf7fd-catalog-content\") pod \"redhat-marketplace-zsq9b\" (UID: \"6431ec9c-c651-4273-9c24-77c7767cf7fd\") " pod="openshift-marketplace/redhat-marketplace-zsq9b"
Dec 02 10:40:38 crc kubenswrapper[4781]: I1202 10:40:38.402814 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zg9hb\" (UniqueName: \"kubernetes.io/projected/6431ec9c-c651-4273-9c24-77c7767cf7fd-kube-api-access-zg9hb\") pod \"redhat-marketplace-zsq9b\" (UID: \"6431ec9c-c651-4273-9c24-77c7767cf7fd\") " pod="openshift-marketplace/redhat-marketplace-zsq9b"
Dec 02 10:40:38 crc kubenswrapper[4781]: I1202 10:40:38.503818 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zg9hb\" (UniqueName: \"kubernetes.io/projected/6431ec9c-c651-4273-9c24-77c7767cf7fd-kube-api-access-zg9hb\") pod \"redhat-marketplace-zsq9b\" (UID: \"6431ec9c-c651-4273-9c24-77c7767cf7fd\") " pod="openshift-marketplace/redhat-marketplace-zsq9b"
Dec 02 10:40:38 crc kubenswrapper[4781]: I1202 10:40:38.503894 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6431ec9c-c651-4273-9c24-77c7767cf7fd-utilities\") pod \"redhat-marketplace-zsq9b\" (UID: \"6431ec9c-c651-4273-9c24-77c7767cf7fd\") " pod="openshift-marketplace/redhat-marketplace-zsq9b"
Dec 02 10:40:38 crc kubenswrapper[4781]: I1202 10:40:38.503976 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6431ec9c-c651-4273-9c24-77c7767cf7fd-catalog-content\") pod \"redhat-marketplace-zsq9b\" (UID: \"6431ec9c-c651-4273-9c24-77c7767cf7fd\") " pod="openshift-marketplace/redhat-marketplace-zsq9b"
Dec 02 10:40:38 crc kubenswrapper[4781]: I1202 10:40:38.504367 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6431ec9c-c651-4273-9c24-77c7767cf7fd-catalog-content\") pod \"redhat-marketplace-zsq9b\" (UID: \"6431ec9c-c651-4273-9c24-77c7767cf7fd\") " pod="openshift-marketplace/redhat-marketplace-zsq9b"
Dec 02 10:40:38 crc kubenswrapper[4781]: I1202 10:40:38.505122 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6431ec9c-c651-4273-9c24-77c7767cf7fd-utilities\") pod \"redhat-marketplace-zsq9b\" (UID: \"6431ec9c-c651-4273-9c24-77c7767cf7fd\") " pod="openshift-marketplace/redhat-marketplace-zsq9b"
Dec 02 10:40:38 crc kubenswrapper[4781]: I1202 10:40:38.529815 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zg9hb\" (UniqueName: \"kubernetes.io/projected/6431ec9c-c651-4273-9c24-77c7767cf7fd-kube-api-access-zg9hb\") pod \"redhat-marketplace-zsq9b\" (UID: \"6431ec9c-c651-4273-9c24-77c7767cf7fd\") " pod="openshift-marketplace/redhat-marketplace-zsq9b"
Dec 02 10:40:38 crc kubenswrapper[4781]: I1202 10:40:38.697219 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zsq9b"
Dec 02 10:40:39 crc kubenswrapper[4781]: I1202 10:40:39.184277 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zsq9b"]
Dec 02 10:40:39 crc kubenswrapper[4781]: I1202 10:40:39.901993 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zsq9b" event={"ID":"6431ec9c-c651-4273-9c24-77c7767cf7fd","Type":"ContainerStarted","Data":"92ad01765d20bf8d28cc5808ce66add2982f5d2efb1117563fb5679e15bebcef"}
Dec 02 10:40:40 crc kubenswrapper[4781]: I1202 10:40:40.916155 4781 generic.go:334] "Generic (PLEG): container finished" podID="6431ec9c-c651-4273-9c24-77c7767cf7fd" containerID="66a315fa822c1642d3a0e405ea59d02baddb75c39a71768fc52f4713ae1b6d02" exitCode=0
Dec 02 10:40:40 crc kubenswrapper[4781]: I1202 10:40:40.916224 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zsq9b" event={"ID":"6431ec9c-c651-4273-9c24-77c7767cf7fd","Type":"ContainerDied","Data":"66a315fa822c1642d3a0e405ea59d02baddb75c39a71768fc52f4713ae1b6d02"}
Dec 02 10:40:40 crc kubenswrapper[4781]: I1202 10:40:40.920338 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 02 10:40:41 crc kubenswrapper[4781]: I1202 10:40:41.928081 4781 generic.go:334] "Generic (PLEG): container finished" podID="6431ec9c-c651-4273-9c24-77c7767cf7fd" containerID="316943b381ca95d5cff9e2bbeca34b46f2417a79f95decb8f212923e119afe31" exitCode=0
Dec 02 10:40:41 crc kubenswrapper[4781]: I1202 10:40:41.928873 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zsq9b" event={"ID":"6431ec9c-c651-4273-9c24-77c7767cf7fd","Type":"ContainerDied","Data":"316943b381ca95d5cff9e2bbeca34b46f2417a79f95decb8f212923e119afe31"}
Dec 02 10:40:42 crc kubenswrapper[4781]: I1202 10:40:42.940406 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zsq9b" event={"ID":"6431ec9c-c651-4273-9c24-77c7767cf7fd","Type":"ContainerStarted","Data":"41d296389a00e8c25c0efb5dd84d4e3df9348ae5d8e7aa2e11e08a680b30e79b"}
Dec 02 10:40:42 crc kubenswrapper[4781]: I1202 10:40:42.967148 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zsq9b" podStartSLOduration=3.546701887 podStartE2EDuration="4.967129602s" podCreationTimestamp="2025-12-02 10:40:38 +0000 UTC" firstStartedPulling="2025-12-02 10:40:40.920100364 +0000 UTC m=+4803.743974243" lastFinishedPulling="2025-12-02 10:40:42.340528079 +0000 UTC m=+4805.164401958" observedRunningTime="2025-12-02 10:40:42.960116693 +0000 UTC m=+4805.783990572" watchObservedRunningTime="2025-12-02 10:40:42.967129602 +0000 UTC m=+4805.791003481"
Dec 02 10:40:48 crc kubenswrapper[4781]: I1202 10:40:48.698544 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zsq9b"
Dec 02 10:40:48 crc kubenswrapper[4781]: I1202 10:40:48.698882 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zsq9b"
Dec 02 10:40:48 crc kubenswrapper[4781]: I1202 10:40:48.758136 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zsq9b"
Dec 02 10:40:49 crc kubenswrapper[4781]: I1202 10:40:49.053939 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zsq9b"
Dec 02 10:40:49 crc kubenswrapper[4781]: I1202 10:40:49.104868 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zsq9b"]
Dec 02 10:40:51 crc kubenswrapper[4781]: I1202 10:40:51.025178 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zsq9b" podUID="6431ec9c-c651-4273-9c24-77c7767cf7fd" containerName="registry-server" containerID="cri-o://41d296389a00e8c25c0efb5dd84d4e3df9348ae5d8e7aa2e11e08a680b30e79b" gracePeriod=2
Dec 02 10:40:52 crc kubenswrapper[4781]: I1202 10:40:52.004565 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zsq9b"
Dec 02 10:40:52 crc kubenswrapper[4781]: I1202 10:40:52.039330 4781 generic.go:334] "Generic (PLEG): container finished" podID="6431ec9c-c651-4273-9c24-77c7767cf7fd" containerID="41d296389a00e8c25c0efb5dd84d4e3df9348ae5d8e7aa2e11e08a680b30e79b" exitCode=0
Dec 02 10:40:52 crc kubenswrapper[4781]: I1202 10:40:52.039378 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zsq9b" event={"ID":"6431ec9c-c651-4273-9c24-77c7767cf7fd","Type":"ContainerDied","Data":"41d296389a00e8c25c0efb5dd84d4e3df9348ae5d8e7aa2e11e08a680b30e79b"}
Dec 02 10:40:52 crc kubenswrapper[4781]: I1202 10:40:52.039409 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zsq9b" event={"ID":"6431ec9c-c651-4273-9c24-77c7767cf7fd","Type":"ContainerDied","Data":"92ad01765d20bf8d28cc5808ce66add2982f5d2efb1117563fb5679e15bebcef"}
Dec 02 10:40:52 crc kubenswrapper[4781]: I1202 10:40:52.039430 4781 scope.go:117] "RemoveContainer" containerID="41d296389a00e8c25c0efb5dd84d4e3df9348ae5d8e7aa2e11e08a680b30e79b"
Dec 02 10:40:52 crc kubenswrapper[4781]: I1202 10:40:52.039585 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zsq9b"
Dec 02 10:40:52 crc kubenswrapper[4781]: I1202 10:40:52.061915 4781 scope.go:117] "RemoveContainer" containerID="316943b381ca95d5cff9e2bbeca34b46f2417a79f95decb8f212923e119afe31"
Dec 02 10:40:52 crc kubenswrapper[4781]: I1202 10:40:52.089764 4781 scope.go:117] "RemoveContainer" containerID="66a315fa822c1642d3a0e405ea59d02baddb75c39a71768fc52f4713ae1b6d02"
Dec 02 10:40:52 crc kubenswrapper[4781]: I1202 10:40:52.131897 4781 scope.go:117] "RemoveContainer" containerID="41d296389a00e8c25c0efb5dd84d4e3df9348ae5d8e7aa2e11e08a680b30e79b"
Dec 02 10:40:52 crc kubenswrapper[4781]: E1202 10:40:52.132384 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41d296389a00e8c25c0efb5dd84d4e3df9348ae5d8e7aa2e11e08a680b30e79b\": container with ID starting with 41d296389a00e8c25c0efb5dd84d4e3df9348ae5d8e7aa2e11e08a680b30e79b not found: ID does not exist" containerID="41d296389a00e8c25c0efb5dd84d4e3df9348ae5d8e7aa2e11e08a680b30e79b"
Dec 02 10:40:52 crc kubenswrapper[4781]: I1202 10:40:52.132427 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41d296389a00e8c25c0efb5dd84d4e3df9348ae5d8e7aa2e11e08a680b30e79b"} err="failed to get container status \"41d296389a00e8c25c0efb5dd84d4e3df9348ae5d8e7aa2e11e08a680b30e79b\": rpc error: code = NotFound desc = could not find container \"41d296389a00e8c25c0efb5dd84d4e3df9348ae5d8e7aa2e11e08a680b30e79b\": container with ID starting with 41d296389a00e8c25c0efb5dd84d4e3df9348ae5d8e7aa2e11e08a680b30e79b not found: ID does not exist"
Dec 02 10:40:52 crc kubenswrapper[4781]: I1202 10:40:52.132452 4781 scope.go:117] "RemoveContainer" containerID="316943b381ca95d5cff9e2bbeca34b46f2417a79f95decb8f212923e119afe31"
Dec 02 10:40:52 crc kubenswrapper[4781]: E1202 10:40:52.132745 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"316943b381ca95d5cff9e2bbeca34b46f2417a79f95decb8f212923e119afe31\": container with ID starting with 316943b381ca95d5cff9e2bbeca34b46f2417a79f95decb8f212923e119afe31 not found: ID does not exist" containerID="316943b381ca95d5cff9e2bbeca34b46f2417a79f95decb8f212923e119afe31"
Dec 02 10:40:52 crc kubenswrapper[4781]: I1202 10:40:52.132774 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"316943b381ca95d5cff9e2bbeca34b46f2417a79f95decb8f212923e119afe31"} err="failed to get container status \"316943b381ca95d5cff9e2bbeca34b46f2417a79f95decb8f212923e119afe31\": rpc error: code = NotFound desc = could not find container \"316943b381ca95d5cff9e2bbeca34b46f2417a79f95decb8f212923e119afe31\": container with ID starting with 316943b381ca95d5cff9e2bbeca34b46f2417a79f95decb8f212923e119afe31 not found: ID does not exist"
Dec 02 10:40:52 crc kubenswrapper[4781]: I1202 10:40:52.132793 4781 scope.go:117] "RemoveContainer" containerID="66a315fa822c1642d3a0e405ea59d02baddb75c39a71768fc52f4713ae1b6d02"
Dec 02 10:40:52 crc kubenswrapper[4781]: E1202 10:40:52.133205 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66a315fa822c1642d3a0e405ea59d02baddb75c39a71768fc52f4713ae1b6d02\": container with ID starting with 66a315fa822c1642d3a0e405ea59d02baddb75c39a71768fc52f4713ae1b6d02 not found: ID does not exist" containerID="66a315fa822c1642d3a0e405ea59d02baddb75c39a71768fc52f4713ae1b6d02"
Dec 02 10:40:52 crc kubenswrapper[4781]: I1202 10:40:52.133264 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66a315fa822c1642d3a0e405ea59d02baddb75c39a71768fc52f4713ae1b6d02"} err="failed to get container status \"66a315fa822c1642d3a0e405ea59d02baddb75c39a71768fc52f4713ae1b6d02\": rpc error: code = NotFound desc = could not find container \"66a315fa822c1642d3a0e405ea59d02baddb75c39a71768fc52f4713ae1b6d02\": container with ID starting with 66a315fa822c1642d3a0e405ea59d02baddb75c39a71768fc52f4713ae1b6d02 not found: ID does not exist"
Dec 02 10:40:52 crc kubenswrapper[4781]: I1202 10:40:52.161624 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6431ec9c-c651-4273-9c24-77c7767cf7fd-utilities\") pod \"6431ec9c-c651-4273-9c24-77c7767cf7fd\" (UID: \"6431ec9c-c651-4273-9c24-77c7767cf7fd\") "
Dec 02 10:40:52 crc kubenswrapper[4781]: I1202 10:40:52.161732 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6431ec9c-c651-4273-9c24-77c7767cf7fd-catalog-content\") pod \"6431ec9c-c651-4273-9c24-77c7767cf7fd\" (UID: \"6431ec9c-c651-4273-9c24-77c7767cf7fd\") "
Dec 02 10:40:52 crc kubenswrapper[4781]: I1202 10:40:52.161789 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zg9hb\" (UniqueName: \"kubernetes.io/projected/6431ec9c-c651-4273-9c24-77c7767cf7fd-kube-api-access-zg9hb\") pod \"6431ec9c-c651-4273-9c24-77c7767cf7fd\" (UID: \"6431ec9c-c651-4273-9c24-77c7767cf7fd\") "
Dec 02 10:40:52 crc kubenswrapper[4781]: I1202 10:40:52.162625 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6431ec9c-c651-4273-9c24-77c7767cf7fd-utilities" (OuterVolumeSpecName: "utilities") pod "6431ec9c-c651-4273-9c24-77c7767cf7fd" (UID: "6431ec9c-c651-4273-9c24-77c7767cf7fd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 10:40:52 crc kubenswrapper[4781]: I1202 10:40:52.168572 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6431ec9c-c651-4273-9c24-77c7767cf7fd-kube-api-access-zg9hb" (OuterVolumeSpecName: "kube-api-access-zg9hb") pod "6431ec9c-c651-4273-9c24-77c7767cf7fd" (UID: "6431ec9c-c651-4273-9c24-77c7767cf7fd"). InnerVolumeSpecName "kube-api-access-zg9hb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 10:40:52 crc kubenswrapper[4781]: I1202 10:40:52.263975 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6431ec9c-c651-4273-9c24-77c7767cf7fd-utilities\") on node \"crc\" DevicePath \"\""
Dec 02 10:40:52 crc kubenswrapper[4781]: I1202 10:40:52.264019 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zg9hb\" (UniqueName: \"kubernetes.io/projected/6431ec9c-c651-4273-9c24-77c7767cf7fd-kube-api-access-zg9hb\") on node \"crc\" DevicePath \"\""
Dec 02 10:40:52 crc kubenswrapper[4781]: I1202 10:40:52.314448 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6431ec9c-c651-4273-9c24-77c7767cf7fd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6431ec9c-c651-4273-9c24-77c7767cf7fd" (UID: "6431ec9c-c651-4273-9c24-77c7767cf7fd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 10:40:52 crc kubenswrapper[4781]: I1202 10:40:52.369498 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6431ec9c-c651-4273-9c24-77c7767cf7fd-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 02 10:40:52 crc kubenswrapper[4781]: I1202 10:40:52.379953 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zsq9b"]
Dec 02 10:40:52 crc kubenswrapper[4781]: I1202 10:40:52.391304 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zsq9b"]
Dec 02 10:40:53 crc kubenswrapper[4781]: I1202 10:40:53.509324 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6431ec9c-c651-4273-9c24-77c7767cf7fd" path="/var/lib/kubelet/pods/6431ec9c-c651-4273-9c24-77c7767cf7fd/volumes"